Prosecution Insights
Last updated: April 19, 2026
Application No. 17/496,948

TELE-MANUFACTURING SYSTEM

Non-Final OA §103
Filed: Oct 08, 2021
Examiner: LEWANDROSKI, SARA J
Art Unit: 3661
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Edison Welding Institute Inc.
OA Round: 5 (Non-Final)
Grant Probability: 81% (Favorable)
OA Rounds: 5-6
To Grant: 2y 10m
With Interview: 91%

Examiner Intelligence

Career Allow Rate: 81% — above average (470 granted / 582 resolved; +28.8% vs TC avg)
Interview Lift: +9.9% in resolved cases with interview (moderate, roughly +10%)
Avg Prosecution: 2y 10m typical timeline; 40 applications currently pending
Career History: 622 total applications across all art units
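
The headline figures fit together with simple arithmetic. A hedged sketch in Python (the additive interview model is an assumption, not a documented formula):

```python
# Hedged reconstruction of the dashboard's headline arithmetic.
granted, resolved = 470, 582
allow_rate = granted / resolved               # 0.8076... -> shown as "81% Career Allow Rate"
tc_avg = allow_rate - 0.288                   # "+28.8% vs TC avg" implies a TC average near 52%
interview_lift = 0.099                        # "+9.9% Interview Lift"
with_interview = allow_rate + interview_lift  # 0.9066... -> shown as "91% With Interview"
print(f"base {allow_rate:.1%}, with interview {with_interview:.1%}, TC avg ~{tc_avg:.1%}")
```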

Statute-Specific Performance

§101: 5.7% (-34.3% vs TC avg)
§103: 51.5% (+11.5% vs TC avg)
§102: 20.7% (-19.3% vs TC avg)
§112: 19.5% (-20.5% vs TC avg)
Deltas measured against a Tech Center average estimate • Based on career data from 582 resolved cases
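
The four deltas are mutually consistent: each implies the same Tech Center baseline of 40.0%. A quick check using the figures above:

```python
# Reconstruct the implied Tech Center averages from the examiner's
# statute-specific rates and the stated deltas.
examiner = {"101": 5.7, "103": 51.5, "102": 20.7, "112": 19.5}   # percent
delta    = {"101": -34.3, "103": 11.5, "102": -19.3, "112": -20.5}
tc_avg = {s: round(examiner[s] - delta[s], 1) for s in examiner}
# -> {'101': 40.0, '103': 40.0, '102': 40.0, '112': 40.0}: a single 40% baseline
```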

Office Action

§103
DETAILED ACTION

This Non-Final Office Action is in response to amendments filed 1/9/2026. Claims 1, 13, and 21 have been amended. Claim 15 has been canceled. Claims 1-14 and 16-33 are pending.

Response to Arguments

Interviews

On page 9 of Remarks filed 1/9/2026, the Applicant contends that the Examiner has reneged on recommendations for overcoming the applied prior art in interviews held on 11/3/2025 and 7/18/2025. The Examiner respectfully disagrees. The Applicant appears to conflate the traversal of applied prior art with a formal finding of allowability, which are distinct stages of the examination process. In both interviews, the Examiner had not performed prior art searches for the proposed amendments and merely indicated that the applied prior art did not teach the proposed amendments. For example, in the interview held on 7/18/2025, the Examiner addressed bullet 2 of the Applicant's agenda, in regards to the request to "discuss whether the Examiner has any suggestions or possible amendments for advancing prosecution efficiently," by suggesting limitations from paragraph [0028] of the Applicant's specification, as indicated in the Examiner Interview Summary Record filed 7/30/2025. In the following Office Action mailed 10/10/2025, the Examiner applied a new combination of prior art that included a new reference (i.e. Bugalia) upon further search of the amendments filed 8/4/2025. Similarly, the present Office Action applies a new combination of prior art that includes a new reference (i.e. Liu) upon further search of the amendments filed 1/9/2026. While the goal of an interview is to advance the prosecution of the application, the Examiner is generally not in a position to give a final decision on allowability during the interview itself, due to time constraints indicated in MPEP 713.01(IV).

Rejections under 35 U.S.C. 103

On page 11 of Remarks filed 1/9/2026, the Applicant contends that the combined prior art fails to render present claim 1 obvious. The Examiner agrees that the prior art applied in the Office Action mailed 10/10/2025 does not teach the amendments filed 1/9/2026. However, upon further search and consideration, a new reference has been applied in combination to teach the contended features; therefore, the Applicant's arguments have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. Further, due to the amendments filed 1/9/2026, allowable subject matter is indicated below.

Key to Interpreting this Office Action

For readability, all claim language has been underlined. Citations from prior art are provided at the end of each limitation in parentheses. Any further explanations that were deemed necessary by the Examiner are provided at the end of each claim limitation.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-6, 8-12, 21-26, and 28-32 are rejected under 35 U.S.C. 103 as being unpatentable over Wang et al. (US 2016/0229050 A1), hereinafter Wang, in view of Tanaka (US 5,382,943), hereinafter Tanaka, Tamayo ("Analog vs digital which one for wireless transmission," Dec. 5, 2020, AVI Latino America, https://www.avilatinoamerica.com/en/2020120514196/articles/systems-integration/analog-vs-digital-which-one-for-wireless-transmission.html), hereinafter Tamayo, Bugalia et al. ("Immersive environment for robotic tele-operation," July 2, 2015, AIR '15: Proceedings of the 2015 Conference on Advances in Robotics), hereinafter Bugalia, Liu et al. (US 2003/0210259 A1), hereinafter Liu, and Passot et al. (US 2015/0127154 A1), hereinafter Passot.

Claim 1

Wang discloses the claimed system (see tele-operated robot system 10, depicted in Figure 1 and described in ¶0014 as comprising actuators such as welding guns, spraying guns, grippers, etc.) for manually controlling a manufacturing process remotely (see ¶0025, regarding that robot 12a includes a manufacturing tool 12d, where the operator has direct remote control of the motion of robot 12a and attached processes), comprising:

(a) a manufacturing environment (i.e. robot station 12), wherein the manufacturing environment contains manufacturing equipment (i.e. robot 12a) used for or related to a manufacturing process (see ¶0025, regarding that robot 12a includes a manufacturing tool 12d, where the operator has direct remote control of the motion of robot 12a and attached processes), wherein the manufacturing equipment moves within a predetermined number of degrees of freedom (see ¶0012, regarding that robot 12a is a six degree of freedom industrial robot);

(b) at least one sensor (i.e. sensor devices 12c) positioned within the manufacturing environment in proximity to the manufacturing equipment (see Figure 1, depicting the sensor devices 12c positioned within robot station 12 in proximity to robot 12a), wherein the at least one sensor is configured to gather data from the manufacturing environment (see ¶0012, regarding sensor devices 12c include cameras, microphones, position sensors, proximity sensors, and force sensors, that observe the robot station 12).
Wang further discloses that the claimed system comprises (c) a computation device 18 in communication with the plurality of sensors for receiving data from sensors (see ¶0026-0027, regarding that computation device 18 receives and processes output from sensor devices 12c that are not configured as smart sensors), and (d) at least one processor (i.e. data processing device 14c) in communication with computation device 18 (see Figure 1, depicting computation device 18 in communication with data processing device 14c, where the processed sensor data is used for display on monitoring device 14b, as described in ¶0032) that includes software (see ¶0045, with respect to Figure 6, regarding a first computing system may be structured to receive input from an operator control device, process the received input (34, 42), and provide a resulting output to the communication link (40)). The combination of control function (logic) blocks 34 and 42 implemented in a computing system reasonably teaches "software" on the computing system, given that a control function (logic) inherently requires programming of a computing device.

The computation device 18 of Wang may be reasonably modified to include at least one digitizer, given that sensor devices 12c, defined as cameras, microphones, position sensors, proximity sensors, and force sensors in ¶0012 of Wang, are commonly known to output analog sensor data, and analog data is commonly converted to digital data prior to transmission over a wireless communication link, so as to achieve more efficient propagation, in light of Tanaka and Tamayo.

Specifically, Tanaka teaches a known system that comprises at least one digitizer (i.e. A/D converter 15) in communication with camera 3, microphone 5, and other sensors 4, 6, 28 (similar to the at least one sensor taught by Wang) (see col. 3, line 31-col. 4, line 65) to produce digital signals from the sensor data (see col. 5, lines 27-39) for transmission to a control center 7 (similar to the at least one processor of Wang) via radio transmission (similar to the communication link 16, described in ¶0020 of Wang) (see col. 5, lines 61-65), and thus, control center 7 is in communication with the at least one digitizer.

In Wang, data received from the sensors is used to control welding operations of a robotic arm. In Tanaka, data received from the sensors is used for monitoring building security. However, it is the well-known configuration of including a digitizer to convert analog sensor data to digital sensor data for transmission to a remote computing system that is modified by Tanaka; therefore, the particular application of the sensor data does not influence this combination.

Since the systems of Wang and Tanaka are directed to the same purpose, i.e. transmitting data from similar sensors, e.g., camera and microphone, to a remote location via wireless communication, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the computation device 18 of Wang to include at least one digitizer in communication with the at least one sensor for receiving data from sensors and in communication with at least one processor, in the same manner that Tanaka converts signals from sensors into digital signals for wireless transmission to a control center, with the predictable result of converting the signal into a form that suffers less interference (sixth paragraph of the first page, beginning with "As both transmission systems…" of Tamayo).
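
To illustrate the digitizing (A/D) step this combination relies on, a minimal sketch; the sensor signal, sampling rate, and bit depth below are hypothetical rather than drawn from Tanaka or Tamayo:

```python
import math

def digitize(analog_signal, sample_rate_hz, duration_s, n_bits=12, v_min=-5.0, v_max=5.0):
    """Sample and quantize a continuous-time signal, as an A/D converter would.

    analog_signal: callable t -> voltage (stands in for a camera, microphone,
    or force-sensor output). Returns integer codes ready for digital transmission.
    """
    levels = 2 ** n_bits
    step = (v_max - v_min) / (levels - 1)
    samples = []
    for i in range(int(sample_rate_hz * duration_s)):
        t = i / sample_rate_hz
        v = min(max(analog_signal(t), v_min), v_max)  # clip to the converter's range
        samples.append(round((v - v_min) / step))     # quantize to an integer code
    return samples

# Example: a 50 Hz force-sensor ripple sampled at 1 kHz for 10 ms
codes = digitize(lambda t: 2.0 * math.sin(2 * math.pi * 50 * t), 1000, 0.01)
```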
While Wang discloses the use of a monitoring (display) device 14b located at operator station 14 that shows actual data about the robot motion and attached processes, e.g., camera images (see ¶0017-0018), where the cameras are associated with remote sensor devices 12c at robot station 12 (see ¶0012, with respect to Figure 1), Wang, as modified by Tanaka, does not further disclose that the "digitizer" convert[s] the data into one or more three-dimensional digital maps of the manufacturing environment and that the "software" is for receiving and analyzing the one or more three-dimensional digital maps, such that in response to receiving and analyzing the one or more three-dimensional digital maps, the software is further configured to (i) enable a haptic response based on one or more shapes detected in the manufacturing environment, (ii) detect and alert the user of variations or obstacles present in the manufacturing environment during the manufacturing process, and (iii) align the manufacturing equipment and the at least one manual controller to a same reference plane or field of view based on a desired location in the manufacturing environment.

However, it would be obvious to modify the digitized sensor data of Wang and Tanaka to be further processed into a 3D digital map for analysis by a processor associated with a remote operator station, in light of Bugalia, given that no specific analysis or use of the 3D digital map is claimed. Further, it would be obvious to incorporate the steps of (i), (ii), and (iii) into the combination of Wang and Tanaka, in light of Bugalia and Liu, due to the separate and distinct nature of their associated functions, without any claimed relationships to the 3D digital map.

Specifically, Bugalia teaches a similar system (see Figure 1) in which a slave robotic arm (similar to the manufacturing equipment taught by Wang) residing at remote block is controlled by a human operator and master interface (haptic device) (similar to the manual controller taught by Wang) residing at control block over a network block (see section 3, with respect to Figure 1). Bugalia further teaches that video manager captures video at the remote location using cameras (see section 3.2.1) (similar to the digitizer taught by the combination of Wang and Tanaka) for converting image data (similar to the data taught by Wang) into one or more three-dimensional digital maps of the actual environment at the remote block, defined as pertaining to an industrial robotic arm in section 3 (similar to the manufacturing environment taught by Wang) (see sections 4.1-4.2.3, with respect to Figures 5 and 6, regarding that image data of the actual environment is processed to reconstruct the remote environment and model the robotic arm, workbench, and pellets in the reconstructed environment as a 3D virtual model, where the 3D scene manager renders the reconstructed model of the remote environment, as discussed in section 3.1.1). Similar to the system of Wang, the cameras of Bugalia include cameras separate from ("in proximity to") the slave robotic arm at the remote block (see section 3.2.1, with respect to Figure 1).
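
To illustrate one simple form a "three-dimensional digital map" can take, a sketch that bins camera-derived 3D points into a coarse occupancy map; this is a generic construction, not Bugalia's reconstruction pipeline:

```python
from collections import defaultdict

def points_to_voxel_map(points_3d, voxel_size=0.05):
    """Build a coarse 3D occupancy map from 3D points recovered from cameras.

    points_3d: iterable of (x, y, z) in meters, already in the work-cell frame.
    Returns a dict mapping voxel indices to point counts (occupancy evidence).
    """
    occupancy = defaultdict(int)
    for x, y, z in points_3d:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        occupancy[key] += 1
    return occupancy

# Points near an obstacle cluster into adjacent occupied voxels that operator-side
# software can render, or test against the arm's path for collision warnings.
vmap = points_to_voxel_map([(0.50, 0.20, 0.31), (0.52, 0.21, 0.30), (1.40, 0.90, 0.12)])
```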
Bugalia further teaches 3D scene manager of the control application (similar to the software executed by the processor taught by Wang) for receiving and analyzing the one or more three-dimensional digital maps (see section 3.1.1, with respect to Figure 1, regarding that 3D scene manager renders the 3D constructed model of the remote environment, where visual feedback is provided based on the detected objects and robotic arm, as described in section 3.1.2, received from remote block, as described in sections 3.1.5 and 3.2.5; sections 4.2.1-4.2.3, regarding the positions of the objects and robotic arm are detected from image data) of an industrial remote environment (similar to the manufacturing environment taught by Wang) (see section 3-3.1.1, with respect to Figure 1, regarding that the slave robotic arm is an industrial robotic arm operating in a remote environment; first paragraph of section 3, regarding that the framework is generic and can be used in a variety of tele-operation tasks). Given that the framework of Bugalia is defined as being generic and can be used in a variety of teleoperation tasks, where the robotic arm may be defined as an industrial robotic arm (see section 3), the remote actual environment of Bugalia may reasonably pertain to "manufacturing."

Bugalia further teaches that in response to receiving and analyzing the one or more three-dimensional digital maps, the control manager is further configured to detect and alert the user of variations or obstacles present in the actual remote environment (see section 3.1.2, regarding that visual feedback in the form of warnings is provided based on distances between various objects and the robotic arm, e.g., highlighting objects nearest to the gripper or flashing the screen with red color in response to collision warnings; sections 4.2.1-4.2.3, regarding the positions of the objects and robotic arm are detected from image data), and align the robotic arm (similar to the manufacturing equipment taught by Wang) and the haptic device (similar to the manual controller taught by Wang) to a same reference plane or field of view based on a desired location in the actual remote environment (see section 3.1.3, regarding that the control manager aligns the coordinate systems of the haptic device and the virtual view so that movement of the haptic device corresponds to movement of the actual robotic arm).

Since the systems of Wang and Bugalia are directed to the same purpose, i.e. remotely controlling a robotic arm using a manual controller, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the at least one digitizer of Wang and Tanaka, so as to further perform converting the data into one or more three-dimensional digital maps of the manufacturing environment, and to have modified the at least one processor of Wang, so as to further perform receiving and analyzing the at least one three-dimensional digital map, such that in response to receiving and analyzing the one or more three-dimensional digital maps, the software is further configured to (ii) detect and alert the user of variations or obstacles present in the manufacturing environment during the manufacturing process, and (iii) align the manufacturing equipment and the at least one manual controller to a same reference plane or field of view based on a desired location in the manufacturing environment, in the same manner that a 3D virtual model of the remote actual environment of Bugalia is used for providing visual feedback of obstacles and aligning the actual robotic arm with the coordinate system of the haptic device, with the predictable result of providing an intuitive, user friendly, and reliable platform for robotic tele-operation (second paragraph under section 1 of Bugalia).

While Bugalia further teaches that in response to receiving and analyzing the one or more three-dimensional digital maps, the control manager is configured to enable haptic feedback of the robotic arm gripper (see second paragraph of section 3.2.1), Bugalia does not specifically teach to enable a haptic response based on one or more shapes detected in the manufacturing environment. However, modifying the haptic feedback of Bugalia to be based on detected shapes in the environment would be obvious, in light of Liu.

Specifically, Liu teaches a similar telerobot system in which a human operates a master unit (similar to the manual controller taught by Wang) while a slave unit (similar to the manufacturing equipment taught by Wang) operates in a different hazardous environment (similar to the manufacturing environment taught by Wang) (see ¶0007). Liu further teaches the known technique of enabl[ing] a haptic response based on one or more shapes detected in the different environment (see ¶0044-0045, regarding that a multi-tactile joystick comprises a force-feedback joystick with a tactile array covering the handle, so as to simultaneously render large scale haptic forces and small scale tactile effects that include vibratory, where the large scale contact forces define overall shape of the object and small scale tactile forces define the surface texture, as described in ¶0043; ¶0038-0040, regarding that the surface features rendered on the tactile array is based on a tactile map based on actual object surface properties derived using pictures taken from a real environment). No particular sensors are claimed for the "detection."

Since the systems of Wang, Bugalia, and Liu are directed to the same purpose, i.e. providing a telerobot that includes a master unit as a haptic joystick, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the response to receiving and analyzing the one or more three-dimensional digital maps taught by Bugalia, so as to further (i) enable a haptic response based on one or more shapes detected in the manufacturing environment, in the same manner that the multi-tactile joystick of Liu renders large and small scale haptic forces that define the overall shape and surface texture of an object in the environment, with the predictable result of providing a single haptic interface that is able to simulate both large scale forces and textures (¶0013 of Liu), thus accounting for the disadvantage of limited feedback available to force-feedback devices (¶0010 of Liu), applicable to the robotic arm gripper of Bugalia, defined as a force-feedback device in section 3.1.2, and the teleoperation input device of Wang, defined as a force-feedback device in ¶0030.

Wang further discloses that the claimed system comprises (e) at least one manual controller (i.e. input devices 14a) in communication with the at least one processor (see Figure 1, depicting communication between input devices 14a and data processing device 14c), wherein the at least one manual controller receives a motion input from a user of the at least one manual controller (see ¶0016, regarding that an operator uses the input device 14a to create continuous motion signals (position or speed signals)). The input device 14a (i.e. "manual controller"), depicted in Figure 1 of Wang, inherently moves with a predetermined number of degrees of freedom, given that the motion of the input device 14a may directly correspond to motion of the robot (see ¶0032), and a particular number of degrees of freedom associated with similar input devices is well known in the art due to structural limitations.

Wang further discloses that the software on the processor mathematically transforms the motion input into corresponding motion commands (see ¶0034, regarding control function block 42 is implemented in a computing device associated with block 34, where the logic of block 34 determines how the motion of the input device 14a is reflected on the robot side, as described in ¶0032, and control function block 42 limits the position and/or velocity references utilized in controlling the robot in order to meet the user preset force limit requirements, as described in ¶0034) that are sent to the manufacturing equipment by the at least one processor (see ¶0029, regarding that the position and velocity of robot 12a is controlled based on a position or velocity reference signal received from the operator station 14 through communications link 16). While the citations reference Figures 3, 4, and 6, the components with shared reference numerals are described as being identical among the Figures in ¶0033.

Wang further discloses that the manufacturing equipment, which is physically remote from the at least one controller (see ¶0011, with respect to Figure 1, depicting the remote robot station 12 and the operator station 14), executes the motion commands in real-time during the manufacturing process to produce a motion output (see ¶0021-0022, regarding that the operator has direct remote control of the motion of robot 12a and attached processes using a reliable real-time communication link 16).
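
To illustrate the "mathematically transforms the motion input into corresponding motion commands" step in the abstract, a minimal sketch assuming a simple rotate-and-scale model; the frame offset and scale factor are invented for illustration and are not taken from Wang:

```python
import math

def controller_to_robot_command(dx, dy, dz, yaw_offset_rad=math.radians(30), scale=0.25):
    """Map a manual-controller displacement into a robot-frame motion command.

    Rotate the input into the equipment's reference frame (frame alignment),
    then scale it (motion shaping). Both parameters are illustrative only.
    """
    c, s = math.cos(yaw_offset_rad), math.sin(yaw_offset_rad)
    rx = c * dx - s * dy  # rotate about the vertical axis into the robot frame
    ry = s * dx + c * dy
    return (scale * rx, scale * ry, scale * dz)

cmd = controller_to_robot_command(0.10, 0.00, -0.02)  # 10 cm stylus move -> scaled robot move
```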
Given Wang's disclosure of direct remote control (see ¶0022), Wang does not further disclose that the motion output of the manufacturing equipment differs from the motion input on the at least one manual controller, wherein the predetermined number of degrees of freedom of the manufacturing equipment differ from the predetermined number of degrees of freedom of the at least one manual controller. However, it is well known in the art to use a manual controller with fewer degrees of freedom to control manufacturing equipment, such as a robotic arm, with more degrees of freedom, in light of Passot.

Specifically, Passot teaches a similar system (see Figures 1 and 2), in which robotic arm 100 (similar to the manufacturing equipment taught by Wang) is remotely controlled by a user via control apparatus 200 (similar to the at least one manual controller taught by Wang) (see ¶0047, regarding that during training and/or operation, the user is able to remotely control a target number of controllable degrees of freedom (CDOF) of the robot simultaneously). Passot further teaches that the number of degrees of freedom that the control apparatus 200 supports (e.g., one CDOF) is less than the number of degrees of freedom that the robotic arm 100 supports (e.g., two CDOFs) (see ¶0053), where the control apparatus 200 may be embodied as a more complex 6DOF element (similar to the predetermined number of degrees of freedom of the at least one manual controller taught by Wang) (see ¶0052), and the method 600 can be used with any number of degrees of freedom, e.g., for a device with six degrees of freedom (similar to the predetermined number of degrees of freedom of the manufacturing equipment taught by Wang), the control apparatus may be configured with one or two degrees of freedom (see ¶0091).

By providing different degrees of freedom between the robotic arm 100 and the control apparatus 200, Passot inherently teaches that motion output by the robotic arm 100 differs from the motion input by the user to control apparatus 200, as is evident in Figures 1 and 2. While Wang is directed to a simpler telerobotic system in comparison to Passot, it is the technique of providing manufacturing equipment and a manual controller with different degrees of freedom that is modified by Passot; therefore, the complexity of the overall system does not influence this combination.

Since the systems of Wang and Passot are directed to the same purpose, i.e. remote control of a robotic arm, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the manufacturing equipment and the at least one manual controller of Wang, such that the predetermined number of degrees of freedom of the manufacturing equipment differ from the predetermined number of degrees of freedom of the at least one manual controller, which would inherently result in the motion output of the manufacturing equipment differ[ring] from the motion input on the at least one manual controller, in light of Passot, with the predictable result of enabling the use of a simpler remote control device to train and operate robotic devices with multiple CDOFs (¶0129 of Passot).

Claim 2

Wang further discloses that the claimed system comprises a computer network (i.e. communication link 16, depicted in Figure 1) across which the at least one processor communicates with the manufacturing equipment (see ¶0020, regarding the definition of communication link 16 as a network, such as LAN or WLAN; Figure 1, depicting the communication of the data processing device 14c with robot 12a as being performed over communication link 16).

Claims 3 and 23

Wang further discloses that the manufacturing equipment includes welding equipment, measurement equipment, inspection equipment, remote assembly equipment, or combinations thereof (see ¶0014, regarding that the robot station 12 includes grippers, fixtures, welding guns, spraying guns, spotlights, and conveyors mounted to the robot). While the embodiments described in ¶0014 of Wang may reasonably teach any of the limitations of welding equipment, measurement equipment, inspection equipment, or remote assembly equipment, only one of these limitations is required to be taught by Wang.

Claims 4 and 24

Wang, as modified by Passot, further discloses that the predetermined number of degrees of freedom of the manufacturing equipment moves with at least three degrees of freedom (see ¶0012, regarding that robot 12a is a six degree of freedom industrial robot). The limitation of "at least three degrees of freedom" may be reasonably taught by more than three degrees of freedom, e.g., six degrees of freedom. See ¶0091 and ¶0047 of Passot, regarding embodiments relating to a six or more CDOF robotic arm.

Claims 5 and 25

Wang, as modified by Passot, further discloses that the predetermined number of degrees of freedom of the manufacturing equipment moves with at least six degrees of freedom (see ¶0012, regarding that robot 12a is a six degree of freedom industrial robot). See ¶0091 and ¶0047 of Passot, regarding embodiments relating to a six or more CDOF robotic arm.

Claims 6 and 26

Wang further discloses that the at least one sensor is an optical sensor or an auditory sensor (see ¶0012, regarding that sensor devices 12c include cameras and microphones).

Claims 8 and 28

Wang further discloses that the at least one processor is a computer (see ¶0018, regarding that data processing device 14c is an industrial PC or a PLC).

Claims 9 and 29

Wang further discloses that the at least one manual controller is a hand-held stylus, a computer mouse, or a joystick (see ¶0016, regarding that input device 14a is a joystick or stylus-type device). Only one of a hand-held stylus, computer mouse, or joystick is required to be taught by prior art. Wang is applied to teach either the limitation of a hand-held stylus or joystick.

Claims 10 and 30

Passot further teaches that the control apparatus 200 (similar to the manual controller taught by Wang) may support any number of degrees of freedom, e.g., controller with three degrees of freedom (see ¶0091-0092), and therefore, Wang, as modified by Passot and discussed in the rejection of claim 1, further teaches that the predetermined number of degrees of freedom of the at least one manual controller includes at least three degrees of freedom.

Claims 11 and 31

Passot further teaches that the control apparatus 200 (similar to the manual controller taught by Wang) may support any number of degrees of freedom (see ¶0091-0092) and may be embodied as a 6DOF controller (see ¶0052), and therefore, Wang, as modified by Passot and discussed in the rejection of claim 1, further teaches that the predetermined number of degrees of freedom of the at least one manual controller includes at least six degrees of freedom.
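
To illustrate the degrees-of-freedom mismatch at issue, a minimal sketch assuming a mode-switched mapping, which is one possible realization rather than Passot's disclosed method:

```python
def map_low_dof_input(u1, u2, active_pair=(0, 1), n_joints=6):
    """Map a 2-DOF controller input onto a 6-DOF arm by driving a selected joint pair.

    One simple way to control equipment with more degrees of freedom than the
    controller: the operator toggles which joint pair the two controller axes
    currently drive. Values and joint indices are illustrative.
    """
    velocities = [0.0] * n_joints
    velocities[active_pair[0]] = u1
    velocities[active_pair[1]] = u2
    return velocities

# Axes drive shoulder/elbow here; switching active_pair=(4, 5) would drive the wrist.
joint_cmd = map_low_dof_input(0.2, -0.1, active_pair=(1, 2))
```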
Claims 12 and 32

Wang further discloses that the at least one manual controller is configured to provide haptic feedback to the user of the controller (see ¶0030, regarding that the input device 14a is a haptic device that incorporates force feedback in the form of a vibration).

Claim 21

The combination of Wang, Tanaka, Bugalia, Liu, and Passot is applied to claim 21 for the reasons discussed above regarding claim 1. Wang further discloses the limitation of (a) installing manufacturing equipment (i.e. robot 12a) used for or related to a manufacturing process in a manufacturing environment (see ¶0025, regarding that robot 12a includes a manufacturing tool 12d, where the operator has direct remote control of the motion of robot 12a and attached processes). The limitation of "installing" is interpreted in light of the Applicant's disclosure, in that the manufacturing equipment exists in a particular manufacturing environment. The limitations of (b) through (e) are interpreted in light of the Applicant's disclosure, such that no computer-controlled "positioning" or "connecting" is performed. The "connecting" of the processor to the digitizer in step (d) is interpreted in light of the Applicant's disclosure, in which the "processor" (100) is connected to "digitizer" (600) over a wireless network (see Figure 1), and therefore, the limitation of "connecting" does not require a physical connection.

Claim 22

Wang further discloses that the claimed method comprises providing a computer network (i.e. communication link 16, depicted in Figure 1) across which the at least one processor communicates with the manufacturing equipment (see ¶0020, regarding the definition of communication link 16 as a network, such as LAN or WLAN; Figure 1, depicting the communication of the data processing device 14c with robot 12a as being performed over communication link 16).

Claims 7 and 27 are rejected under 35 U.S.C. 103 as being unpatentable over Wang in view of Tanaka, Tamayo, Bugalia, Liu, and Passot, and in further view of Lee et al. (US 2021/0347053 A1), hereinafter Lee, and Perlin (US 2020/0241296 A1), hereinafter Perlin.

Claims 7 and 27

The combination of Wang and Tanaka does not further disclose that the digitizer converts the data received from the at least one sensor into a point cloud. However, the converted point cloud data is not used in any subsequent operations, and therefore, it would be reasonable to combine prior art to teach this feature, in light of Lee. Specifically, Lee teaches a similar system 100 in which robot 101 (similar to the manufacturing equipment taught by Wang) at remote location A is controlled by a tele-operator at location B over network 104 (see ¶0023-0034, with respect to Figure 1), where the telerobotic system is applicable in fields such as manufacturing (see ¶0003). Lee further teaches the technique of converting image data (similar to the data received from the at least one sensor taught by Wang) into a point cloud (see ¶0009, regarding that a point cloud is generated by matching the detected keypoints to one or more 3D points in a stored map, where the matching of keypoints to the stored 3D map points is performed by module 106A, as described in ¶0057, with respect to computing device 106 depicted in Figure 1). This claimed "conversion" is performed by the computing device 106 of Lee (similar to the processor of Wang), not a "digitizer," in light of the combination of Wang and Tanaka.
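
To illustrate what "converting sensor data into a point cloud" can look like, a generic pinhole back-projection sketch; Lee's disclosed approach instead matches detected keypoints against a stored 3D map, so this stands in only conceptually:

```python
def depth_image_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image into a 3D point cloud (pinhole camera model).

    depth: 2D list of depth values in meters (0 means no return).
    fx, fy, cx, cy: camera intrinsics (illustrative values below).
    """
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z > 0:
                points.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return points

cloud = depth_image_to_point_cloud([[0.0, 1.2], [1.1, 1.3]], fx=525.0, fy=525.0, cx=0.5, cy=0.5)
```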
However, Lee discloses alternative embodiments in which the functionality of the modules can be arranged in a variety of configurations (see ¶0036), and therefore, it would be obvious to re-configure the operations of the computing device 103 and computing device 106, such that the computing device 103 provided at location A (similar to the digitizer taught by the combination of Wang and Tanaka) converts the data received from the at least one sensor into a point cloud. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Lee, such that computing device 103 converts the data received from the at least one sensor into a point cloud, in light of alternative embodiments described in ¶0036, with the predictable result of decreasing the processing requirements of computing device 106 that may be embedded into the AR/VR headset 110 (see ¶0037 of Lee), where the headset would be advantageously lighter in weight with smaller processing components (¶0056 of Perlin).

Further, since the systems of Wang and Lee are directed to the same purpose, i.e. remotely controlling a robot used in manufacturing processes where images of the robot location are transmitted to the remote operator location, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the digitizer taught by the combination of Wang and Tanaka to further convert the data received from the at least one sensor into a point cloud, in light of Lee, with the predictable result of recreating a dynamic scene (¶0051 of Lee) with a correct sense of scale (¶0044 of Lee).

Claim 33 is rejected under 35 U.S.C. 103 as being unpatentable over Wang in view of Tanaka, Tamayo, Bugalia, Liu, and Passot, and in further view of Aldridge et al. (US 2020/0368904 A1), hereinafter Aldridge.

Claim 33

While Wang further discloses that "the motion commands executed on the manufacturing equipment" include position or speed signals (see ¶0016-0017), where the robot performs arc welding tasks (see ¶0049) using a welding gun (see ¶0014), Wang does not specifically disclose that the motion commands executed on the manufacturing equipment include weld travel direction and weld travel speed, weld weave width, weld weave speed, weave orientation with respect to the face of a weld, torch travel angle, torch workpiece angle, and torch tip roll. However, all of these particular "motion commands" are intrinsic to manual welding procedures, and therefore, the manual "motion commands" generated by a tele-operated welding robot would inherently perform these "motion commands," in light of Aldridge.

Specifically, Aldridge teaches a similar method in which a controller device 102 (similar to the manual controller taught by Wang) wirelessly transmits control signals to the robotic device 120 (similar to the manufacturing equipment taught by Wang) (see ¶0048, with respect to Figure 1), where the robotic device 120 is configured as an industrial robot that performs welding applications (see ¶0043). Aldridge further teaches that the operator uses the handheld controller to control the orientation and translation position of the torch attached to the robot arm (see ¶0035), which may include repositioning the workpiece with respect to the torch (see ¶0037) (e.g., torch travel angle, torch workpiece angle, and/or torch tip roll), so as to move the torch across the weld path using a manually controlled weave (e.g., weld weave width, weld weave speed, and/or weave orientation with respect to the face of a weld), where the robot follows the exact path of the operator's hand (e.g., weld travel direction and/or weld travel speed) (see ¶0035). Aldridge further teaches additional embodiments in which weld travel speed is controlled by the variable press of a trigger on the handheld controller in ¶0036, as opposed to the manual operation discussed in ¶0035, as well as implementing preset functions that include weave pattern, dwell, wire speed, etc. (e.g., weld weave width and/or weld weave speed) while performing the weld (see ¶0034).

Since the systems of Wang and Aldridge are directed to the same purpose, i.e. remotely controlling a robotic device used for welding with a remote controller, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the motion commands executed on the manufacturing equipment of Wang to specifically include weld travel direction, weld travel speed, weld weave width, weld weave speed, weave orientation with respect to the face of a weld, torch travel angle, torch workpiece angle, and torch tip roll, in the same manner that Aldridge controls the orientation and translation position of the torch attached to the robot arm, which may include repositioning the workpiece with respect to the torch, using a manually controlled weave, where the robot follows the exact path of the operator's hand, with the predictable result of performing remote welding by someone who knows how to weld, so as to provide for a more cost effective system that doesn't take a long time to program (¶0112 of Aldridge).
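
To make the claim 33 parameter list concrete, a sketch gathering the recited motion commands into a single structure; the field names mirror the claim language and the example values are invented:

```python
from dataclasses import dataclass

@dataclass
class WeldMotionCommand:
    """The motion parameters recited in claim 33, gathered into one structure.

    Numeric example values below are illustrative only, not from Wang or Aldridge.
    """
    travel_direction_deg: float   # weld travel direction (heading in the work plane)
    travel_speed_mm_s: float      # weld travel speed
    weave_width_mm: float         # weld weave width
    weave_speed_hz: float         # weld weave speed (oscillations per second)
    weave_orientation_deg: float  # weave orientation with respect to the weld face
    torch_travel_angle_deg: float # push/drag angle along the travel direction
    torch_work_angle_deg: float   # torch angle relative to the workpiece
    torch_tip_roll_deg: float     # rotation about the torch axis

# A command a tele-operator's stylus motion might be transformed into:
cmd = WeldMotionCommand(90.0, 4.0, 6.0, 1.5, 0.0, 15.0, 45.0, 0.0)
```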
Allowable Subject Matter

Claims 13, 14, and 16-20 are allowed. The following is a statement of reasons for the indication of allowable subject matter:

The closest prior art of record, Wang, Tanaka, Bugalia, Liu, and Shahoian et al. (US 6,184,868 B1), taken alone or in combination, does not teach the claimed system for manually controlling a welding process remotely, comprising:

(a) a welding environment, wherein the welding environment contains welding equipment used for or related to a welding process, and wherein the welding equipment moves with at least six degrees of freedom;

(b) at least one sensor positioned within the welding environment in proximity to the welding equipment, wherein the at least one sensor is configured to gather data from the welding environment;

(c) at least one digitizer in communication with the at least one sensor for receiving data from the sensor and converting the data into one or more three-dimensional digital maps of the welding environment, wherein the one or more three-dimensional digital maps includes weld joint shapes, weld joint variations or obstacles occurring in the welding environment, and weld joint locations, or combinations thereof;

(d) at least one processor in communication with the at least one digitizer, wherein the at least one processor includes software for receiving and analyzing the one or more three-dimensional digital maps;

(e) at least one manual controller in communication with the at least one processor, wherein the manual controller moves with at least six degrees of freedom, wherein the at least one manual controller receives a motion input from a user of the at least one manual controller, wherein the software on the at least one processor mathematically transforms the motion input into corresponding motion commands that are sent to the welding equipment by the at least one processor, and wherein the welding equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the welding process to produce a motion output, wherein the motion output of the welding equipment differs from the motion input on the at least one manual controller, and wherein the motion output replicates physical movements or motions of an actual human welder; and

(f) wherein the software of the at least one processor is further configured to enable a haptic response based on the weld joint shapes; alert the user of the joint variations or obstacles; and align the welding equipment and the at least one manual control to a same reference plane or field of view based on the weld joint locations.

Specifically, no reasonable combination of prior art can be made to teach the three-dimensional digital map(s) as including weld joint shapes, weld joint variations or obstacles occurring in the welding environment, and weld joint locations, such that the processor enables a haptic response based on the weld joint shapes, alerts the user of the joint variations or obstacles, and aligns the welding equipment and the at least one manual control to a same reference plane or field of view based on the weld joint locations, in light of the overall claim. The claimed invention would not have been obvious to one of ordinary skill in the art before the effective filing date.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Specifically, Christensen ("Virtual Environments for Telerobotic Shared Control," 1993, Deneb Robotics) teaches constructing a World Model based on an environment map for providing visual feedback to an operator performing welding-related tasks (see sections 5, 6, and 7), Muzilla et al. (US 2018/0361493 A1) teaches real-time remote welding (see abstract), and Batzler et al. (US 2016/0089751 A1) teaches remotely controlling welding operation (see ¶0070-0071, with respect to Figure 8).

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Sara J Lewandroski whose telephone number is (571)270-7766. The examiner can normally be reached Monday-Friday, 9 am-5 pm ET. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Ramya P Burgess, can be reached at (571)272-6011. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SARA J LEWANDROSKI/
Examiner, Art Unit 3661

Prosecution Timeline

Oct 08, 2021: Application Filed
Jul 13, 2024: Non-Final Rejection — §103
Sep 18, 2024: Response Filed
Dec 11, 2024: Final Rejection — §103
Feb 18, 2025: Response after Non-Final Action
Mar 18, 2025: Request for Continued Examination
Mar 19, 2025: Response after Non-Final Action
Mar 21, 2025: Non-Final Rejection — §103
Jul 11, 2025: Interview Requested
Jul 18, 2025: Examiner Interview Summary
Jul 18, 2025: Applicant Interview (Telephonic)
Aug 04, 2025: Response Filed
Sep 30, 2025: Final Rejection — §103
Oct 23, 2025: Interview Requested
Nov 03, 2025: Examiner Interview (Telephonic)
Nov 04, 2025: Examiner Interview Summary
Nov 10, 2025: Response after Non-Final Action
Jan 09, 2026: Request for Continued Examination
Feb 13, 2026: Response after Non-Final Action
Feb 21, 2026: Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600245
POWER CONTROL APPARATUS FOR VEHICLE
Granted Apr 14, 2026 • 2y 5m to grant

Patent 12600371
CONTROL DEVICE, CONTROL METHOD AND NON-TRANSITORY STORAGE MEDIUM
Granted Apr 14, 2026 • 2y 5m to grant

Patent 12596519
AUTONOMOUS MOBILE BODY, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING APPARATUS
Granted Apr 07, 2026 • 2y 5m to grant

Patent 12576987
COMPUTER-BASED SYSTEMS AND METHODS FOR FACILITATING AIRCRAFT APPROACH
Granted Mar 17, 2026 • 2y 5m to grant

Patent 12571180
CONTROLLING AN EXCAVATION OPERATION BASED ON LOAD SENSING
Granted Mar 10, 2026 • 2y 5m to grant
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 81%
With Interview: 91% (+9.9%)
Median Time to Grant: 2y 10m
PTA Risk: High
Based on 582 resolved cases by this examiner. Grant probability derived from career allow rate.
