Prosecution Insights
Last updated: April 19, 2026
Application No. 18/179,796

TOOL CALIBRATION FOR MANUFACTURING ROBOTS

Final Rejection §103
Filed: Mar 07, 2023
Examiner: STIEBRITZ, NOAH WILLIAM
Art Unit: 3658
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Path Robotics Inc.
OA Round: 2 (Final)

Grant Probability: 67% (Favorable)
OA Rounds: 3-4
To Grant: 2y 6m
With Interview: 51%

Examiner Intelligence

Career Allow Rate: 67% (12 granted / 18 resolved; +14.7% vs TC avg, above average)
Interview Lift: -15.6% (minimal), based on resolved cases with interview
Avg Prosecution: 2y 6m (typical timeline)
Career History: 62 total applications across all art units, 44 currently pending

Statute-Specific Performance

§101: 18.6% (-21.4% vs TC avg)
§103: 61.7% (+21.7% vs TC avg)
§102: 11.1% (-28.9% vs TC avg)
§112: 8.0% (-32.0% vs TC avg)

TC averages are estimates • Based on career data from 18 resolved cases
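The headline examiner figures above reduce to simple arithmetic over the reported counts. A minimal sketch using only numbers stated in this report; the function names are illustrative, not from any real analytics API:

```python
# Recomputing the dashboard's examiner statistics from its raw counts.
# Inputs are the figures reported above; names are illustrative only.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate over resolved applications."""
    return granted / resolved

def interview_lift(rate_with_interview: float, rate_overall: float) -> float:
    """Difference the dashboard reports as 'Interview Lift'."""
    return rate_with_interview - rate_overall

career = allow_rate(12, 18)                 # 12 granted / 18 resolved
print(f"Career allow rate: {career:.0%}")   # 67%
print(f"Interview lift: {interview_lift(0.51, career):+.1%}")
```

The small discrepancy between a recomputed lift and the reported -15.6% is rounding: the dashboard's "51%" is itself a rounded rate.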

Office Action

§103
DETAILED ACTION

This is a Final Office Action on the merits in response to communications filed by the applicant on October 8, 2025. Claims 1-23 are currently pending and examined below.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The amendments to the Claims filed on October 8, 2025 have been entered. Claims 1-2, 6, 8, 11, 13, 15, 19, and 21-22 are currently amended and pending, and claims 3-5, 7, 9-10, 12, 14, 16-18, 20, and 23 are original, unamended, and pending. The amendments to the Drawings filed on October 8, 2025 have been entered and have overcome each and every objection set forth in the previous Non-Final Office Action mailed April 9, 2025. The amendments to the Specification filed on October 8, 2025 have been entered and have overcome each and every objection set forth in the previous Non-Final Office Action mailed April 9, 2025.

Information Disclosure Statement

The Information Disclosure Statement(s) filed on 10/08/2025 is/are being considered by the examiner.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1-2, 4, 9, 19, and 23 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 2009/0118894 A1 ("Eldridge") in view of US 10926414 B2 ("Huang") in further view of US 2021/0065356 A1 ("Fisher").

Regarding claim 1, Eldridge teaches a method for calibrating a tool center point (TCP) of a robotic welding system, the method comprising (Eldridge: ¶ 0006, “An embodiment of the present invention may further comprise a computerized method for calculating a tool frame tool center point relative to a wrist-frame of a robot for a tool attached at a wrist of the robot using a camera…”, ¶ 0033, “The robot manipulator is typically made up of two subsections, the body and arm 108 and the wrist 110. A tool 112 used by a robot 102 to perform desired tasks is typically attached at the wrist 110 of the robot manipulator 102.”, ¶ 0102, “FIGS. 9A-C show images 900, 910, 920 of example Metal-Inert Gas (MIG) welding torches. FIG. 9A is an example image of a first type 900 of a MIG welding torch tool. FIG. 9B is an example image of a second type 910 of a MIG welding torch tool. FIG. 9C is an example image of a third type 920 of a MIG welding torch tool.”):

the plurality of images containing at least a portion of a protrusion extending from a tip of a weldhead of the robotic welding system (Eldridge: Figures 11A-C, ¶ 0102, “While welding torches have the same basic parts (e.g., neck 902, gas cup 904, and wire 906), the actual shape and material of the parts 902, 904, 906 may vary significantly, which can make image processing difficult.”. The cited figures and passage show that the protrusion extending from the weldhead (i.e., the welding wire) is included in the images captured by the camera.),

wherein the weldhead is attached to a robot arm of the robotic welding system (Eldridge: ¶ 0033, “The robot manipulator is typically made up of two subsections, the body and arm 108 and the wrist 110. A tool 112 used by a robot 102 to perform desired tasks is typically attached at the wrist 110 of the robot manipulator 102.”, ¶ 0102, “FIGS. 9A-C show images 900, 910, 920 of example Metal-Inert Gas (MIG) welding torches. FIG. 9A is an example image of a first type 900 of a MIG welding torch tool. FIG. 9B is an example image of a second type 910 of a MIG welding torch tool. FIG. 9C is an example image of a third type 920 of a MIG welding torch tool. While welding torches have the same basic parts (e.g., neck 902, gas cup 904, and wire 906), the actual shape and material of the parts 902, 904, 906 may vary significantly, which can make image processing difficult.”.
The cited passages show that the tool can be a welder and said welder is attached to a robotic arm.);

(b) identifying by a controller of the robotic welding system the protrusion extending from the weldhead in the plurality of images (Eldridge: ¶ 0109, “Usually the TCP of the tool 1102 is defined to be where the wire exits the nozzle 1124, so in step 4 of the process for finding the TCP in the camera image 1000, the algorithm is really searching for the end of the gas cup of the tool 1124. For some embodiments, the TCP may alternatively be defined to be the actual end of the torch tool 1102 at the tip of the wire 1126.”. The cited passage shows that the welding wire can be identified instead of the end of the gas cup.);

(c) defining by the controller a longitudinal axis of the protrusion based on the protrusion identified in the plurality of images (Eldridge: Figure 11B, ¶ 0109, “FIG. 11B is an example image 1110 showing the sub-process for step 3 of refining the orientation 1116 of the tool 1102 by searching for the sides 1112 of the tool 1102 in the process for locating the TCP (1124 or 1126) of the tool 1102 on the camera image 1000. Step 3 to find a refined orientation 1116 of the tool 1102 of the process for finding the TCP (1124 or 1126) in the camera image 1000 is necessary because the neck of the torch tool 1102 may cause the fitted ellipse 1104 to have a slightly different orientation (i.e., rough orientation 1114) than the nozzle of the tool 1102.”. As can be seen from the cited figure and passage, this step involves finding a line that passes through the TCP and defines the orientation of the tool. One of ordinary skill in the art would see from the graphical representation in Figure 11B that this is a longitudinal axis.); and

(d) identifying by the controller a location of the weldhead based on the protrusion identified in the plurality of images and the defined longitudinal axis of the protrusion (Eldridge: ¶ 0109, “FIG. 11C is an example image 1120 showing the sub-process for step 4 of searching 1122 for the TCP (1124 or 1126) at the end of tool 1102 in the overall process for locating the TCP (1124 or 1126) of the tool 1102 on the camera image 1000. The search 1122 to the end of the tool 1102 for the TCP (1124 or 1126) may be performed by searching along the refined tool orientation 1116 for the TCP (1124 or 1126).”).

Eldridge does not teach (a) receiving a plurality of images captured from a plurality of image sensors of the robotic welding system, and wherein the plurality of image sensors is attached to the robot arm; and that the location of the weldhead is a location in three-dimensional space.

Huang, in the same field of endeavor, teaches (a) receiving a plurality of images captured from a plurality of image sensors of the robotic welding system (Huang: Column 2 lines 45-57, “As shown in FIG. 1, the system 1 for calibrating tool center point of robot includes a first image sensor 11, a second image sensor 12, and a controller 13. In an embodiment, the first image sensor 11 and the second image sensor 12 may be a camera or other similar devices.”, Column 9 lines 59-67 – Column 10 lines 1-10, “As shown in FIG. 7, the method of the system 1 for calibrating tool center point of robot includes the following steps. Step S71: Providing a first image sensor 11 having a first image central axis A. Step S72: Providing a second image sensor 12 having a second image central axis B not parallel to the first image central axis A and intersecting the first image central axis A at an intersection point I. Step S73: Controlling a robot R to repeatedly move the tool center point TCP of the tool T thereof between the first image central axis A and the second image central axis B. Step S74: Recording a calibration point including the coordinates of the joints J1-J6 of the robot R when the tool center point TCP overlaps the intersection point I. Step S75: Repeating the above steps to generate a plurality of the calibration points. Step S76: Calculating the coordinate of the tool center point TCP according to the calibration points.”. The cited passages show that a plurality of cameras are used and each camera captures an image of the tool at different points.), and that the location of the weldhead is a location in three-dimensional space (Huang: Column 8 lines 18-27, “After that, the coordinate (Tx, Ty, Tz) of the tool center point TCP in relation to the coordinate system of the flange facing F, and the coordinate (Px, Py, Pz) of the tool center point TCP in relation to the coordinate system (xR, yR, zR) of the robot R can be obtained. Finally, the calibration process of the tool center point TCP is finished.”).

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the robotic welding system taught in Eldridge with the plurality of cameras that take a plurality of images and the three-dimensional weldhead location taught in Huang, with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because it would have been obvious to try. The three-dimensional pose of a robot is commonly used to control robots in a variety of applications, and it is necessary to know the position and orientation of the tool so that the robot can properly carry out its task. A person of ordinary skill in the art would have had the technological capabilities and knowledge to recognize this and implement such considerations. No inventive effort would be required. Furthermore, the use of multiple cameras with differing viewpoints is a common method to get three-dimensional information from images.
A person of ordinary skill in the art would have had the technological capabilities and knowledge to implement multiple cameras to get three-dimensional information from the captured images. No inventive effort would be required.

Eldridge in view of Huang does not teach wherein the plurality of image sensors is attached to the robot arm. Fisher, in the same field of endeavor, teaches wherein the plurality of image sensors is attached to the robot arm (Fisher: Figures 2A-B, ¶ 0043, “FIGS. 2A and 2B illustrate an example robot 120 according to an example embodiment. Robot 120 may include a body 126, an end effector 122 holding a tool 123, and a camera 124.”. The cited passage and figures clearly show that the camera is coupled to the end effector and tool of the robot.).

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the method taught in Eldridge in view of Huang with the plurality of image sensors attached to the robot arm taught in Fisher, with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results. Both Eldridge and Huang teach the use of imaging devices to capture images of the tool of the robot, so modifying the system taught in Eldridge in view of Huang such that the imaging device is coupled to the end effector and tool as taught in Fisher would not change or introduce new functionality. Furthermore, such a modification only requires changing the location of a known sensor. No inventive effort would have been required.

Regarding claim 2, Eldridge in view of Huang in further view of Fisher teaches wherein the plurality of image sensors comprises a pair of cameras disposed on the weldhead and arranged stereoscopically in relation to the weldhead (Huang: Figure 4, Column 9 lines 59-67, “As shown in FIG. 7, the method of the system 1 for calibrating tool center point of robot includes the following steps. Step S71: Providing a first image sensor 11 having a first image central axis A. Step S72: Providing a second image sensor 12 having a second image central axis B not parallel to the first image central axis A and intersecting the first image central axis A at an intersection point I.”, Fisher: Figures 2A-B, ¶ 0043, “FIGS. 2A and 2B illustrate an example robot 120 according to an example embodiment. Robot 120 may include a body 126, an end effector 122 holding a tool 123, and a camera 124.”). Eldridge in view of Huang teaches a method of calibrating the tool center point of a robotic welding system using cameras stereoscopically arranged in relation to the weldhead. Fisher teaches wherein the camera is disposed on the weldhead. A person of ordinary skill in the art would have had the technological capabilities required to have attached the stereoscopically arranged cameras taught in Eldridge in view of Huang to the weldhead of the robot as taught in Fisher. Furthermore, such a modification would require simply changing the location of the cameras, a task well within the technological capabilities of a person of ordinary skill in the art. Such a modification would not have changed or introduced new functionality. No inventive effort would have been required. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, that the combination of Eldridge in view of Huang in further view of Fisher teaches the limitations of claim 2.
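Stereoscopic arrangements like the one addressed for claim 2 are commonly used to recover a 3-D tool-tip position by intersecting per-camera viewing rays. The sketch below is a generic ray-midpoint triangulation under hypothetical camera geometry; it illustrates the principle and is not Huang's specific joint-recording procedure:

```python
# Generic two-view triangulation sketch: each camera back-projects a ray
# through the detected wire tip; the 3-D tip estimate is the midpoint of
# the rays' closest-approach segment. Ray origins/directions are hypothetical.

def closest_point_between_rays(o1, d1, o2, d2):
    """Midpoint of the shortest segment joining two 3-D rays."""
    dot = lambda p, q: sum(a * b for a, b in zip(p, q))
    w0 = [a - b for a, b in zip(o1, o2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b          # zero only if the rays are parallel
    s = (b * e - c * d) / denom    # parameter along ray 1
    t = (a * e - b * d) / denom    # parameter along ray 2
    p = [o + s * u for o, u in zip(o1, d1)]
    q = [o + t * v for o, v in zip(o2, d2)]
    return [(x + y) / 2 for x, y in zip(p, q)]

# Two hypothetical camera rays that both pass through the wire tip (1, 2, 3):
tip = closest_point_between_rays((0, 0, 0), (1, 2, 3), (2, 0, 0), (-1, 2, 3))
print(tip)  # [1.0, 2.0, 3.0]
```

With noisy detections the two rays no longer intersect exactly, which is why the midpoint (rather than an exact intersection) is used; parallel rays, as from identically oriented cameras viewing at infinity, would make the denominator vanish and must be excluded.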
Regarding claim 4, Eldridge in view of Huang in further view of Fisher teaches wherein the protrusion comprises a welding wire (Eldridge: Figures 11A-C, ¶ 0102, “While welding torches have the same basic parts (e.g., neck 902, gas cup 904, and wire 906), the actual shape and material of the parts 902, 904, 906 may vary significantly, which can make image processing difficult.”, ¶ 0109, “Usually the TCP of the tool 1102 is defined to be where the wire exits the nozzle 1124, so in step 4 of the process for finding the TCP in the camera image 1000, the algorithm is really searching for the end of the gas cup of the tool 1124. For some embodiments, the TCP may alternatively be defined to be the actual end of the torch tool 1102 at the tip of the wire 1126.”).

Regarding claim 9, Eldridge in view of Huang in further view of Fisher teaches wherein (d) comprises identifying a pose in 3D space of the weldhead (Eldridge: ¶ 0074, “For some tools and processes, finding only the TCP relationship to the wrist frame is adequate. For example, if a touch probe extends directly along the joint axis of the last joint (i.e., the wrist), the orientation of the tool may be assumed to be equal to the orientation of the wrist.”, ¶ 0075, “One method of finding the tool orientation is to move the tool into a known orientation in the world coordinate frame. The wrist pose may then be recorded and the relative orientation between the tool and the wrist may be computed.”, Huang: Column 8 lines 18-27, “After that, the coordinate (Tx, Ty, Tz) of the tool center point TCP in relation to the coordinate system of the flange facing F, and the coordinate (Px, Py, Pz) of the tool center point TCP in relation to the coordinate system (xR, yR, zR) of the robot R can be obtained. Finally, the calibration process of the tool center point TCP is finished.” The two references in combination clearly teach determining the 3D pose of the weldhead.).
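The image-space procedure Eldridge is cited for (estimate the tool's orientation from its segmented outline, then search along that orientation for the tip) can be sketched in a few lines. This is a minimal 2-D approximation that assumes the tool pixels are already segmented and uses an image-moments principal axis in place of Eldridge's ellipse fit and side search:

```python
# 2-D sketch of an axis-then-tip search: fit the longitudinal axis of
# segmented tool pixels via second-order image moments, then take the
# pixel farthest along that axis as the tool center point (TCP).
# The pixel list below is hypothetical; real input would come from
# thresholding a camera image, which is outside this sketch.

import math

def locate_tcp(pixels):
    """pixels: list of (x, y) tool pixels. Returns (axis_angle, tcp_pixel)."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    # Second-order central moments give the principal (longitudinal) axis.
    mxx = sum((x - cx) ** 2 for x, _ in pixels) / n
    myy = sum((y - cy) ** 2 for _, y in pixels) / n
    mxy = sum((x - cx) * (y - cy) for x, y in pixels) / n
    angle = 0.5 * math.atan2(2 * mxy, mxx - myy)
    ax, ay = math.cos(angle), math.sin(angle)
    # The TCP candidate is the pixel with the largest projection onto the axis.
    tcp = max(pixels, key=lambda p: (p[0] - cx) * ax + (p[1] - cy) * ay)
    return angle, tcp

# Hypothetical segmented wire running along +x with its tip at (9, 0):
angle, tcp = locate_tcp([(x, 0) for x in range(10)])
print(round(angle, 3), tcp)  # 0.0 (9, 0)
```

The principal axis is sign-ambiguous, so a real implementation would disambiguate which end of the axis is the tip, e.g. from the known mounting side of the torch.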
Regarding claim 19, Eldridge teaches a system for calibrating a tool center point (TCP) of a robotic welding system comprising a weldhead attached to a robot arm, the system comprising (Eldridge: ¶ 0006, “An embodiment of the present invention may further comprise a computerized method for calculating a tool frame tool center point relative to a wrist-frame of a robot for a tool attached at a wrist of the robot using a camera…”, ¶ 0033, “The robot manipulator is typically made up of two subsections, the body and arm 108 and the wrist 110. A tool 112 used by a robot 102 to perform desired tasks is typically attached at the wrist 110 of the robot manipulator 102.”, ¶ 0102, “FIGS. 9A-C show images 900, 910, 920 of example Metal-Inert Gas (MIG) welding torches. FIG. 9A is an example image of a first type 900 of a MIG welding torch tool. FIG. 9B is an example image of a second type 910 of a MIG welding torch tool. FIG. 9C is an example image of a third type 920 of a MIG welding torch tool.”. The cited passages clearly teach that a weldhead is attached to a robot arm.):

a processor (Eldridge: ¶ 0134, “The computer may have computer accessible memory (e.g., hard drive, flash drive, RAM, etc.) to store information and/or programs needed to implement the algorithms/processes to find the tool-frame relative to the wrist-frame of the robot. The computer may send commands to and receive data from the robot and robot controller as necessary to find the relative tool-frame.”);

a non-transitory memory (Eldridge: ¶ 0134, “The computer may have computer accessible memory (e.g., hard drive, flash drive, RAM, etc.) to store information and/or programs needed to implement the algorithms/processes to find the tool-frame relative to the wrist-frame of the robot.”); and

an application stored in the non-transitory memory that, when executed by the processor (Eldridge: ¶ 0134, “The computer may have computer accessible memory (e.g., hard drive, flash drive, RAM, etc.) to store information and/or programs needed to implement the algorithms/processes to find the tool-frame relative to the wrist-frame of the robot.”):

the plurality of images containing at least a portion of a protrusion extending from a tip of the weldhead (Eldridge: Figures 11A-C, ¶ 0102, “While welding torches have the same basic parts (e.g., neck 902, gas cup 904, and wire 906), the actual shape and material of the parts 902, 904, 906 may vary significantly, which can make image processing difficult.”. The cited figures and passage show that the protrusion extending from the weldhead (i.e., the welding wire) is included in the images captured by the camera.);

identifies the protrusion extending from the weldhead in the plurality of images (Eldridge: ¶ 0109, “Usually the TCP of the tool 1102 is defined to be where the wire exits the nozzle 1124, so in step 4 of the process for finding the TCP in the camera image 1000, the algorithm is really searching for the end of the gas cup of the tool 1124. For some embodiments, the TCP may alternatively be defined to be the actual end of the torch tool 1102 at the tip of the wire 1126.”. The cited passage shows that the welding wire can be identified instead of the end of the gas cup.);

defines a longitudinal axis of the protrusion based on the protrusion identified in the plurality of images (Eldridge: Figure 11B, ¶ 0109, “FIG. 11B is an example image 1110 showing the sub-process for step 3 of refining the orientation 1116 of the tool 1102 by searching for the sides 1112 of the tool 1102 in the process for locating the TCP (1124 or 1126) of the tool 1102 on the camera image 1000. Step 3 to find a refined orientation 1116 of the tool 1102 of the process for finding the TCP (1124 or 1126) in the camera image 1000 is necessary because the neck of the torch tool 1102 may cause the fitted ellipse 1104 to have a slightly different orientation (i.e., rough orientation 1114) than the nozzle of the tool 1102.”. As can be seen from the cited figure and passage, this step involves finding a line that passes through the TCP and defines the orientation of the tool. One of ordinary skill in the art would see from the graphical representation in Figure 11B that this is a longitudinal axis.); and

identifies a location of the weldhead based on the protrusion identified in the plurality of images and the defined longitudinal axis of the protrusion (Eldridge: ¶ 0109, “FIG. 11C is an example image 1120 showing the sub-process for step 4 of searching 1122 for the TCP (1124 or 1126) at the end of tool 1102 in the overall process for locating the TCP (1124 or 1126) of the tool 1102 on the camera image 1000. The search 1122 to the end of the tool 1102 for the TCP (1124 or 1126) may be performed by searching along the refined tool orientation 1116 for the TCP (1124 or 1126).”).

Eldridge does not teach receiving a plurality of images captured from a plurality of image sensors attached to the robot arm, and that the location of the weldhead is a location in three-dimensional space.

Huang, in the same field of endeavor, teaches receiving a plurality of images captured from a plurality of image sensors (Huang: Column 2 lines 45-57, “As shown in FIG. 1, the system 1 for calibrating tool center point of robot includes a first image sensor 11, a second image sensor 12, and a controller 13. In an embodiment, the first image sensor 11 and the second image sensor 12 may be a camera or other similar devices.”, Column 9 lines 59-67 – Column 10 lines 1-10, “As shown in FIG. 7, the method of the system 1 for calibrating tool center point of robot includes the following steps. Step S71: Providing a first image sensor 11 having a first image central axis A. Step S72: Providing a second image sensor 12 having a second image central axis B not parallel to the first image central axis A and intersecting the first image central axis A at an intersection point I.
Step S73: Controlling a robot R to repeatedly move the tool center point TCP of the tool T thereof between the first image central axis A and the second image central axis B. Step S74: Recording a calibration point including the coordinates of the joints J1-J6 of the robot R when the tool center point TCP overlaps the intersection point I. Step S75: Repeating the above steps to generate a plurality of the calibration points. Step S76: Calculating the coordinate of the tool center point TCP according to the calibration points.”. The cited passages show that a plurality of cameras are used and each camera captures an image of the tool at different points.), and that the location of the weldhead is a location in three-dimensional space (Huang: Column 8 lines 18-27, “After that, the coordinate (Tx, Ty, Tz) of the tool center point TCP in relation to the coordinate system of the flange facing F, and the coordinate (Px, Py, Pz) of the tool center point TCP in relation to the coordinate system (xR, yR, zR) of the robot R can be obtained. Finally, the calibration process of the tool center point TCP is finished.”).

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the system for calibrating the robotic welding system taught in Eldridge with the plurality of cameras that take a plurality of images and the three-dimensional weldhead location taught in Huang, with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because it would have been obvious to try. The three-dimensional pose of a robot is commonly used to control robots in a variety of applications, and it is necessary to know the position and orientation of the tool so that the robot can properly carry out its task. A person of ordinary skill in the art would have had the technological capabilities and knowledge to recognize this and implement such considerations. No inventive effort would be required. Furthermore, the use of multiple cameras with differing viewpoints is a common method to get three-dimensional information from images. A person of ordinary skill in the art would have had the technological capabilities and knowledge to implement multiple cameras to get three-dimensional information from the captured images. No inventive effort would be required.

Eldridge in view of Huang does not teach a plurality of image sensors attached to the robot arm. Fisher, in the same field of endeavor, teaches a plurality of image sensors attached to the robot arm (Fisher: Figures 2A-B, ¶ 0043, “FIGS. 2A and 2B illustrate an example robot 120 according to an example embodiment. Robot 120 may include a body 126, an end effector 122 holding a tool 123, and a camera 124.”. The cited passage and figures clearly show that the camera is coupled to the end effector and tool of the robot.).

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the system for calibrating a tool center point of a robotic welding system taught in Eldridge in view of Huang with a plurality of image sensors attached to the robot arm taught in Fisher, with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because the combination would have yielded predictable results. Both Eldridge and Huang teach the use of imaging devices to capture images of the tool of the robot, so modifying the system taught in Eldridge in view of Huang such that the imaging device is coupled to the end effector and tool as taught in Fisher would not change or introduce new functionality. Furthermore, such a modification only requires changing the location of a known sensor. No inventive effort would have been required.

Regarding claim 23, Eldridge in view of Huang in further view of Fisher teaches wherein the application, when executed by the processor, identifies a location in three-dimensional (3D) space of the weldhead based on the protrusion identified in the plurality of images and the defined longitudinal axis of the protrusion (Eldridge: ¶ 0109, “FIG. 11C is an example image 1120 showing the sub-process for step 4 of searching 1122 for the TCP (1124 or 1126) at the end of tool 1102 in the overall process for locating the TCP (1124 or 1126) of the tool 1102 on the camera image 1000. The search 1122 to the end of the tool 1102 for the TCP (1124 or 1126) may be performed by searching along the refined tool orientation 1116 for the TCP (1124 or 1126).”, Huang: Column 8 lines 18-27, “After that, the coordinate (Tx, Ty, Tz) of the tool center point TCP in relation to the coordinate system of the flange facing F, and the coordinate (Px, Py, Pz) of the tool center point TCP in relation to the coordinate system (xR, yR, zR) of the robot R can be obtained. Finally, the calibration process of the tool center point TCP is finished.”).

Claim(s) 3 and 10 is/are rejected under 35 U.S.C. 103 as being unpatentable over US 2009/0118894 A1 ("Eldridge") in view of US 10926414 B2 ("Huang") in further view of US 2021/0065356 A1 ("Fisher") in further view of US 2018/0154518 A1 ("Rossano").

Regarding claim 3, Eldridge in view of Huang in further view of Fisher does not teach wherein (c) comprises identifying a trajectory in 3D space of the longitudinal axis of the protrusion.
Rossano, in the same field of endeavor, teaches identifying a trajectory in 3D space of the longitudinal axis of the protrusion (Rossano: ¶ 0047, “The first path 164 can correspond to a predetermined path the TCP 162 of the robot 106, such as, for example a path the robot 106 is planned or selected to take either before or during the operation of the robot 106, and can be depicted by the GUI 136 on the display 124 in a variety of manners.”).

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the robotic welding system taught in Eldridge in view of Huang in further view of Fisher with the trajectory planning taught in Rossano, with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because it would have been obvious to try. To facilitate the automation of a robotic system configured to carry out a task, it is necessary to define the robot’s trajectory in space. This is done so that the robot can move from a starting position to an ending position, avoid obstacles, or follow a path when the task itself requires one (e.g., welding). A person of ordinary skill in the art would have had the technological capabilities and knowledge to recognize the necessity of trajectory planning and be able to implement it in the robotic welding system taught in Eldridge in view of Huang in further view of Fisher. No inventive effort would have been required.

Regarding claim 10, Eldridge in view of Huang in further view of Fisher does not teach wherein the plurality of image sensors comprises at least a portion of a local sensor unit or a global sensor unit of the robotic welding system. Rossano, in the same field of endeavor, teaches wherein the plurality of image sensors comprises at least a portion of a local sensor unit or a global sensor unit of the robotic welding system (Rossano: ¶ 0023, “Optionally, the robot station 102 can also include one or more sensors 114 that can be used in connection with observing the robot station 102, including the robot 106 and workpieces or parts on which the robot 106 and/or end effector 108 are performing work. Examples of such sensors 114 can include imaging capturing devices, microphones, position sensors, proximity sensors, accelerometers, motion sensors, and/or force sensors among other types of sensors and sensing devices.”. As stated in the cited passage, the robot can be configured to use any number of the listed sensors.).

Therefore, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the robotic welding system taught in Eldridge in view of Huang in further view of Fisher with the sensor unit comprising a plurality of sensors, including image sensors, taught in Rossano. A person of ordinary skill in the art would have been motivated to make this modification because the resulting combined system would have yielded the predictable result of obtaining additional sensor information. All of the sensors listed are known in the art and have known outputs. Furthermore, a person of ordinary skill in the art would have had the technological capabilities to incorporate any one or combination of these sensors with the robotic welding system taught in Eldridge in view of Huang in further view of Fisher and be able to make use of the data the sensors output. Using the additional sensors would not change the function of the robotic welding system or introduce new functionality. Using the sensors in the robotic welding system would not cause the sensors themselves to function differently. No inventive effort would have been required.
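A "trajectory in 3D space of the longitudinal axis," as recited in claim 3, can generically be modeled as a 3-D line through triangulated wire points: the line passes through the centroid along the principal axis of the point cloud. The power-iteration sketch below is an assumed generic implementation, not the applicant's or Rossano's method, and the sample points are hypothetical:

```python
# Least-squares 3-D line fit: centroid plus dominant eigenvector of the
# point covariance, found by power iteration. Input points would come
# from triangulating wire detections across views (not shown here).

def fit_axis(points, iters=50):
    """Fit a 3-D line to `points`; returns (centroid, unit direction)."""
    n = len(points)
    c = [sum(p[i] for p in points) / n for i in range(3)]
    centered = [[p[i] - c[i] for i in range(3)] for p in points]
    # 3x3 covariance matrix of the centered points.
    cov = [[sum(q[i] * q[j] for q in centered) / n for j in range(3)]
           for i in range(3)]
    v = [1.0, 1.0, 1.0]                     # power-iteration start vector
    for _ in range(iters):
        v = [sum(cov[i][j] * v[j] for j in range(3)) for i in range(3)]
        norm = sum(x * x for x in v) ** 0.5
        v = [x / norm for x in v]           # converges to the principal axis
    return c, v

# Hypothetical wire points lying exactly on the line t * (1, 1, 0):
centroid, direction = fit_axis([(t, t, 0.0) for t in range(5)])
print(centroid, [round(x, 3) for x in direction])
```

The fitted direction, tracked over successive calibration cycles, is one way to represent how the wire's longitudinal axis moves through 3-D space.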
Claims 5, 7-8, 20, and 22 are rejected under 35 U.S.C. 103 as being unpatentable over US 2009/0118894 A1 ("Eldridge") in view of US 10926414 B2 ("Huang") in further view of US 2021/0065356 A1 ("Fisher") in further view of US 2011/0029132 A1 ("Nemmers").

Regarding claim 5, Eldridge in view of Huang in further view of Fisher does not teach wherein (b) comprises: (b1) annotating at least one of the plurality of images to indicate a base of the protrusion and a tip of the protrusion located opposite the base of the protrusion identified in the plurality of images. Nemmers, in the same field of endeavor, teaches wherein (b) comprises: (b1) annotating at least one of the plurality of images to indicate a base of the protrusion and a tip of the protrusion located opposite the base of the protrusion identified in the plurality of images (Nemmers: Figure 12, ¶ 0064, “In order to teach the user tool for the tandem welding tool 200, four points (i.e. targets) 206, 208, 210, 212 need to be located in three dimension space, as shown in FIG. 11.”, ¶ 0066, “A tool center point 214 based upon the wire 202 can be created from the direction of the reference line R1, the origin point 206 and a pre-defined distance (e.g. 17 mm) associated with the robotic tool. The process is repeated for the wire 204 using the points 210, 212 to generate reference line R2 and tool center point 216.” As can be seen from the cited figure and passages, the base of the welding wire and the tip (defined as the TCP) are annotated on the image.). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the robotic welding system taught in Eldridge in view of Huang in further view of Fisher with the method of annotating an image of the protrusion taught in Nemmers with a reasonable expectation of success.
One of ordinary skill in the art would have been motivated to make this modification because such a method of identifying the base and tip of the protrusion can be done automatically and within seconds (Nemmers: ¶ 0091, “Specifically, a user tool adjustment can be completed in two to three seconds and the user frames are taught automatically.”).

Regarding claim 7, Eldridge in view of Huang in further view of Fisher does not teach wherein (d) comprises identifying the location in 3D space of the weldhead based on a first projection of the protrusion captured in a first image of the plurality of images, a second projection of the protrusion captured in a second image of the plurality of images that is different from the first image, and on a known length extending between a base of the protrusion and a tip of the protrusion. Nemmers, in the same field of endeavor, teaches wherein (d) comprises identifying the location in 3D space of the weldhead based on a first projection of the protrusion captured in a first image of the plurality of images, a second projection of the protrusion captured in a second image of the plurality of images that is different from the first image, and on a known length extending between a base of the protrusion and a tip of the protrusion (Nemmers: Figure 12, ¶ 0064, “As more clearly shown in FIG. 10, while the robotic tool is positioned within the field of view of the image generating device, a portion of the robotic tool (i.e. parent tool) is initially located, as identified by an outline 205. In order to teach the user tool for the tandem welding tool 200, four points (i.e. targets) 206, 208, 210, 212 need to be located in three dimension space, as shown in FIG. 11.
Specifically, while the robotic tool is in the first position, the processor 14 analyzes the first image to locate the first point 206 and an XYZ coordinate representing a position where a pre-defined view line passes through a calibration plane defined during the calibration step 102. As a non-limiting example, the XYZ coordinate of the view line intersection, the first point 206, and the known focal point of the image generating device 22 are each disposed along a first three dimensional line. The processor 14 then repeats the analysis to locate points 208, 210, 212 in a fashion similar to that described above for the point 206.”, ¶ 0065, “The robotic tool is then rotated and the process is repeated to re-locate the points 206, 208, 210, 212 in the second image of the robotic tool.”, ¶ 0066, “A tool center point 214 based upon the wire 202 can be created from the direction of the reference line R1, the origin point 206 and a pre-defined distance (e.g. 17 mm) associated with the robotic tool. The process is repeated for the wire 204 using the points 210, 212 to generate reference line R2 and tool center point 216. A user tool frame can be calculated from the data points defined above. Specifically, a reference line R3 through the tool center points 214, 216 defines the X direction. The cross product of the reference line R1 and the reference line R3 defines the XZ plane. Since the origin point, the X axis, and the XZ plane are known, there is enough information to calculate the entire user tool (frame), as appreciated by one skilled in the art.” The cited passages show that the 3D position of the weldhead is determined by using the position of the base of the welding wire, the tip of the welding wire, and the known length of the wire. Furthermore, the two images used are different from each other, as the robot has been rotated between the first and second images.
One of ordinary skill in the art would see that a 2D image is a projection of the 3D object captured by the imaging device.). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the robotic welding system taught in Eldridge in view of Huang in further view of Fisher with the method of identifying the 3D location of the weldhead taught in Nemmers with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because such a method of identifying the 3D position of the weldhead can be done automatically and within seconds (Nemmers: ¶ 0091, “Specifically, a user tool adjustment can be completed in two to three seconds and the user frames are taught automatically.”).

Regarding claim 8, Eldridge in view of Huang in further view of Fisher does not teach wherein (d) comprises: (d1) triangulating a location in 3D space of a tip of the protrusion based on a first projection of the tip of the protrusion captured in a first image of the plurality of images and a second projection of the tip of the protrusion captured in a second image of the plurality of images that is different from the first image; and (d2) identifying the location of the tip of the weldhead based on the location in 3D space of the tip of the protrusion and on a known length extending between a base of the protrusion and the tip of the protrusion. Nemmers, in the same field of endeavor, teaches wherein (d) comprises: (d1) triangulating a location in 3D space of a tip of the protrusion based on a first projection of the tip of the protrusion captured in a first image of the plurality of images and a second projection of the tip of the protrusion captured in a second image of the plurality of images that is different from the first image (Nemmers: Figure 12, ¶ 0064, “As more clearly shown in FIG.
10, while the robotic tool is positioned within the field of view of the image generating device, a portion of the robotic tool (i.e. parent tool) is initially located, as identified by an outline 205. In order to teach the user tool for the tandem welding tool 200, four points (i.e. targets) 206, 208, 210, 212 need to be located in three dimension space, as shown in FIG. 11. Specifically, while the robotic tool is in the first position, the processor 14 analyzes the first image to locate the first point 206 and an XYZ coordinate representing a position where a pre-defined view line passes through a calibration plane defined during the calibration step 102. As a non-limiting example, the XYZ coordinate of the view line intersection, the first point 206, and the known focal point of the image generating device 22 are each disposed along a first three dimensional line. The processor 14 then repeats the analysis to locate points 208, 210, 212 in a fashion similar to that described above for the point 206.”, ¶ 0065, “The robotic tool is then rotated and the process is repeated to re-locate the points 206, 208, 210, 212 in the second image of the robotic tool.” The cited passages show that the 3D position of the weldhead is determined by using the position of the base of the welding wire and the tip of the welding wire. Furthermore, the two images used are different from each other, as the robot has been rotated between the first and second images.
One of ordinary skill in the art would see that a 2D image is a projection of the 3D object captured by the imaging device.); and (d2) identifying the location of the tip of the weldhead based on the location in 3D space of the tip of the protrusion and on a known length extending between a base of the protrusion and the tip of the protrusion (Nemmers: ¶ 0066, “A tool center point 214 based upon the wire 202 can be created from the direction of the reference line R1, the origin point 206 and a pre-defined distance (e.g. 17 mm) associated with the robotic tool. The process is repeated for the wire 204 using the points 210, 212 to generate reference line R2 and tool center point 216.” The cited passage clearly shows using the position of the base of the protrusion and the known length of the protrusion to determine the 3D position of the tip.). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the robotic welding system taught in Eldridge in view of Huang in further view of Fisher with the method of identifying the 3D location of the weldhead taught in Nemmers with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because such a method of identifying the 3D position of the weldhead can be done automatically and within seconds (Nemmers: ¶ 0091, “Specifically, a user tool adjustment can be completed in two to three seconds and the user frames are taught automatically.”).

Regarding claim 20, Eldridge in view of Huang in further view of Fisher does not teach wherein the application, when executed by the processor: annotates at least one of the plurality of images to indicate a base of the protrusion and a tip of the protrusion located opposite the base of the protrusion identified in the plurality of images.
Nemmers, in the same field of endeavor, teaches wherein the application, when executed by the processor: annotates at least one of the plurality of images to indicate a base of the protrusion and a tip of the protrusion located opposite the base of the protrusion identified in the plurality of images (Nemmers: Figure 12, ¶ 0064, “In order to teach the user tool for the tandem welding tool 200, four points (i.e. targets) 206, 208, 210, 212 need to be located in three dimension space, as shown in FIG. 11.”, ¶ 0066, “A tool center point 214 based upon the wire 202 can be created from the direction of the reference line R1, the origin point 206 and a pre-defined distance (e.g. 17 mm) associated with the robotic tool. The process is repeated for the wire 204 using the points 210, 212 to generate reference line R2 and tool center point 216.” As can be seen from the cited figure and passages, the base of the welding wire and the tip (defined as the TCP) are annotated on the image.). Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the system for calibrating a TCP of a robotic welding system taught in Eldridge in view of Huang in further view of Fisher with the method of annotating an image of the protrusion taught in Nemmers with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because such a method of identifying the base and tip of the protrusion can be done automatically and within seconds (Nemmers: ¶ 0091, “Specifically, a user tool adjustment can be completed in two to three seconds and the user frames are taught automatically.”).
Regarding claim 22, Eldridge in view of Huang in further view of Fisher does not teach wherein the application, when executed by the processor: triangulates a location in 3D space of a tip of the protrusion based on a first projection of the tip of the protrusion captured in a first image of the plurality of images and a second projection of the tip of the protrusion captured in a second image of the plurality of images that is different from the first image; and identifies the location of the tip of the weldhead based on the location in 3D space of the tip of the protrusion and on a known length extending between a base of the protrusion and the tip of the protrusion. Nemmers, in the same field of endeavor, teaches wherein the application, when executed by the processor: triangulates a location in 3D space of a tip of the protrusion based on a first projection of the tip of the protrusion captured in a first image of the plurality of images and a second projection of the tip of the protrusion captured in a second image of the plurality of images that is different from the first image (Nemmers: Figure 12, ¶ 0064, “As more clearly shown in FIG. 10, while the robotic tool is positioned within the field of view of the image generating device, a portion of the robotic tool (i.e. parent tool) is initially located, as identified by an outline 205. In order to teach the user tool for the tandem welding tool 200, four points (i.e. targets) 206, 208, 210, 212 need to be located in three dimension space, as shown in FIG. 11. Specifically, while the robotic tool is in the first position, the processor 14 analyzes the first image to locate the first point 206 and an XYZ coordinate representing a position where a pre-defined view line passes through a calibration plane defined during the calibration step 102. 
As a non-limiting example, the XYZ coordinate of the view line intersection, the first point 206, and the known focal point of the image generating device 22 are each disposed along a first three dimensional line. The processor 14 then repeats the analysis to locate points 208, 210, 212 in a fashion similar to that described above for the point 206.”, ¶ 0065, “The robotic tool is then rotated and the process is repeated to re-locate the points 206, 208, 210, 212 in the second image of the robotic tool.” The cited passages show that the 3D position of the weldhead is determined by using the position of the base of the welding wire and the tip of the welding wire. Furthermore, the two images used are different from each other, as the robot has been rotated between the first and second images. One of ordinary skill in the art would see that a 2D image is a projection of the 3D object captured by the imaging device.); and identifies the location of the tip of the weldhead based on the location in 3D space of the tip of the protrusion and on a known length extending between a base of the protrusion and the tip of the protrusion (Nemmers: ¶ 0066, “A tool center point 214 based upon the wire 202 can be created from the direction of the reference line R1, the origin point 206 and a pre-defined distance (e.g. 17 mm) associated with the robotic tool. The process is repeated for the wire 204 using the points 210, 212 to generate reference line R2 and tool center point 216.” The cited passage clearly shows using the position of the base of the protrusion and the known length of the protrusion to determine the 3D position of the tip.).
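The geometry in the Nemmers passages cited above (locating the same wire tip in two images taken with the tool rotated, plus a pre-defined wire length such as 17 mm) reduces to ray triangulation followed by an offset along the wire axis. A minimal sketch of that math follows; the function names, ray parameterization, and numeric values are illustrative assumptions, not taken from any cited reference.

```python
# Sketch of the two-step geometry: (1) triangulate a 3D point as the midpoint
# of the shortest segment between two back-projected camera rays, and
# (2) offset a base point along the wire axis by a known length.
import math

def _dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def _add(p, d, t=1.0):
    # p + t * d, componentwise
    return tuple(a + t * b for a, b in zip(p, d))

def triangulate(o1, d1, o2, d2):
    """Midpoint of the shortest segment between rays o1 + t*d1 and o2 + s*d2
    (closed-form closest-point triangulation; rays must not be parallel)."""
    w0 = _add(o1, o2, -1.0)                    # o1 - o2
    a, b, c = _dot(d1, d1), _dot(d1, d2), _dot(d2, d2)
    d, e = _dot(d1, w0), _dot(d2, w0)
    denom = a * c - b * b                      # zero only for parallel rays
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    p1, p2 = _add(o1, d1, t), _add(o2, d2, s)
    return tuple((x + y) / 2.0 for x, y in zip(p1, p2))

def tip_from_base(base, axis_dir, length):
    """Offset the base point along the wire axis by a known wire length
    (the pre-defined distance in Nemmers, e.g. 17 mm)."""
    n = math.sqrt(_dot(axis_dir, axis_dir))
    return _add(base, tuple(x / n for x in axis_dir), length)

# Two rays, one per image, back-projected through the imaged tip:
tip = triangulate((0.0, 0.0, 0.0), (1.0, 1.0, 0.0),
                  (2.0, 0.0, 0.0), (-1.0, 1.0, 0.0))
```

With noisy annotations the two rays rarely intersect exactly, which is why the midpoint construction (rather than a literal intersection) is the usual choice.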
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have combined the system for calibrating the TCP of a robotic welding system taught in Eldridge in view of Huang in further view of Fisher with the method of identifying the 3D location of the weldhead taught in Nemmers with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification because such a method of identifying the 3D position of the weldhead can be done automatically and within seconds (Nemmers: ¶ 0091, “Specifically, a user tool adjustment can be completed in two to three seconds and the user frames are taught automatically.”).

Claims 6 and 21 are rejected under 35 U.S.C. 103 as being unpatentable over US 2009/0118894 A1 ("Eldridge") in view of US 10926414 B2 ("Huang") in further view of US 2021/0065356 A1 ("Fisher") in further view of US 2011/0029132 A1 ("Nemmers") in further view of US 6603870 B2 ("Bascle").

Regarding claim 6, Eldridge in view of Huang in further view of Fisher in further view of Nemmers does not teach wherein (c) comprises: (c1) defining a first plane in a first image of the plurality of images based on the annotated base of the protrusion; (c2) defining a second plane in a second image of the plurality of images based on the annotated tip of the protrusion; and (c3) intersecting the first plane with the second plane to define the longitudinal axis of the protrusion.
Bascle, in the same field of endeavor, teaches wherein (c) comprises: (c1) defining a first plane in a first image of the plurality of images based on the annotated base of the protrusion (Bascle: Column 9 lines 47-67, “Apparatus for positioning or aligning a biopsy needle for proper insertion into the body of a patient from a selected point on a surface of the body, so as to enter in a straight line passing through a designated target region within the body, in conjunction with an imaging system utilizing radiation from a first source position for deriving a first radiographic image on a first image plane of a portion of the body including a first image of the selected point and a first image of the target region, the first source po

Prosecution Timeline

Mar 07, 2023: Application Filed
Mar 27, 2025: Non-Final Rejection — §103
Oct 08, 2025: Response Filed
Nov 03, 2025: Final Rejection — §103
Mar 27, 2026: Request for Continued Examination
Apr 10, 2026: Response after Non-Final Action

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602063: LOAD HANDLING SYSTEM AND LOAD HANDLING METHOD (2y 5m to grant; granted Apr 14, 2026)
Patent 12575900: Steerable Eversion Robot System and Method of Operating the Steerable Eversion Robot System (2y 5m to grant; granted Mar 17, 2026)
Patent 12552043: METHOD FOR CONTROLLING ROBOTIC ARM, ELECTRONIC DEVICE, AND COMPUTER-READABLE STORAGE MEDIUM (2y 5m to grant; granted Feb 17, 2026)
Patent 12472640: CONTROL METHOD AND SYSTEM FOR ARTICLE TRANSPORTATION BASED ON MOBILE ROBOT (2y 5m to grant; granted Nov 18, 2025)
Patent 12467759: VEHICLE WITH SWITCHABLE FORWARD AND BACKWARD CONFIGURATIONS, CONTROL METHOD, AND CONTROL PROGRAM (2y 5m to grant; granted Nov 11, 2025)
Based on this examiner's 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 67%
With Interview: 51% (-15.6%)
Median Time to Grant: 2y 6m
PTA Risk: Moderate
Based on 18 resolved cases by this examiner. Grant probability derived from career allow rate.
