Prosecution Insights
Last updated: April 19, 2026
Application No. 18/214,961

COBOT WELDING TRAJECTORY CORRECTION WITH SMART VISION

Final Rejection §103

Filed: Jun 27, 2023
Examiner: ROBARGE, TYLER ROGER
Art Unit: 3658
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: American Air Liquide, Inc.
OA Round: 2 (Final)

Grant Probability: 77% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 8m
Grant Probability with Interview: 86%

Examiner Intelligence

Career Allow Rate: 77% — above average (17 granted / 22 resolved; +25.3% vs TC avg)
Interview Lift: +9.1% — moderate (based on resolved cases with an interview)
Avg Prosecution: 2y 8m (typical timeline)
Total Applications: 56 across all art units (34 currently pending)

Statute-Specific Performance

§101: 13.6% (-26.4% vs TC avg)
§103: 56.7% (+16.7% vs TC avg)
§102: 12.3% (-27.7% vs TC avg)
§112: 16.2% (-23.8% vs TC avg)
Tech Center averages are estimates • Based on career data from 22 resolved cases

Office Action

§103
DETAILED ACTION

This Office Action is taken in response to Applicant's Amendment and Remarks filed on 08/07/2025 regarding Application No. 18/214,961, originally filed on 06/27/2023. Claims 1-6, 9 and 11-13 are pending for consideration.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

The applicant argues: "The examiner correctly states that Gneiting '781 fails to expressly disclose the multicolor light source and multi-color selectivity requires by instant claim 1. The examiner then proposes that Weis '764 remedy this deficiency. The Abstract and paragraph 28 are specifically cited as disclosing these limitations. Applicants cannot find these limitations in either of these citations… It is unclear which document the examiner is actually referencing, as paragraph 42 of Weis '764 is referenced, but the specification of Weis '764 only has 39 paragraphs. The examiner seems to be citing Weis '764 as having the term 'diffuse lighting system' indicate multi-color LEDs… It is therefore unclear to Applicants how over Gneiting '781 in view of Weis '764 renders the instant claims obvious. For this reason, this rejection is improper and should be vacated." [Remarks, pp. 5-7]

The examiner respectfully disagrees. Weis '764 was submitted in translation in the original file wrapper and was used in the non-final rejection. As shown in that foreign machine translation of record, Weis discloses both (1) a light source comprising a set of LEDs around the camera and (2) lighting for image processing that is selectable among multiple colors, with a teaching that one color (green) is preferred because it improves edge visibility. Weis discloses "a diffuse LED lighting system around the camera" in the Abstract, further disclosing "The diffuse lighting system (3) is made up of a set of LEDs (Light-Emitting Diode), positioned around the camera (4)" in ¶28.
In addition to this, Weis discloses "Lighting, for image processing, can be in the following colors: blue, white, green and red. Green lighting highlights the edges of the weld bead more than other colors" in ¶42. Claim 14 of Weis likewise recites that "the lighting is in the colors blue, green, red or white, preferably green with at least 500 lux of lighting."

Taken together, these passages teach a multi-color light source (a set of LEDs capable of providing blue, white, green, and red lighting) that is configured to have multi-color selectivity (the system can use any of the listed colors for image processing, with green specifically selected to highlight weld-bead edges). Under the broadest reasonable interpretation, Weis's set of LEDs, which can illuminate the workpiece in blue, white, green and red and which preferably uses green because it enhances edge detection, meets this limitation.

Applicant's difficulty in locating these teachings is likely due to the differences in paragraph numbering between various machine translations. As shown in the translation of Weis '764 previously entered into the record, the document includes dual paragraph identifiers (i.e., "[0042] [0036]"), which may cause a given passage to be labeled with one number in Espacenet and a different number in the supplied translation. [Image: excerpt from the Weis '764 translation of record showing the dual paragraph identifiers.] The fact that applicant relied on a different machine translation with only 39 numbered paragraphs does not negate the clear disclosure in the translation of Weis that is of record in this application.

Accordingly, Gneiting '781 in view of Weis '764 continues to teach or suggest the limitations of claim 1. Gneiting discloses the enclosure with the camera and light source positioned adjacent the welding torch and protected from fumes and spatter (as per ¶7, ¶10, ¶35).
Weis teaches the multi-color, color-selectable LED lighting around the camera for image processing and joint/edge detection (as per Abstract, ¶28, ¶42, Claim 14). It would have been obvious for one of ordinary skill in the art before the effective filing date to have modified the protective enclosure of Gneiting with the robot welding system as taught by Weis so that the welding vision system could take advantage of known benefits of different illumination colors (i.e., selecting green illumination to better highlight weld-bead edges), thereby improving edge detection and alignment in the robotic welding system. Therefore, the applicant's arguments are unpersuasive.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-4, 9, and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Gneiting (US Pub. No. 20120325781) in view of Weis (WO Pub. No. 2021127764).

As per Claim 1, Gneiting discloses a protective enclosure (as per Abstract), comprising:

a high-resolution camera (as per "A camera is disposed in the housing and removably coupled thereto. The camera has a lens oriented towards the closure member. The system further includes a light source disposed in the housing and removably coupled thereto such that the light source is disposed adjacent to the camera" in ¶7; as per "The image-recording device 234 can be a video camera, a digital camera, a film-based camera, or any other known image-recording device. For example, the image-recording device 234 can be an XC-56 camera manufactured by Sony Electronics Inc." in ¶39);

a means of dust and welding fume protection configured to automatically close (as per "compression of the foam or compressible material 218 forms a seal between the interior of the housing 200 and the outside environment. As a result, during a welding operation, for example, smoke, fumes, dust, spatter, etc. are unable to infiltrate the interior of the housing 200 due to the seal." in ¶38; as per "The closure member 214 can be mechanically pivoted between the open position 260 and closed position 250, and any position therebetween, by a cylinder assembly 220. The cylinder assembly 220 can be controlled hydraulically, electronically, pneumatically, or by any other known method." in ¶35) and protect the high-resolution camera and multi-color light source during a welding operation (as per "The enclosure further includes an image-recording device disposed within the housing and coupled thereto, the image-recording device including a lens oriented towards the bottom panel. A light source is disposed within the housing and coupled thereto, the light source being disposed adjacent the image-recording device and configured to emit light in a direction towards the bottom panel." in ¶10; as per "the image-recording device and light source can be substantially enclosed by the housing in the open and closed positions." in ¶12; as per "wherein, during the welding process, the camera and light source are enclosed by the housing and closure member." in Claim 1).

Gneiting fails to expressly disclose: a multi-color light source configured to have multi-color selectivity.

Weis discloses detecting and aligning joints in robotic welding (as per Abstract), a multi-color light source (as per "diffuse lighting system (3) is made up of a set of LEDs (Light-Emitting Diode), positioned around the camera (4)." in ¶28) configured to have multi-color selectivity (as per "Lighting, for image processing, can be in the following colors: blue, white, green and red. Green lighting highlights the edges of the weld bead more than other colors." in ¶42). In this way, Weis operates to generate references for robot alignment (¶42). Like Gneiting, Weis is concerned with robotic welding systems. It would have been obvious for one of ordinary skill in the art before the effective filing date to have modified the protective enclosure of Gneiting with the robot welding system as taught by Weis to enable another standard means of applying different color lighting to the workpiece (¶42).

As per Claim 2, the combination of Gneiting and Weis teaches or suggests all limitations of Claim 1. Gneiting further discloses a control system configured to control a power source and the movement of the robotic welding arm (as per "includes a controllable robot having a robotic arm and a welding device coupled to the robotic arm. The welding device is configured to perform the welding process. The system also includes a housing coupled to the robotic arm adjacent the welding device and a closure member pivotably coupled to one end of the housing." in ¶7; as per Fig. 1).

As per Claim 3, the combination of Gneiting and Weis teaches or suggests all limitations of Claim 2. Gneiting fails to expressly disclose a first object to be welded, wherein: the first object to be welded has a boundary; the first object to be welded is positioned on a background having uniform color, thereby producing a visual contrast between the boundary of the first object to be welded and the background; the robot welding arm is configured to be positioned such that the boundary is visible to the high-resolution camera; wherein the multi-color light source is configured to project light on the boundary; wherein the high-resolution camera is configured to detect the boundary, thereby producing an image; wherein the control system is configured to process the image to produce a 2D model of the first object to be welded.

See Claim 2 for teachings of Gneiting. Weis further discloses a first object to be welded, wherein:

the first object to be welded has a boundary (as per "edge identification and primitive identification methods are used. A border separates the object from the rest of the scene contained in the image" in ¶34);

the first object to be welded is positioned on a background having uniform color (Fig. 2), thereby producing a visual contrast between the boundary of the first object to be welded and the background (as per "Thresholding comprises a form of image segmentation that is based on the difference between the pixel values that make up different objects in an image. The most common methods are: binary thresholding, thresholding with mean smoothing, thresholding with Gaussian smoothing, thresholding by the Otsu method. Binary thresholding is the most common, where with the characteristics of the objects that you want to isolate, the image is segmented into two groups, being pixels with a gray level above a threshold and below the threshold." in ¶33; as per "Green lighting highlights the edges of the weld bead more than other colors." in ¶42);

the robot welding arm is configured to be positioned such that the boundary is visible to the high-resolution camera (as per "monocular camera (4) is a passive monocular digital camera intended to capture images of the joint in front of the welding torch at a known position." in ¶28; as per "the system (10) is coupled to the robot arm and the welding torch (6), images captured of the metal sheet with a top view at the front of the torch (7) are processed and the joint center line (8) identified, allowing automation of the welding process." in ¶27);

wherein the multi-color light source is configured to project light on the boundary (as per "diffuse lighting system (3) is made up of a set of LEDs (Light-Emitting Diode), positioned around the camera (4)." in ¶28);

wherein the high-resolution camera is configured to detect the boundary, thereby producing an image (as per "monocular camera (4) is a passive monocular digital camera intended to capture images of the joint in front of the welding torch at a known position." in ¶28);

wherein the control system is configured to process the image to produce a 2D model of the first object to be welded (as per "a) Capture of the image with a top view of the joint; b) Image processing by noise filtering: average filter with 11x11 or 5x5 mask, in which a convolution operation is performed to remove noise; c) … e) Processing for edge detection: Sobel algorithm that calculates the intensity gradients in the image; f) Identification of the edges of interest: the 4 most expressive are selected among all the edges found, characterizing the chamfer edges (9); g) Estimation of the chamfer center line: the chamfer center line is calculated based on the 4 lines found previously; h) Conversion of pixels into length units: from camera calibration, the relationship between pixels and millimeters is made based on the perspective projection model" in ¶42; as per "a. the conversion of primitives into chamfer dimension in the image domain (pixels) and then b. the conversion from pixels to the physical world (metric)." in ¶39).

In this way, Weis operates to generate references for robot alignment (¶42). Like Gneiting, Weis is concerned with robotic welding systems. It would have been obvious for one of ordinary skill in the art before the effective filing date to have modified the protective enclosure of Gneiting with the robot welding system as taught by Weis to enable another standard means of applying different color lighting to the workpiece (¶42). Such modification also allows the system to use edge/primitive identification to separate the object from the rest of the scene in the image (¶34).

As per Claim 4, the combination of Gneiting and Weis teaches or suggests all limitations of Claim 2.
Gneiting fails to expressly disclose a first object to be welded, wherein the first object to be welded comprises an edge, wherein: the robot welding arm is positioned such that the edge is visible to the high-resolution camera; the multi-color light source is configured to project light on the edge; software detects the edge of the first object in the image; the control system processes the image to produce a first 2D model of the first object to be welded.

See Claim 2 for teachings of Weis. Weis further discloses a first object to be welded, wherein the first object to be welded comprises an edge, wherein:

the robot welding arm is positioned such that the edge is visible to the high-resolution camera (as per "monocular camera (4) is a passive monocular digital camera intended to capture images of the joint in front of the welding torch at a known position." in ¶28; as per "the system (10) is coupled to the robot arm and the welding torch (6), images captured of the metal sheet with a top view at the front of the torch (7) are processed and the joint center line (8) identified, allowing automation of the welding process." in ¶27);

the multi-color light source is configured to project light on the edge (as per "diffuse lighting system (3) is made up of a set of LEDs (Light-Emitting Diode), positioned around the camera (4)." in ¶28);

software detects the edge of the first object in the image (as per "edge identification and primitive identification methods are used. A border separates the object from the rest of the scene contained in the image" in ¶34);

the control system processes the image to produce a first 2D model of the first object to be welded (as per "a) Capture of the image with a top view of the joint; b) Image processing by noise filtering: average filter with 11x11 or 5x5 mask, in which a convolution operation is performed to remove noise; c) … e) Processing for edge detection: Sobel algorithm that calculates the intensity gradients in the image; f) Identification of the edges of interest: the 4 most expressive are selected among all the edges found, characterizing the chamfer edges (9); g) Estimation of the chamfer center line: the chamfer center line is calculated based on the 4 lines found previously; h) Conversion of pixels into length units: from camera calibration, the relationship between pixels and millimeters is made based on the perspective projection model" in ¶42; as per "a. the conversion of primitives into chamfer dimension in the image domain (pixels) and then b. the conversion from pixels to the physical world (metric)." in ¶39).

In this way, Weis operates to generate references for robot alignment (¶42). Like Gneiting, Weis is concerned with robotic welding systems. It would have been obvious for one of ordinary skill in the art before the effective filing date to have modified the protective enclosure of Gneiting with the robot welding system as taught by Weis to enable another standard means of applying different color lighting to the workpiece (¶42). Such modification also allows the system to use edge/primitive identification to separate the object from the rest of the scene in the image (¶34).

As per Claim 9, the combination of Gneiting and Weis teaches or suggests all limitations of Claim 2.
Gneiting fails to expressly disclose two or more objects to be welded, wherein the two or more objects to be welded comprise two or more edges, wherein: the robot welding arm is positioned such that the two or more edges are visible to the high-resolution camera; the multi-color light source is configured to project light on the workpieces; the high-resolution camera detects the two or more edges of the two or more objects to be welded.

See Claim 2 for teachings of Weis. Weis further discloses two or more objects to be welded, wherein the two or more objects to be welded comprise two or more edges, wherein:

the robot welding arm is positioned such that the two or more edges are visible to the high-resolution camera (as per "monocular camera (4) is a passive monocular digital camera intended to capture images of the joint in front of the welding torch at a known position." in ¶28; as per "the system (10) is coupled to the robot arm and the welding torch (6), images captured of the metal sheet with a top view at the front of the torch (7) are processed and the joint center line (8) identified, allowing automation of the welding process." in ¶27);

the multi-color light source is configured to project light on the workpieces (as per "diffuse lighting system (3) is made up of a set of LEDs (Light-Emitting Diode), positioned around the camera (4)." in ¶28; as per "Lighting, for image processing, can be in the following colors: blue, white, green and red. Green lighting highlights the edges of the weld bead more than other colors." in ¶42);

the high-resolution camera detects the two or more edges of the two or more objects to be welded (Fig. 3; as per "Figure 3 shows an illustration of the image captured by the system with a top view of the joint (12), the identified chamfer edges (9), the end of the joint (11) and the calculated joint centerline (8)." in ¶27).

In this way, Weis operates to generate references for robot alignment (¶42).
Like Gneiting, Weis is concerned with robotic welding systems. It would have been obvious for one of ordinary skill in the art before the effective filing date to have modified the protective enclosure of Gneiting with the robot welding system as taught by Weis to enable another standard means of applying different color lighting to the workpiece (¶42). Such modification also allows the system to use edge/primitive identification to separate the object from the rest of the scene in the image (¶34).

As per Claim 11, the combination of Gneiting and Weis teaches or suggests all limitations of Claim 9. Gneiting fails to expressly disclose: the two or more objects to be welded each comprise two or more waypoints that define the welding path; software calculates a new trajectory for two or more objects to be welded.

See Claim 9 for teachings of Weis. Weis further discloses:

the two or more objects to be welded each comprise two or more waypoints that define the welding path (as per "Figure 3 shows an illustration of the image captured by the system with a top view of the joint (12), the identified chamfer edges (9), the end of the joint (11) and the calculated joint centerline (8)." in ¶27; as per "Generation of references for robot alignment: from the difference between the center of the camera and the center line of the calculated joint (8), the center of the chamfer." in ¶42);

software calculates a new trajectory for two or more objects to be welded (as per "a robotic or automated welding process for butt joints by means of a system and method for generating trajectory references for the robot." in Abstract; as per "robots can perform welding on parts without prior knowledge of the path to be taken." in ¶5; as per "Generation of references for robot alignment: from the difference between the center of the camera and the center line of the calculated joint" in ¶42).

In this way, Weis operates to generate references for robot alignment (¶42).
Like Gneiting, Weis is concerned with robotic welding systems. It would have been obvious for one of ordinary skill in the art before the effective filing date to have modified the protective enclosure of Gneiting with the robot welding system as taught by Weis to enable another standard means of applying different color lighting to the workpiece (¶42). Such modification also allows the system to use edge/primitive identification to separate the object from the rest of the scene in the image (¶34).

Claims 5-6 are rejected under 35 U.S.C. 103 as being unpatentable over Gneiting (US Pub. No. 20120325781) in view of Weis (WO Pub. No. 2021127764) in further view of Aldridge (US Pub. No. 20200368904).

As per Claim 5, the combination of Gneiting and Weis teaches or suggests all limitations of Claim 4. Gneiting and Weis fail to expressly disclose wherein a user generates a first trajectory. Aldridge discloses remote robotic welding (as per Abstract), wherein a user generates a first trajectory. In this way, Aldridge operates to improve access during welding (¶42). Like Gneiting and Weis, Aldridge is concerned with robotic systems. It would have been obvious for one of ordinary skill in the art before the effective filing date to have modified the protective enclosure of Gneiting and the robot welding system as taught by Weis with the remote robotic welding system of Aldridge to enable another standard means of facilitating remote manual welding using a handheld controller (¶140).

As per Claim 6, the combination of Gneiting, Weis, and Aldridge teaches or suggests all limitations of Claim 5. Gneiting fails to expressly disclose wherein software generates a second trajectory for the robotic welding arm including two or more waypoints. See Claim 5 for teachings of Weis. Weis further discloses wherein software generates a second trajectory for the robotic welding arm including two or more waypoints (as per "a robotic or automated welding process for butt joints by means of a system and method for generating trajectory references for the robot." in Abstract; as per "robots can perform welding on parts without prior knowledge of the path to be taken." in ¶5; as per "Generation of references for robot alignment: from the difference between the center of the camera and the center line of the calculated joint" in ¶42). In this way, Weis operates to generate references for robot alignment (¶42). Like Gneiting and Aldridge, Weis is concerned with robotic systems. It would have been obvious for one of ordinary skill in the art before the effective filing date to have modified the protective enclosure of Gneiting and the remote robotic welding system of Aldridge with the robot welding system as taught by Weis to enable another standard means of applying different color lighting to the workpiece (¶42). Such modification also allows the system to use edge/primitive identification to separate the object from the rest of the scene in the image (¶34).

Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Gneiting (US Pub. No. 20120325781) in view of Weis (WO Pub. No. 2021127764) in further view of Watanbe (US Pub. No. 20150224649).

As per Claim 12, the combination of Gneiting and Weis teaches or suggests all limitations of Claim 3. Gneiting fails to expressly disclose: detecting a 2D edge of a second object to be welded; calculating the displacement between the first object and the second object wherein the minimum measurable displacement between the first object to be welded and the second object to be welded is 0.5 mm in x and y direction, and 0.1 degree for rotation error.

See Claim 3 for teachings of Weis. Weis further discloses: detecting a 2D edge of a second object to be welded (as per Figure 3; as per "edge identification and primitive identification methods are used. A border separates the object from the rest of the scene contained in the image" in ¶34). In this way, Weis operates to generate references for robot alignment (¶42). Like Gneiting and Watanbe, Weis is concerned with robotic systems. It would have been obvious for one of ordinary skill in the art before the effective filing date to have modified the protective enclosure of Gneiting and the robot system of Watanbe with the robot welding system as taught by Weis to enable another standard means of applying different color lighting to the workpiece (¶42). Such modification also allows the system to use edge/primitive identification to separate the object from the rest of the scene in the image (¶34).

Weis and Gneiting fail to expressly disclose: calculating the displacement between the first object and the second object wherein the minimum measurable displacement between the first object to be welded and the second object to be welded is 0.5 mm in x and y direction, and 0.1 degree for rotation error.

Watanbe discloses a robot system using visual feedback (as per Abstract), comprising: calculating the displacement between the first object and the second object wherein the minimum measurable displacement between the first object to be welded and the second object to be welded is 0.5 mm in x and y direction (as per "it is determined whether the current robot position is in the target arrival state (S204). This determination may be made by comparing the magnitude of the amount of robot movement calculated at S203 with the previously set threshold. For example, with a threshold of 0.5 mm, if the calculated amount of robot movement is less than 0.5 mm it is possible to determine that the robot is in the target arrival state." in ¶53), and 0.1 degree for rotation error (as per "As for Δθ, i.e., the apparent orientation change of the object 4 in the image, since the Z-axis of the tool coordinate system Σt is perpendicular to the table 5 surface, the robot 1 may be rotated by Δθ about the Z-axis of the tool coordinate system." in ¶102; as per "when the difference between at least one feature quantity among the position, attitude and size of the object placed at the second object position… is greater than a predetermined value, the robot movement amount calculator calculates the amount of movement by regarding the second robot position as a revised initial position, … is set as a revised second robot position.").

In this way, Watanbe operates to correct the operation of a robot using a camera (¶2). Like Gneiting and Weis, Watanbe is concerned with robotic systems. It would have been obvious for one of ordinary skill in the art before the effective filing date to have modified the protective enclosure of Gneiting and the robot welding system as taught by Weis with the robot system of Watanbe to enable another standard means of performing correct operations on the object when the object is placed at a position different from that taught by the program (¶3).

Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Gneiting (US Pub. No. 20120325781) in view of Weis (WO Pub. No. 2021127764) in view of Watanbe (US Pub. No. 20150224649) in further view of Kuzmin (US Pub. No. 20190080446).

As per Claim 13, the combination of Gneiting, Weis, and Watanbe teaches or suggests all limitations of Claim 12. Gneiting, Weis, and Watanbe fail to expressly disclose wherein a first 2D model is compared to the edge of the second object, and any variation in size or shape is reported. Kuzmin discloses automated defect detection (as per Abstract), wherein a first 2D model is compared to the edge of the second object, and any variation in size or shape is reported (as per "[T]he optical device 106 may capture a first image of a surface edge of a first hole 402 on a first workpiece having no detectable defects. The optical device 106 may capture a second image of a surface edge of a first hole 402 on a second workpiece having FOD at a location on the surface edge thereof. The system may compare the first image with the second image to detect the presence of FOD on the second workpiece." in ¶30; as per "At step 722, the second image is compared with the first image to confirm the first feature of the second workpiece is in compliance with the specification." in ¶40; as per "detect a defect in the three-dimensional workpiece when an actual dimension of a feature of the features is not identical to a desired dimension of the feature;" in Claim 13).

In this way, Kuzmin operates to detect defects (Abstract). Like Gneiting, Watanbe, and Weis, Kuzmin is concerned with robotic systems. It would have been obvious for one of ordinary skill in the art before the effective filing date to have modified the protective enclosure of Gneiting, the robot system of Watanbe, and the robot welding system as taught by Weis with the automated defect detection of Kuzmin to enable another standard means of detecting a change in size or shape of a workpiece (Claim 13).

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action.
In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to TYLER R ROBARGE whose telephone number is (703) 756-5872. The examiner can normally be reached Monday - Friday, 8:00 am - 5:00 pm EST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Ramón Mercado, can be reached at (571) 270-5744. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/T.R.R./
Examiner, Art Unit 3658

/TRUC M DO/
Primary Examiner, Art Unit 3658
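For context, the edge-detection steps quoted from Weis ¶42 (noise filtering, Sobel gradients, then selecting the strongest edges of the weld joint) follow a standard pipeline. The sketch below is illustrative only, not code from any cited reference: it implements the Sobel gradient-magnitude step in plain NumPy and runs it on a synthetic "seam" image (two bright plates separated by a dark gap), which is an assumption made purely for demonstration.

```python
import numpy as np

def sobel_edges(img):
    """Gradient magnitude via 3x3 Sobel kernels (valid region only)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T  # vertical-gradient kernel
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(patch * kx)  # horizontal intensity gradient
            gy[i, j] = np.sum(patch * ky)  # vertical intensity gradient
    return np.hypot(gx, gy)

# Synthetic joint: two bright plates (200) with a dark seam (20) at columns 3-4.
img = np.full((8, 8), 200.0)
img[:, 3:5] = 20.0
mag = sobel_edges(img)
# Gradient magnitude is zero on the flat plates and peaks at the seam edges;
# a real system would next select the strongest edges and fit the centerline.
```

In the pipeline Weis describes, this magnitude map would be thresholded, the four most expressive edges kept as the chamfer edges, and the centerline between them converted from pixels to millimeters via camera calibration.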

Prosecution Timeline

Jun 27, 2023
Application Filed
Apr 03, 2025
Non-Final Rejection — §103
Aug 07, 2025
Response Filed
Nov 23, 2025
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12583117 — WORKPIECE PROCESSING APPARATUS
Granted Mar 24, 2026 • 2y 5m to grant

Patent 12552029 — CONTROLLING MOVEMENT TO AVOID RESONANCE
Granted Feb 17, 2026 • 2y 5m to grant

Patent 12485922 — SYSTEM AND METHOD FOR MODIFYING THE LONGITUDINAL POSITION OF A VEHICLE WITH RESPECT TO ANOTHER VEHICLE TO INCREASE PRIVACY
Granted Dec 02, 2025 • 2y 5m to grant

Patent 12459129 — METHOD FOR MOTION OPTIMIZED DEFECT INSPECTION BY A ROBOTIC ARM USING PRIOR KNOWLEDGE FROM PLM AND MAINTENANCE SYSTEMS
Granted Nov 04, 2025 • 2y 5m to grant

Patent 12456343 — SYSTEMS AND METHODS FOR SUPPLYING ENERGY TO AN AUTONOMOUS VEHICLE VIA A VIRTUAL INTERFACE
Granted Oct 28, 2025 • 2y 5m to grant
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 77%
With Interview: 86% (+9.1%)
Median Time to Grant: 2y 8m
PTA Risk: Moderate
Based on 22 resolved cases by this examiner. Grant probability derived from career allow rate.
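The 86% with-interview figure is consistent with treating the reported +9.1% interview lift as additive percentage points on the 77% base rate. This is an assumption about how the tool combines the two numbers, shown here only as a sanity check on the arithmetic:

```python
# Assumption: the interview lift is additive on the base grant probability.
base_grant = 0.77        # career allow rate from the report
interview_lift = 0.091   # reported lift for resolved cases with an interview
with_interview = base_grant + interview_lift  # 0.861
print(f"{with_interview:.0%}")  # prints "86%"
```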
