Prosecution Insights
Last updated: April 19, 2026
Application No. 17/900,510

Welding Line Detection System

Final Rejection — §103, §112
Filed: Aug 31, 2022
Examiner: CAMMARATA, MICHAEL ROBERT
Art Unit: 2667
Tech Center: 2600 — Communications
Assignee: Daihen Corporation
OA Round: 4 (Final)
Grant Probability: 70% (Favorable)
Expected OA Rounds: 5-6
Time to Grant: 2y 4m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 70% — above average (213 granted / 305 resolved; +7.8% vs TC avg)
Interview Lift: +35.9% allow-rate lift for resolved cases with interview vs. without
Typical Timeline: 2y 4m avg prosecution; 46 currently pending
Career History: 351 total applications across all art units
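The headline allow rate follows directly from the raw counts on this page. A minimal sketch of the arithmetic (the granted/resolved counts are from the card above; the rounding to a displayed 70% is an assumption about how the dashboard formats the figure):

```python
# Career allow rate recomputed from the counts shown above
# (213 granted of 305 resolved).
granted, resolved = 213, 305
allow_rate = granted / resolved
print(f"{allow_rate:.1%}")   # 69.8%, displayed on the card as 70%
```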

Statute-Specific Performance

§101: 4.5% (-35.5% vs TC avg)
§103: 45.8% (+5.8% vs TC avg)
§102: 21.1% (-18.9% vs TC avg)
§112: 24.6% (-15.4% vs TC avg)
Deltas are relative to the Tech Center average estimate • Based on career data from 305 resolved cases
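If each delta is the examiner's per-statute rate minus the Tech Center average (an assumption about how these figures are computed), the implied TC baseline can be back-solved from the four rows above; notably, every row recovers the same ~40% baseline:

```python
# Back-solve the implied Tech Center average from each statute row,
# assuming delta = examiner_rate - tc_avg (in percentage points).
rows = {
    "§101": (4.5, -35.5),
    "§103": (45.8, 5.8),
    "§102": (21.1, -18.9),
    "§112": (24.6, -15.4),
}
for statute, (rate, delta) in rows.items():
    tc_avg = rate - delta
    print(f"{statute}: implied TC avg = {tc_avg:.1f}%")  # 40.0% for every row
```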

Office Action

Rejections: §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant’s arguments with respect to claim 1 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. See Hirayama, which is now also applied to amended claim 1 to meet the amended features of detecting and presenting plural candidate welding lines. Although Hirayama was applied in the last Office Action to demonstrate obviousness of displaying/presenting a welding line to a user (see rejection of claim 7), no argument was presented against this prior art reference; as such, no further response is necessary.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f): (f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof. The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph: An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof. The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art.
The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph: (A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function; (B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and (C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function. Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 
112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function. Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: program production unit in claims 4 and 8. In contrast to claim 1, which recites structure (processor and memory) for performing various functions, claims 4 and 8 recite no such structure and one of ordinary skill would not recognize a program production “unit” to denote a structural element. Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C.
112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b): (b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention. The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph: The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention. Claims 4 and 8 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Claim 4 recites “a program production unit that produces a work program for performing welding on the basis of the selected welding line” and claim 8 recites “specifying, in the three-dimensional orthogonal coordinate system, a welding line based on the selected welding line”. There is no antecedent basis for the “selected” welding line as no selection is performed in the claims.

Claim Rejections - 35 USC § 103

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action. Claims 1, 5, 7, 9, and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Yoon (KR 102125167 B1), Kim (US 20210325541 A1), Becker (US 2014/0272835 A1), and Hirayama (US 20240331575 A1).
A marked-up machine translation of Yoon has previously been provided with a prior office action; all cross-references are with respect to this translation and the mark-ups are hereby incorporated by reference to further demonstrate claim mapping.

Claim 1

In regards to claim 1, Yoon discloses a welding line detection system {see abstract, Figs. 2, 7 and cites below} comprising: a terminal comprising an image sensor that generates a digital image of an object to be welded {Fig. 1 (copied below) shows automated welding apparatus 100 including an image acquisition module 105 as further discussed on pg. 4}, and a distance measurement sensor for measuring a distance from the distance measurement sensor to the object to be welded, [Image: media_image1.png] a processor and a memory configured to store instructions {pg. 3 discloses hardware and software implementations both of which include processor and memory} which, when executed, cause the processor to detect a position of a marker included in the digital image and generate a three-dimensional orthogonal coordinate system based on the position of the marker, {pgs. 4-5 the same marker is detected in the raw image and depth information such that, accordingly, accurate 3D information may be provided on the welding object WO and 3D welding line on the object including a specific position of the marker based on the digital image}; acquire point group data of the object to be welded from the distance measurement sensor and plot the acquired point group data in the three-dimensional orthogonal coordinate system, wherein a specific position is set as an origin of the three-dimensional orthogonal coordinate system {depth obtaining unit 120, Fig. 2, pgs.
4-5, acquires point group data of the object to be welded and provides (plots) the depth data (acquired point group data) in the three-dimensional orthogonal coordinate system while noting that coordinate systems have an origin set to “a specific position” as broadly claimed}. Becker is a highly relevant and analogous reference from the same field of welding and solves a similar problem of calibrating a welding system using markers. Moreover, Becker teaches an AR (augmented reality) system for welding training that uses markers. See Figs. 1, 2, 4, 23, including a camera 16, [0034]-[0035]. Becker also teaches detecting a position of a marker included in the digital image, the marker being fixedly disposed on a work surface, and detecting a specific position of the marker based on the digital image {See Fig. 4 copied below and [0055], including markers 95, 96 attached to welding surface 88 (work surface), imaged by camera 16 and detected and used to calibrate/determine the 3-D position and orientation of the welding surface 88 relative to the camera 16. See also [0095]-[0097] in which an augmented reality simulation may display a virtual weld bead on the display device such that the weld bead appears to be on the workpiece.} [Image: media_image2.png] It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Yoon, which already detects a position of a marker included in the digital image, generates a three-dimensional orthogonal coordinate system based on the position of the marker, and detects a specific position of the marker based on the digital image, such that the marker is fixedly disposed on a work surface on which the object to be welded is disposed, because doing so permits calibration of the coordinate system of the welding surface (work surface) relative to the camera as motivated by Becker in [0055].
Kim is an exemplary citation being provided to back up the challenged Official Notice utilized in the last Office Action. Kim is reasonably pertinent to the problem being solved by teaching a sensor structure particularly adapted to gathering both visible light images and depth images (distance measurements) in a compact, integrated structure. See Figs. 1, 3A and cites below. Kim also teaches wherein the image sensor and the distance measurement sensor are disposed in a known relationship and acquire data at the same time. See Figs. 1, 3A illustrating hybrid sensor 106, 300A in which RGB (Red, Green, Blue) sensors are disposed in a common array along with IR/depth sensors 304 disposed in a known relationship and detecting IR for sensing distance/depth, [0005], [0024], [0049], [0082], [0139], using a Bayer filter in a 3D sensing system that simplifies alignment between the image and depth sensors as per [0048]. It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Yoon’s image acquisition module 105 and depth acquisition module such that the image sensor and the distance measurement sensor are disposed in a known relationship and acquire data at the same time as taught by Kim, because doing so increases the accuracy of the combined image and depth data; because doing so simplifies alignment between the image and depth sensors as motivated by Kim in [0048]; because there is a reasonable expectation of success; and/or because doing so merely combines prior art elements according to known methods to yield predictable results.
Yoon detects a welding line of the object to be welded based upon the point group data plotted in the three-dimensional orthogonal coordinate system. Yoon is not relied upon to disclose that the detected welding line is a plurality of candidate lines or presenting the plurality of candidate welding lines to a user. Hirayama is an analogous reference from the same field of determining a welding line. See Abstract, Figs. 1, 2 including welding operation creation unit 551, 3D calculation unit 54, [0002]-[0003], [0034]. Hirayama also teaches a display unit that displays the image; and a control unit that controls a content to be displayed on the display unit and wherein the control unit causes the display unit to display the welding line detected by the welding line detection unit, with the welding line being superimposed on the image {see monitor MN1, [0050], Fig. 5, display welding line St2, Fig. 8, superimposed welding lines WS11, WS12, Figs. 9-20, [0060]-[0064], [0114]-[0117], [0131]-[0133]}. Hirayama further teaches detecting a welding line that is a plurality of candidate lines and presenting the plurality of candidate welding lines to a user {See above cites while noting that at least two “candidate” welding lines WS11 and WS12; WS741, WS742, WS743, WS744 are detected and presented/displayed to a user. See Figures 8 and 25 copied below in which plural candidate welding lines WS11 and WS12 are detected and presented/displayed to a user in the form of planar display or stereoscopic display. Further as to “candidate” welding line see [0304]-[0307] in which an operator selects one welding line WS741 among several other detected and displayed welding lines WS742, WS743, WS744 as shown in Fig. 25 such that the system can be further calibrated and set-up for welding operations specific to each of the plural candidate welding lines}.
[Images: media_image3.png, media_image4.png]

It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Yoon, which already detects a welding line of the object to be welded based upon the point group data plotted in the three-dimensional orthogonal coordinate system, such that the detected welding line is a plurality of candidate welding lines and the plurality of candidate welding lines is presented to a user as taught by Hirayama, because doing so permits the system to be further calibrated and set-up for welding operations specific to each of the plural candidate welding lines as motivated by Hirayama; because there is a reasonable expectation of success; and/or because doing so merely combines prior art elements according to known methods to yield predictable results.

Claim 5

In regards to claim 5, Yoon is not relied upon to disclose but Becker teaches wherein the marker is an AR marker {See [0055], including markers 95, 96 attached to welding surface 88 (work surface), imaged by camera 16 and detected and used to calibrate/determine the 3-D position and orientation of the welding surface 88 relative to the camera 16. See also [0095]-[0097] in which the markers and the 3-D positions they establish are used as AR markers in an augmented reality simulation that may display a virtual weld bead on the display device such that the weld bead appears to be on the workpiece. See also augmented reality mode 252, virtual reality simulation 260, Figs.
14, 15, 22, 23, 24, [0093]-[0098]}. It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Yoon, which already detects a position of a marker included in the digital image, generates a three-dimensional orthogonal coordinate system based on the position of the marker, and detects a specific position of the marker based on the digital image, such that the marker is fixedly disposed on a work surface on which the object to be welded is disposed, because doing so permits calibration of the coordinate system of the welding surface (work surface) relative to the camera as motivated by Becker in [0055]; and such that the marker is an AR marker as also taught by Becker, because using the markers as AR markers advantageously enables the AR markers and the 3-D positions they establish for use in an augmented reality simulation that may display a virtual weld bead on the display device such that the weld bead appears to be on the workpiece as motivated by Becker; because AR markers are equivalent, conventional alternatives the application of which to determining a coordinate system is not only well known but highly predictable and advantageous by enabling an AR support image and environment such that the operator can intuitively grasp the position and orientation of the end effector while welding; because there is a reasonable expectation of success; and/or because doing so merely combines prior art elements according to known methods to yield predictable results.

Claim 7

In regards to claim 7, Yoon is not relied upon to disclose wherein the photographing terminal further includes the claimed display unit and control unit. Hirayama is an analogous reference from the same field of determining a welding line. See Abstract, Figs.
1, 2 including welding operation creation unit 551, 3D calculation unit 54, [0002]-[0003], [0034]. Hirayama also teaches a display unit that displays the image; and a control unit that controls a content to be displayed on the display unit and wherein the control unit causes the display unit to display the candidate welding line superimposed on the digital image of the object to be welded {see monitor MN1, [0050], Fig. 5, display welding line St2, Fig. 8, superimposed welding lines WS11, WS12, Figs. 8-25, [0060]-[0064], [0114]-[0117], [0131]-[0133]. See above cites while noting that at least two “candidate” welding lines WS11 and WS12; WS741, WS742, WS743, WS744 are detected and presented/displayed to a user. See Figures 8 and 25 copied above in which plural candidate welding lines WS11 and WS12 are detected and presented/displayed to a user in the form of planar display or stereoscopic display. Further as to “candidate” welding line see [0304]-[0307] in which an operator selects one welding line WS741 among several other detected and displayed welding lines WS742, WS743, WS744 as shown in Fig. 25 such that the system can be further calibrated and set-up for welding operations specific to each of the plural candidate welding lines.}.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Yoon to include a display and such that the control unit causes the display unit to display the candidate welding line superimposed on the digital image of the object to be welded as taught by Hirayama, because such superimposed welding lines in the AR environment provide an advantageous teaching aid to the apprentice welder thereby increasing welding accuracy and productivity; because doing so permits the system to be further calibrated and set-up for welding operations specific to each of the plural candidate welding lines as motivated by Hirayama; because there is a reasonable expectation of success; and/or because doing so merely combines prior art elements according to known methods to yield predictable results.

Claims 9 and 10

Yoon is not relied upon to disclose but Hirayama teaches wherein the instructions, when executed, further cause the processor to: (claim 9) receive a user input selection of one of the plurality of candidate welding lines {See above cites while noting that at least two “candidate” welding lines WS11 and WS12; WS741, WS742, WS743, WS744 are detected and presented/displayed to a user. See Figures 8 and 25 copied above in which plural candidate welding lines WS11 and WS12 are detected and presented/displayed to a user in the form of planar display or stereoscopic display. Further as to “candidate” welding line see [0304]-[0307] in which an operator selects one welding line WS741 among several other detected and displayed welding lines WS742, WS743, WS744 as shown in Fig.
25 such that the system can be further calibrated and set-up for welding operations specific to each of the plural candidate welding lines}; and (claim 10) in response to the received user input, set the selected one of the plurality of candidate welding lines as a selected welding line of the object to be welded {Figs. 1, 2 including welding operation creation unit 551, 3D calculation unit 54, [0002]-[0003], [0034], [0043]-[0054]; host device 1 controls welding executed by welding robot MC1 based on the welding execution commands derived from the welding lines including the welding line WS741 among several other detected and displayed welding lines WS742, WS743, WS744 as shown in Fig. 25 via the selected welding operation program}. It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Yoon, which already detects a welding line of the object to be welded based upon the point group data plotted in the three-dimensional orthogonal coordinate system, such that the detected welding line is a plurality of candidate welding lines and the plurality of candidate welding lines is presented to a user, to receive a user input selection of one of the plurality of candidate welding lines, and, in response to the received user input, to set the selected one of the plurality of candidate welding lines as a selected welding line of the object to be welded as taught by Hirayama, because doing so permits the system to be further calibrated and set-up for welding operations specific to each of the plural candidate welding lines as motivated by Hirayama; because there is a reasonable expectation of success; and/or because doing so merely combines prior art elements according to known methods to yield predictable results.

Claims 2-4 are rejected under 35 U.S.C.
103 as being unpatentable over Yoon, Becker, Kim, and Hirayama as applied to claim 1 above, and further in view of Gilliland (US 5999642 A).

Claim 2

In regards to claim 2, Yoon is not relied upon to disclose wherein the instructions, when executed, further cause the processor to determine, on the basis of the point group data, a plurality of surfaces corresponding to the object to be welded, and determine a line of intersection between two surfaces included in the plurality of surfaces to be a candidate welding line. Gilliland is an analogous reference from the same field of welding line detection including a photographing terminal that photographs an image of an object to be welded {Fig. 2 scanning heads 16A, 16B, Fig. 3, column 11, line 60—column 12, line 47}; a coordinate system setting unit that sets a user coordinate system based on a marker included in the photographed image {see column 1, lines 23-40; column 9, lines 7-23}. Gilliland also teaches wherein the instructions, when executed, further cause the processor to determine, on the basis of the point group data, a plurality of surfaces corresponding to the object to be welded, and determine a line of intersection between two surfaces included in the plurality of surfaces to be a candidate welding line {see column 11, lines 19-; Figs. 7A-D column 15, line 32—column 16, line 67; Figs. 8A-B column 17, lines 19-57}.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Yoon to include wherein the instructions, when executed, further cause the processor to determine, on the basis of the point group data, a plurality of surfaces corresponding to the object to be welded, and determine a line of intersection between two surfaces included in the plurality of surfaces to be a candidate welding line as taught by Gilliland, because doing so enables a robot to perform automatic welding on the intersection line as per Fig. 3, step 319 and Fig. 7D thereby increasing efficiency and reducing manual labor; because there is a reasonable expectation of success; and/or because doing so merely combines prior art elements according to known methods to yield predictable results.

Claim 3

In regards to claim 3, Yoon is not relied upon to disclose but Gilliland teaches wherein the instructions, when executed, further cause the processor to determine at least a portion of the line of intersection between the two surfaces to be the candidate welding line {see above cites for claim 2}. It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Yoon to include wherein the instructions, when executed, further cause the processor to determine at least a portion of the line of intersection between the two surfaces to be the candidate welding line as taught by Gilliland, because doing so enables a robot to perform automatic welding on the intersection line as per Fig. 3, step 319 and Fig. 7D thereby increasing efficiency and reducing manual labor; because there is a reasonable expectation of success; and/or because doing so merely combines prior art elements according to known methods to yield predictable results.
Claim 4

In regards to claim 4, Yoon discloses a program production unit that produces a work program for performing welding on the basis of the selected welding line {welding instruction acquisition unit 160, Fig. 2, pgs. 5-6}.

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Yoon, Becker, Kim, Hirayama, and Gilliland as applied to claim 4 above, and further in view of Kumagai (US 20190200000 A1).

Claim 8

In regards to claim 8, Yoon discloses wherein the program production unit is configured to produce the work program by: specifying, in the three-dimensional orthogonal coordinate system, a welding line based on the selected welding line {welding instruction acquisition unit 160, Fig. 2, pgs. 5-6 specifies a 3D welding line including welding coordinates in a 3D orthogonal coordinate system}; setting, for the specified welding line, one or more settings of a torch configured to perform the welding {welding instruction (WI) information includes a welding interval and welding on/off torch settings to welding robot WR}; determining, in the three-dimensional orthogonal coordinate system, a trajectory of one or more portions of a manipulator configured to move the torch during welding {the welding robot (WR) implicitly determines the robot manipulator trajectory to move the torch along the 3D welding line based on the welding instructions, pgs. 5-6}. Kumagai is an analogous reference from the same field of robotic welding and uses markers to determine positions and coordinate frames. See Figs.
1, including markers 7 for position detection, cameras 3, laser scanner 30 that detects 3D position of the workpiece, [0023]-[0026]. Kumagai also teaches determining, in the three-dimensional orthogonal coordinate system, a trajectory of one or more portions of a manipulator configured to move the torch during welding {see teaching data for the robot 10 and/or welding data to perform welding on the workpiece, [0065]-[0071]}; and converting the trajectory determined in the three-dimensional orthogonal coordinate system to a trajectory in a coordinate system of the manipulator {see [0027]-[0030], [0065]-[0071]}. It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Yoon to include determining, in the three-dimensional orthogonal coordinate system, a trajectory of one or more portions of a manipulator configured to move the torch during welding and converting the trajectory determined in the three-dimensional orthogonal coordinate system to a trajectory in a coordinate system of the manipulator as taught by Kumagai, because doing so enables a robot to perform automatic welding based on the positional references of markers detected in the camera coordinate system thereby increasing efficiency and reducing manual labor; because there is a reasonable expectation of success; and/or because doing so merely combines prior art elements according to known methods to yield predictable results.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Ge (US 2018/0111267 A1) discloses a welding sequencing/scheduling method that identifies plural candidate welding lines S0-S17, Fig. 5, and presents them to a user for editing, Fig. 6, including deleting candidate welding segments from the schedule and adjusting welding parameters for each candidate welding segment as shown in Fig. 7.
[Images: media_image5.png, media_image6.png]

Krause (US 20200078948 A1) discloses that AR markers and systems for industrial robots and welding are conventional. See [0003], [0013]-[0115].

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Michael R Cammarata whose telephone number is (571)272-0113. The examiner can normally be reached M-Th 7am-5pm EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Matthew Bella, can be reached at 571-272-7778. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MICHAEL ROBERT CAMMARATA/
Primary Examiner, Art Unit 2667

Prosecution Timeline

Aug 31, 2022: Application Filed
Nov 30, 2024: Non-Final Rejection — §103, §112
Feb 11, 2025: Response Filed
Mar 21, 2025: Final Rejection — §103, §112
May 16, 2025: Examiner Interview Summary
May 16, 2025: Applicant Interview (Telephonic)
May 30, 2025: Request for Continued Examination
Jun 02, 2025: Response after Non-Final Action
Jun 23, 2025: Non-Final Rejection — §103, §112
Sep 18, 2025: Response Filed
Oct 09, 2025: Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602797: RECONSTRUCTION OF BODY MOTION USING A CAMERA SYSTEM (2y 5m to grant; granted Apr 14, 2026)
Patent 12586171: METHODS AND SYSTEMS FOR GRADING DEVICES (2y 5m to grant; granted Mar 24, 2026)
Patent 12579597: Point Group Data Synthesis Apparatus, Non-Transitory Computer-Readable Medium Having Recorded Thereon Point Group Data Synthesis Program, Point Group Data Synthesis Method, and Point Group Data Synthesis System (2y 5m to grant; granted Mar 17, 2026)
Patent 12579835: INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND COMPUTER-READABLE RECORDING MEDIUM FOR DISTINGUISHING OBJECT AND SHADOW THEREOF IN IMAGE (2y 5m to grant; granted Mar 17, 2026)
Patent 12567283: FACIAL RECOGNITION DATABASE USING FACE CLUSTERING (2y 5m to grant; granted Mar 03, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 70%
With Interview: 99% (+35.9%)
Median Time to Grant: 2y 4m
PTA Risk: High
Based on 305 resolved cases by this examiner. Grant probability derived from career allow rate.
