Prosecution Insights
Last updated: April 19, 2026
Application No. 17/733,830

SYSTEMS AND METHODS FOR ANALYZING WELD QUALITY

Non-Final OA — §102, §103
Filed: Apr 29, 2022
Examiner: DULANEY, BENJAMIN O
Art Unit: 2683
Tech Center: 2600 — Communications
Assignee: General Electric Company
OA Round: 3 (Non-Final)
Grant Probability: 62% (Moderate)
OA Rounds: 3-4
To Grant: 2y 11m
With Interview: 74%

Examiner Intelligence

Career Allow Rate: 62% — grants 62% of resolved cases (349 granted / 565 resolved; at TC average)
Interview Lift: +11.9% for resolved cases with interview (moderate, ~+12% lift)
Typical Timeline: 2y 11m avg prosecution; 26 currently pending
Career History: 591 total applications across all art units
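The headline figures above are simple arithmetic on the examiner's career record. A quick sanity check, using only the numbers shown on this page:

```python
# Sanity-check the examiner stats reported above (all figures from this page).
granted, resolved = 349, 565

# Career allow rate: 349 granted of 565 resolved rounds to 62%.
allow_rate_pct = granted / resolved * 100
print(round(allow_rate_pct))          # 62

# The page reports an interview lift of +11.9 points, which is where
# the ~74% "with interview" figure comes from.
print(round(allow_rate_pct + 11.9))   # 74
```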

Statute-Specific Performance

§101: 4.9% (-35.1% vs TC avg)
§103: 52.1% (+12.1% vs TC avg)
§102: 25.7% (-14.3% vs TC avg)
§112: 15.4% (-24.6% vs TC avg)
Tech Center averages are estimates • Based on career data from 565 resolved cases
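Each delta above lets you back out the implied Tech Center baseline (examiner rate minus the reported delta); a quick check, using only the figures in the table:

```python
# Recover the implied TC average for each statute: examiner rate minus the
# reported delta vs TC average (all figures from the table above).
rates = {
    "§101": (4.9, -35.1),
    "§103": (52.1, +12.1),
    "§102": (25.7, -14.3),
    "§112": (15.4, -24.6),
}
for statute, (examiner_rate, delta) in rates.items():
    print(f"{statute}: implied TC avg = {examiner_rate - delta:.1f}%")
```

All four deltas imply the same ~40.0% baseline, which suggests the page compares each statute against a single overall Tech Center estimate rather than per-statute averages.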

Office Action

§102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments filed 11/25/25 have been fully considered but they are not persuasive.

Regarding applicant's argument for claims 1 and 11, on page 8, that Kitchen does not disclose an association between historic labeled weld feature data and historic welding process parameters, the examiner disagrees. Feature data for the actual process of machine training/learning is based upon the only two available categories of variables for welding in Kitchen: imaging data of the weld (including surface topology) and "weld parameters" encompassing all other settings associated with welding (see the list of parameters in paragraph 4, for example). If, as paragraph 70 states, weld parameters are integrated into deep learning for interpreting the weld image data, then at least some feature vectors utilized by the machine learning model incorporate (i.e., "associate") the weld parameter data. This interpretation is further supported by paragraph 110, which specifically states that weld feature vectors are based upon ("associated with") the weld images and the weld parameters. Therefore the argument is overcome and the previous rejection remains.

Regarding applicant's argument for claim 14, on page 9, that Kitchen does not disclose determining a weld parameter is out of range, the examiner disagrees. Paragraphs 39-41 disclose utilizing weld parameters in a CNN to find valid and invalid (i.e., defect-causing) parameters that can then be automatically adjusted during a weld to avoid anomalies (paragraph 84). That is to say, the disclosed model determines some weld parameters to be invalid (i.e., "out of range") and then modifies those parameters. Therefore the disputed limitation is taught and the previous rejection remains.
Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

1) Claim(s) 1-7 and 11-13 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by U.S. patent application publication 2021/0318673 by Kitchen et al.

2) Regarding claim 1, Kitchen teaches a system for analyzing weld quality, the system comprising: a controller having at least one processor and at least one memory device (paragraph 8; CPU and memories disclosed), the at least one memory device storing at least one machine learning algorithm executable to process surface topology data (paragraphs 37 and 41; learning model utilizes imaged topology data) and welding process parameters (paragraphs 36, 70 and 89; various welding parameters disclosed) to identify a weld characteristic from a plurality of pre-defined weld characteristics (paragraphs 70 and 91; defects are identified from known defects), the at least one memory device further storing instructions that when executed by the at least one processor cause the at least one processor to perform operations, wherein the controller is configured to: receive labeled weld feature data for a first plurality of historic welds having a plurality of historic weld features, the labeled weld feature data identifying historic weld characteristics and historic welding parameters associated with the plurality of historic weld features (paragraphs 73 and 100; feature data trains the model and can be labeled; paragraph 70 discloses incorporating weld parameters into the learning algorithm); determine relationships between the plurality of historic weld features and the historic weld characteristics via the at least one machine learning algorithm (paragraphs 71, 107 and 108; features are identified and connected to defects), the relationships including associations between the plurality of historic weld features and the historic welding process parameters (paragraph 70; integrating weld parameters into the learning model is disclosed, while paragraphs 71 and 72 describe the feature detection and vector outputs from the model, thereby associating the parameters and features; weld parameters are integrated into deep learning for interpreting the weld image data, so at least some feature vectors utilized by the machine learning model incorporate [i.e., "associate"] the weld parameter data; this interpretation is further supported by paragraph 110, which specifically states that weld feature vectors are based upon ["associated with"] the weld images and the weld parameters); receive post-weld surface topology data associated with a weld from one or more inspection devices; receive at least one welding process parameter associated with the weld; extract at least one weld feature from at least one of the post-weld surface topology data or the at least one welding process parameter; identify at least one weld characteristic of the weld from the plurality of pre-defined weld characteristics based on the relationships between the plurality of historic weld features and the historic weld characteristics (paragraphs 110 and 111; weld data is input to the trained model for identification of defects).

3) Regarding claim 2, Kitchen teaches the system of claim 1, wherein the at least one weld characteristic includes at least one of a surface discontinuity or a subsurface discontinuity (paragraph 61; surface and subsurface defects can be identified).
4) Regarding claim 3, Kitchen teaches the system of claim 1, wherein the controller is further configured to: generate a plurality of weld classifiers; and assign at least one weld classifier of the plurality of weld classifiers to the weld based on the at least one weld feature and the relationships between the plurality of historic weld features and the historic weld characteristics (paragraphs 70 and 91; model assigns a classification [i.e. good or bad] to each weld based on plurality of defects).

5) Regarding claim 4, Kitchen teaches the system of claim 1, wherein the controller is further configured to generate a weld classification report based on the at least one weld characteristic (paragraph 47; weld defects can be displayed).

6) Regarding claim 5, Kitchen teaches the system of claim 1, wherein the controller is further configured to process the post-weld surface topology data to extract the at least one weld feature (paragraph 110; features are determined and classified).

7) Regarding claim 6, Kitchen teaches the system of claim 1, wherein the at least one weld feature includes at least one of a shape, a dimension, a shape of a weld profile, a dimension of the weld profile, or a statistical feature of the weld (paragraph 60; shape and dimension of welds are determined).

8) Regarding claim 7, Kitchen teaches the system of claim 1, further comprising a laser scanner and wherein the laser scanner forms the one or more inspection devices (paragraph 61; laser scanner can be utilized).
9) Claim 11 is taught in the same manner as described in the rejection of claim 1 above, with the exception of: topology data associated with a weld of a component (paragraphs 37 and 41; learning model utilizes imaged topology data; NOTE: any weld is performed on a "component", specific component shown in figure 4C); predicting, via a controller, at least one subsurface defect of the component based on the determined correlations, the at least one welding process parameter, and the received surface topology information (paragraphs 61, 110 and 111; model can predict defects, including defects below the surface).

10) Regarding claim 12, Kitchen teaches the method of claim 11, further comprising assigning, via the controller, at least one weld classifier to the weld based on the at least one subsurface defect, and wherein the at least one weld classifier identifies the weld as conforming or non-conforming (paragraph 110; features are classified).

11) Regarding claim 13, Kitchen teaches the method of claim 11, wherein the component is an additively manufactured component, and wherein the weld is an overlapping seam (paragraph 103; model can be applied to additive manufacturing; examiner notes that all welds inherently occupy [thereby overlapping] a seam between objects).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
12) Claim(s) 8 and 14-20 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. patent application publication 2021/0318673 by Kitchen et al. as applied to claim 1 above, and further in view of U.S. patent 11,904,417 by Monjardin et al.

13) Regarding claim 8, Kitchen does not specifically teach the system of claim 1, wherein the controller is further configured to: receive labeled pre-weld surface topology data for a second plurality of historic welds, the labeled pre-weld surface topology data for the second plurality of historic welds identifying historic weld characteristics associated with historic pre-weld surface topology data; determine relationships between the historic pre-weld surface topology data and the historic weld characteristics via at least one machine learning algorithm; receive pre-weld surface topology data associated with the weld; and identify at least one weld characteristic of the weld based on the relationships between the historic pre-weld surface topology data and the historic weld characteristics and the pre-weld surface topology data.
Monjardin teaches the system of claim 1, wherein the at least one processor is further configured to: receive labeled pre-weld surface topology data for a second plurality of historic welds, the labeled pre-weld surface topology data for the second plurality of historic welds identifying historic weld characteristics associated with historic pre-weld surface topology data; determine relationships between the historic pre-weld surface topology data and the historic weld characteristics via at least one machine learning algorithm; receive pre-weld surface topology data associated with the weld; and identify, via the controller, at least one weld characteristic of the weld based on the relationships between the historic pre-weld surface topology data and the historic weld characteristics and the pre-weld surface topology data (column 4, lines 1-13 and 44-61; image data prior to welding is collected and compared to historical data through machine learning to determine weld flaws).

Kitchen and Monjardin are combinable because they are both from the machine learning weld classification field of endeavor. It would have been obvious to a person of ordinary skill in the art at the time the invention was effectively filed to combine Kitchen with Monjardin to add pre-weld surface data. The motivation for doing so would have been "to ensure portions are correct" (column 4, line 7). Therefore it would have been obvious to combine Kitchen and Monjardin to obtain the invention of claim 8.
14) Regarding claim 14, Kitchen teaches a method of analyzing weld quality, the method comprising: receiving post-weld surface topology data associated with a weld from one or more inspection devices (paragraphs 37 and 41; learning model utilizes imaged topology data); receiving at least one welding process parameter associated with the weld from one or more welding devices (paragraphs 36, 70 and 89; various welding parameters disclosed); extracting the post-weld surface topology data, and the at least one welding process parameter; and determining at least one weld characteristic associated with the weld by analyzing the at least one weld feature via a trained machine learning algorithm configured to identify weld characteristics based on weld features, the trained machine learning algorithm receiving the at least one weld feature as input and identifying the at least one weld characteristic associated with the weld as output (paragraphs 110 and 111; weld data is input to the trained model for output identification of defects); and determining whether the at least one welding process parameter is out of range for the weld via the trained machine learning algorithm, the trained machine learning algorithm receiving the at least one welding process parameter (paragraphs 39-41; weld parameters are utilized in a CNN to find valid and invalid [i.e. defect-causing] parameters that can then be automatically adjusted during a weld to avoid anomalies [paragraph 84]; that is to say, the disclosed model determines some weld parameters to be invalid [i.e. "out of range"] and then modifies those parameters).

Kitchen does not specifically teach receiving pre-weld surface topology data; extracting at least one weld feature from the pre-weld surface topology data.
Monjardin teaches receiving pre-weld surface topology data; extracting at least one weld feature from the pre-weld surface topology data (column 4, lines 1-13 and 44-61; image data prior to welding is collected and compared to historical data through machine learning to determine weld flaws).

Kitchen and Monjardin are combinable because they are both from the machine learning weld classification field of endeavor. It would have been obvious to a person of ordinary skill in the art at the time the invention was effectively filed to combine Kitchen with Monjardin to add pre-weld surface data. The motivation for doing so would have been "to ensure portions are correct" (column 4, line 7). Therefore it would have been obvious to combine Kitchen and Monjardin to obtain the invention of claim 14.

15) Regarding claim 15, Kitchen teaches the method of claim 14, further comprising: assigning at least one weld classifier to the weld based on the at least one weld characteristic (paragraphs 70 and 91; weld quality is classified).

16) Regarding claim 16, Kitchen teaches the method of claim 15, wherein the at least one weld classifier identifies whether the weld conforms to at least one predetermined weld standard (paragraph 70; model determination of "good" or "defect" is a standard).

17) Regarding claim 17, Kitchen teaches the method of claim 15, further comprising: receiving inspection verification information, the inspection verification information including at least one of visual inspection or volumetric inspection results for the weld; comparing the inspection verification information to the at least one weld classifier; and updating the trained machine learning algorithm based on the comparing of the inspection verification information to the at least one weld classifier (paragraph 81; model can learn based on images annotated with inspection results, thereby updating the model).
18) Regarding claim 18, Kitchen teaches the method of claim 17, wherein updating the trained machine learning algorithm includes adding the inspection verification information to a training data set for the trained machine learning algorithm (paragraph 81; model can learn based on images annotated with inspection results, thereby updating the model).

19) Regarding claim 19, Kitchen teaches the method of claim 14, wherein a training data set used to train the trained machine learning algorithm comprises historic weld features of welds with known characteristics (paragraphs 73, 100 and 108; labeled feature data from past welds trains the model).

20) Regarding claim 20, Kitchen (as combined with Monjardin as detailed in the rejection of claim 14 above) teaches the method of claim 19, wherein the historic weld features are determined based on historic pre-weld surface topology data (Monjardin; column 4, lines 1-13 and 44-61; image data prior to welding is collected and compared to historical data through machine learning to determine weld flaws) and historic post-weld surface topology data, wherein the historic pre-weld surface topology data and the historic post-weld surface topology data are associated with the welds with known characteristics (paragraphs 37, 41 and 108; learning model utilizes imaged topology data that is classified [i.e. known characteristics]).

22) Claim(s) 9 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. patent application publication 2021/0318673 by Kitchen et al., and further in view of U.S. patent 11,904,417 by Monjardin et al., as applied to claim 8 above, and further in view of U.S. patent 12,145,219 by Komatsu et al.

23) Regarding claim 9, Kitchen does not specifically teach the system of claim 8, wherein the pre-weld surface topology data and the post-weld surface topology data are point cloud data.
Komatsu teaches the system of claim 8, wherein the pre-weld surface topology data and the post-weld surface topology data are point cloud data (column 15, lines 21-23; weld shape can be obtained as point cloud data).

Kitchen and Komatsu are combinable because they are both from the machine learning weld classification field of endeavor. It would have been obvious to a person of ordinary skill in the art at the time the invention was effectively filed to combine Kitchen with Komatsu to add point cloud data. The motivation for doing so would have been to obtain image data from a laser inspection device. Therefore it would have been obvious to combine Kitchen, Monjardin and Komatsu to obtain the invention of claim 9.

24) Regarding claim 10, Komatsu (as combined with Kitchen in the rejection of claim 9 above) teaches the system of claim 9, wherein the controller is further configured to: transform the point cloud data to image data; obtain an intensity of the image data; generate a weld section for the weld based on the intensity of the image data; and extract the at least one weld feature based on the weld section (column 15, lines 21-35; weld inspection results are obtained from image data which is obtained from a point cloud of the weld; examiner notes that any image data generated from the point cloud [even just binary data] is an expression of "intensity", as the point cloud is an expression of light reflected off of the weld from the laser).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BENJAMIN O DULANEY whose telephone number is (571)272-2874. The examiner can normally be reached Mon-Fri 10-6. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Merouan Abderrahim, can be reached at (571)270-5254. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

BENJAMIN O. DULANEY
Primary Examiner
Art Unit 2676

/BENJAMIN O DULANEY/
Primary Examiner, Art Unit 2683

Prosecution Timeline

Apr 29, 2022: Application Filed
Dec 06, 2024: Non-Final Rejection — §102, §103
Feb 07, 2025: Interview Requested
Feb 14, 2025: Applicant Interview (Telephonic)
Feb 14, 2025: Examiner Interview Summary
Mar 07, 2025: Response Filed
Jun 22, 2025: Final Rejection — §102, §103
Nov 24, 2025: Request for Continued Examination
Dec 02, 2025: Response after Non-Final Action
Dec 12, 2025: Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597136 — APPARATUS AND METHOD OF RECORDING A STORAGE LOCATION BY SIZE FOR HARVESTED TUBERS (2y 5m to grant; granted Apr 07, 2026)
Patent 12592998 — NON-TRANSITORY COMPUTER-READABLE MEDIUM HAVING CONTROL INSTRUCTIONS, INFORMATION PROCESSING DEVICE, AND CONTROL METHOD (2y 5m to grant; granted Mar 31, 2026)
Patent 12590841 — Thermal Imaging for Self-Driving Cars (2y 5m to grant; granted Mar 31, 2026)
Patent 12588874 — OPTIMIZING CHECKPOINT LOCATIONS ALONG AN INSERTION TRAJECTORY OF A MEDICAL INSTRUMENT USING DATA ANALYSIS (2y 5m to grant; granted Mar 31, 2026)
Patent 12586185 — BLOOD FLOW EXTRACTION IMAGE FORMING DEVICE, METHOD OF FORMING BLOOD FLOW EXTRACTION IMAGE, AND BLOOD FLOW EXTRACTION IMAGE FORMING PROGRAM (2y 5m to grant; granted Mar 24, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 62%
With Interview: 74% (+11.9%)
Median Time to Grant: 2y 11m
PTA Risk: High
Based on 565 resolved cases by this examiner. Grant probability derived from career allow rate.
