Prosecution Insights
Last updated: April 19, 2026
Application No. 18/572,876

Method for Evaluating the Surface of a Body Component, and Method for Training an Artificial Neural Network

Non-Final OA (§103, §112)
Filed: Dec 21, 2023
Examiner: HUYNH, VAN D
Art Unit: 2665
Tech Center: 2600 — Communications
Assignee: BAYERISCHE MOTOREN WERKE AKTIENGESELLSCHAFT
OA Round: 1 (Non-Final)
Grant Probability: 87% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 6m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 87% (above average; 630 granted / 721 resolved; +25.4% vs TC avg)
Interview Lift: +13.4% (moderate lift, measured across resolved cases with interview)
Avg Prosecution: 2y 6m typical timeline; 25 applications currently pending
Career History: 746 total applications across all art units
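The headline figures above follow from the raw counts with simple arithmetic. A minimal sketch, assuming the displayed percentages are derived exactly this way (the function name is illustrative, not from any real API):

```python
# Sketch: deriving the headline examiner metrics from the raw counts above.
# The allow rate is granted / resolved; the "+25.4% vs TC avg" delta then
# implies a Tech Center baseline, assuming the delta is a simple difference.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

rate = allow_rate(630, 721)   # ~87.4%, displayed as 87%
tc_avg = rate - 25.4          # implied Tech Center average, ~62%

print(f"Career allow rate: {rate:.1f}%")
print(f"Implied Tech Center average: {tc_avg:.1f}%")
```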

Statute-Specific Performance

§101: 8.8% (-31.2% vs TC avg)
§103: 32.0% (-8.0% vs TC avg)
§102: 30.9% (-9.1% vs TC avg)
§112: 14.2% (-25.8% vs TC avg)
Deltas shown vs estimated Tech Center average • Based on career data from 721 resolved cases
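Since each statute row reports both the examiner's rate and its delta against the Tech Center estimate, the implied baselines can be reconstructed directly. A small sketch, assuming the deltas are simple differences in percentage points:

```python
# Sketch: reconstructing the implied Tech Center baselines from the
# statute-specific rates and their "vs TC avg" deltas shown above.
examiner = {"101": 8.8, "103": 32.0, "102": 30.9, "112": 14.2}
delta    = {"101": -31.2, "103": -8.0, "102": -9.1, "112": -25.8}

# implied TC average = examiner rate - delta, e.g. for §101: 8.8 - (-31.2) = 40.0
tc_avg = {s: examiner[s] - delta[s] for s in examiner}

for statute, avg in tc_avg.items():
    print(f"§{statute}: implied TC avg {avg:.1f}%")
```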

Office Action

§103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Election/Restrictions

Applicant's election without traverse of Group I (claims 11-18) in the reply filed on 12/24/2025 is acknowledged. Claims 19-26 are withdrawn from further consideration pursuant to 37 CFR 1.142(b), as being drawn to a nonelected invention. Since the restriction requirement was properly made, the restriction requirement is now made final.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 11-18 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for pre-AIA, the applicant) regards as the invention.

Claim 11 recites the limitation "the polygon network" in lines 4 and 5. There is insufficient antecedent basis for this limitation in the claim. The Examiner suggests replacing "the polygon network" with --the virtual polygon network--. Claims 12-18 are also rejected based on their dependency on the defective parent claim 11 above.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 11-18 are rejected under 35 U.S.C. 103 as being unpatentable over de Bonfim Gripp et al., US 2019/0287237, in view of Jovancevic et al., "3D Point Cloud Analysis for Detection and Characterization of Defects on Airplane Exterior Surface".

Regarding claim 11, de Bonfim Gripp discloses a method for evaluating a surface of a body component of a motor vehicle (figs. 3A and 7A-7B; para 0002, 0004, 0037, and 0051; methods for automatic quality inspection of materials and virtual material surfaces of motor vehicles), the method comprising: creating a virtual mesh of points of the surface (fig. 3A, element 302 and fig. 9A, element 920; para 0014, 0075, 0099-0100, and 0102; generating a virtual material surface from which an image may be captured and processed in a virtual environment; a mesh of points of the virtual material surface is created); determining at least one variable characterizing a curvature of the mesh of points at at least one node of the mesh of points (fig. 3A, elements 308-309; para 0078-0079 and 0102; the curvature is calculated; alternatively, points at which the calculated curvature is zero may be used as dividing points of the fringe; statistics of the curvatures calculated for each segment point, such as the sum of curvatures, mean, variance, standard deviation, skewness, kurtosis, or combinations of these statistics, may also be features; the curvatures of the points can be used, considering their sign or their absolute value, in the calculation of the feature; the derivative of the curvature can be used as a feature in the same way as the curvature); and determining, by an artificial neural network (fig. 3A, element 310; para 0043 and 0080; an artificial neural network), in dependence on the at least one variable characterizing the curvature, at least one output variable characterizing a surface flaw of the surface to evaluate the surface (fig. 3A, elements 310-312; para 0007, 0043, and 0080; the calculated features for each fringe segment serve as input to an artificial neural network, whose outputs are the classification of defects of materials; defects may be classified according to their level of severity and the source of the error, for example, tool mark, low dent, high bump, wrinkle, or waviness).

de Bonfim Gripp discloses claim 11 as enumerated above, but does not explicitly disclose a polygon as claimed. However, Jovancevic discloses a polygon surrounding the points which belong to the defect (Section 3.4.2, Step C2: Data Preparation, first and second paragraphs). Therefore, taking the combined disclosures of de Bonfim Gripp and Jovancevic as a whole, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the polygon as taught by Jovancevic into the invention of de Bonfim Gripp, for the benefit of detecting and characterizing defects on an airplane exterior surface (Jovancevic: Abstract).

Regarding claim 12, the method according to claim 11, de Bonfim Gripp and Jovancevic in combination further disclose wherein at least one image of the surface, arranged in a capture area of an optical capture device, is captured by the optical capture device (de Bonfim Gripp: figs. 8A-8B, element 800; para 0094 and 0096), wherein the virtual polygon network is created in dependence on the image (Jovancevic: Section 3.4.2, Step C2: Data Preparation, first and second paragraphs).

Regarding claim 13, the method according to claim 11, de Bonfim Gripp and Jovancevic in combination further disclose wherein the virtual polygon network is created (Jovancevic: Section 3.4.2, Step C2: Data Preparation, first and second paragraphs) in dependence on at least one simulation result of a simulation of at least one step of a production process of the body component (de Bonfim Gripp: para 0099-0102).

Regarding claim 14, the method according to claim 12, de Bonfim Gripp and Jovancevic in combination further disclose wherein the virtual polygon network is created (Jovancevic: Section 3.4.2, Step C2: Data Preparation, first and second paragraphs) in dependence on at least one simulation result of a simulation of at least one step of a production process of the body component (de Bonfim Gripp: para 0099-0102).

Regarding claim 15, the method according to claim 11, de Bonfim Gripp and Jovancevic in combination further disclose wherein at least one two-dimensional geometric map of the polygon network is formed (Jovancevic: Section 3.4.2, Step C2: Data Preparation, first and second paragraphs), and the output variable characterizing the surface flaw is determined in dependence on the geometric map by the artificial neural network (de Bonfim Gripp: fig. 3A, elements 310-312; para 0007, 0043, and 0080).

Regarding claim 16, the method according to claim 12, de Bonfim Gripp and Jovancevic in combination further disclose wherein at least one two-dimensional geometric map of the polygon network is formed (Jovancevic: Section 3.4.2, Step C2: Data Preparation, first and second paragraphs), and the output variable characterizing the surface flaw is determined in dependence on the geometric map by the artificial neural network (de Bonfim Gripp: fig. 3A, elements 310-312; para 0007, 0043, and 0080).

Regarding claim 17, the method according to claim 13, de Bonfim Gripp and Jovancevic in combination further disclose wherein at least one two-dimensional geometric map of the polygon network is formed (Jovancevic: Section 3.4.2, Step C2: Data Preparation, first and second paragraphs), and the output variable characterizing the surface flaw is determined in dependence on the geometric map by the artificial neural network (de Bonfim Gripp: fig. 3A, elements 310-312; para 0007, 0043, and 0080).

Regarding claim 18, the method according to claim 15, de Bonfim Gripp in the combination further discloses wherein at least one pixel of the geometric map is assigned the respective variable characterizing the curvature, wherein, in dependence on the at least one pixel assigned the variable characterizing the curvature (fig. 3A; para 0075-0079), the output variable characterizing the surface flaw is determined by the artificial neural network (fig. 3A, elements 310-312; para 0007, 0043, and 0080).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Orzol et al., US 2022/0178838, discloses a method for determining deformations on an object, wherein the object is illuminated and moved while being illuminated. Raghu et al., US 10,346,969, discloses that a convolutional neural network may be trained to inspect subjects such as carbon fiber propellers for surface flaws or other damage. Kwant et al., US 2019/0051013, discloses an approach for an asymmetric evaluation of polygon similarity.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to VAN D HUYNH, whose telephone number is (571) 270-1937. The examiner can normally be reached 8AM-6PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Stephen R Koziol, can be reached at (408) 918-7630. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/VAN D HUYNH/
Primary Examiner, Art Unit 2665

Prosecution Timeline

Dec 21, 2023
Application Filed
Jan 06, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602798
METHOD AND APPARATUS FOR GENERATING SUBJECT-SPECIFIC MAGNETIC RESONANCE ANGIOGRAPHY IMAGES FROM OTHER MULTI-CONTRAST MAGNETIC RESONANCE IMAGES
2y 5m to grant; granted Apr 14, 2026
Patent 12602784
MEDICAL DEVICE FOR TRANSCRIPTION OF APPEARANCES IN AN IMAGE TO TEXT WITH MACHINE LEARNING
2y 5m to grant; granted Apr 14, 2026
Patent 12594046
METHOD AND APPARATUS FOR ASSISTING DIAGNOSIS OF CARDIOEMBOLIC STROKE BY USING CHEST RADIOGRAPHIC IMAGES
2y 5m to grant; granted Apr 07, 2026
Patent 12586186
JAUNDICE ANALYSIS SYSTEM AND METHOD THEREOF
2y 5m to grant; granted Mar 24, 2026
Patent 12582345
Systems and Methods for Identifying Progression of Hypoxic-Ischemic Brain Injury
2y 5m to grant; granted Mar 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 87%
With Interview (+13.4%): 99%
Median Time to Grant: 2y 6m
PTA Risk: Low
Based on 721 resolved cases by this examiner. Grant probability derived from career allow rate.
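The 99% with-interview figure is consistent with applying the +13.4% interview lift multiplicatively to the 87% base rate. A short sketch under that assumption (the tool's exact formula is not shown, so this is one plausible reconstruction):

```python
# Sketch: how "99% with interview" could follow from the 87% base rate
# and the +13.4% interview lift, assuming the lift is a relative
# (multiplicative) adjustment capped at 100%.
base_rate = 100.0 * 630 / 721              # career allow rate, ~87.4%
lift = 0.134                               # +13.4% relative lift with interview
with_interview = min(base_rate * (1 + lift), 100.0)

print(f"With interview: {with_interview:.0f}%")
```

An additive reading (87% + 13.4 points) would exceed 100%, which is one reason the multiplicative interpretation fits the displayed 99% better.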
