Prosecution Insights
Last updated: April 19, 2026
Application No. 18/246,479

METHOD AND SYSTEM FOR TRAINING A NEURAL NETWORK

Final Rejection — §102, §103
Filed
Mar 23, 2023
Examiner
SUN, JIANGENG
Art Unit
2671
Tech Center
2600 — Communications
Assignee
Academy Of Robotics
OA Round
2 (Final)
Grant Probability: 82% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 9m
With Interview: 96%

Examiner Intelligence

Career Allow Rate: 82% (330 granted / 403 resolved; +19.9% vs TC avg, above average)
Interview Lift: +14.0% (moderate), in resolved cases with interview
Typical Timeline: 2y 9m average prosecution; 22 applications currently pending
Career History: 425 total applications across all art units

Statute-Specific Performance

§101: 6.4% (-33.6% vs TC avg)
§103: 45.3% (+5.3% vs TC avg)
§102: 25.7% (-14.3% vs TC avg)
§112: 20.4% (-19.6% vs TC avg)
Tech Center averages are estimates. Based on career data from 403 resolved cases.

Office Action

DETAILED ACTION

Response to Arguments

The objection to the drawings is withdrawn. The rejections under §112 are withdrawn in light of the amendment. Applicant's arguments filed on pages 8-10 have been fully considered but they are not persuasive.

Applicant argues [quoted claim language reproduced as an image in the original]. This argument is not persuasive in that the claim only states "evaluating each set of weights occurs at least partly concurrently, such that two or more sets of weights in the first plurality are evaluated at the same time", which does not recite the detail of how the evaluation is done; it is therefore read on by Lanihun's parallel processing. The instant applicant may have a different evaluation method, such as segmentation results (Spec. page 4); however, those steps are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-11 and 14 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Lanihun ("Evolutionary active vision system: from 2D to 3D", cited from IDS).
Regarding claim 1, Lanihun teaches a computer-implemented method for training a neural network, including: generating a first plurality of sets of weights for a neural network (page 11, §3.4, "The initial population ... consists of 100 or 60 randomly generated genotypes ... each encoding the free parameters of the corresponding neural controller, which include all the connection weights"); evaluating each set of weights in the first plurality, wherein evaluating includes, for each set of weights: assigning the set of weights to a neural network (page 11, §3.4, "The initial population ... consists of 100 or 60 randomly generated genotypes ... each encoding the free parameters of the corresponding neural controller, which include all the connection weights"); presenting training data to an input of the neural network (page 6, left column, "The active vision system autonomously takes an input from a visual scene"); and calculating a fitness score for the set of weights based on a fitness function that is dependent on an output of the neural network (equations (10)–(12)), wherein evaluating each set of weights occurs at least partly concurrently, such that two or more sets of weights in the first plurality are evaluated at the same time (page 12, §3.6, "Each individual runs its evaluation as a separate process and the respective fitness is communicated to the root process, which in turn carries out the evolution and subsequent generation of a new set of controllers"); and generating a second plurality of sets of weights for the neural network, wherein generating the second plurality of sets of weights includes applying a training algorithm to the sets of weights of the first plurality to generate the second plurality of sets of weights, the second plurality of sets of weights being dependent on the sets of weights of the first plurality and their respective fitness scores (page 11, §3.4, "with the lower bounds corresponding to the integration step size used to update the controller. Generations following the first are produced").

Regarding claim 2, Lanihun teaches the method of claim 1, wherein the evaluating is performed concurrently by a cluster of central processing units (CPUs) or by a graphics processing unit (GPU) (page 12, §3.6, "a parallel computing cluster").

Regarding claim 3, Lanihun teaches the method of claim 1, wherein the first plurality is an initial plurality, and wherein the generating of the initial plurality includes randomly generating each set of weights in the initial plurality using a random number generator (page 11, §3.4, "The initial population for each generation of the evolutionary process consists of 100 or 60 randomly generated genotypes").

Regarding claim 4, Lanihun teaches the method of claim 1, wherein the training algorithm is an artificial evolution algorithm including one or more operations of elitism, mutation, recombination and truncation for generating the second plurality of sets of weights; and the operations are applied to one or more of the sets of weights of the first plurality based on the fitness scores of the sets of weights of the first plurality (page 11, §3.4, "Generations following the first are produced by a combination of selection with elitism, recombination and mutation").

Regarding claim 5, Lanihun teaches the method of claim 4, wherein generating the first plurality and generating the second plurality includes encoding each set of weights as an artificial chromosome (page 11, §3.4, "randomly generated genotypes").

Regarding claim 6, Lanihun teaches the method of claim 1, wherein evaluating each of the sets of weights of the first plurality occurs concurrently, such that every set of weights in the first plurality is evaluated at the same time (page 12, §3.6, "to parallelise the implementation using a root and individual subprocesses. Each individual runs its evaluation as a separate process and the respective fitness is communicated to the root process, which in turn carries out the evolution and subsequent generation of a new set of controllers").

Regarding claim 7, Lanihun teaches the method of claim 1, wherein the training data and the fitness function are dependent on a specific task that the neural network is being trained for, wherein the specific task is image classification (page 7, §3.2.1, "images categorization"), image segmentation, or object recognition (page 7, §3.2.2, "object categorization").

Regarding claim 9, Lanihun teaches the method of claim 1, wherein, when generating the second plurality, the method further comprises: ranking the sets of weights of the first plurality according to their respective fitness scores (page 11, §3.5, "the first, F1(t, c), rewards the agent's ability to rank the correct category higher than the other categories"); and generating the second plurality from the existing plurality by applying the training algorithm to the first plurality (page 11, §3.5, "the second, F2(t, c)"); wherein the training algorithm manipulates the sets of weights of the first plurality based on their ranking to generate the second plurality (page 11, §3.5, "the second, F2(t, c), rewards the ability to maximise the activation of the correct unit while minimizing the activations of the wrong units").

Regarding claim 10, Lanihun teaches the method of claim 1, further including repeating the evaluating step with respect to the second plurality of sets of weights (page 21, §4.1, "we performed 20 evolutionary runs, with 10 runs for each fold of the two-fold cross-validation and each evolutionary run lasted 3000 generations").

Regarding claim 11, Lanihun teaches the method of claim 10, further comprising repeating the generating step and the evaluating step iteratively up to an nth plurality, such that the nth plurality is generated by applying the training algorithm to the sets of weights of the n-1th plurality to generate the nth plurality, the sets of weights of the nth plurality being dependent on the sets of weights of the n-1th plurality and their respective fitness scores; wherein n is a positive integer and n > 2 (page 21, §4.1, "re-evaluated the best genotypes of the last 1000 generations of the evolutionary runs for the categorisation task for the three methods of visual extraction").

Regarding claim 14, Lanihun teaches the method of claim 10, further comprising: selecting one or more sets of weights from any plurality (page 15, left column, "comparing the pattern of fitness of all runs of the three visual-extraction methods"); repeating the evaluating step for each of the one or more sets of weights using different training data presented to the input of the neural network (page 15, left column, "The re-evaluated best 100 genotypes show that the best performance was by the Active-HOG method, followed by the greyscale method and then the Active-ULBP").

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 12 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Lanihun in view of Adoni (US 20190130277).

Regarding claim 12, Lanihun teaches the method of claim 10. Lanihun does not expressly teach further comprising: receiving a user selection of a set of weights in any plurality; and applying a biasing factor to the fitness score for the selected set of weights, such that the selected set of weights has a greater fitness score. However, Adoni teaches receiving a user selection of a set of weights in any plurality ([0043], "user can select a characteristic to use as the basis for the weight data"); and applying a biasing factor (Fig. 2) to the fitness score for the selected set of weights, such that the selected set of weights has a greater fitness score ([0052], "Based on the species score, the genetic algorithm 110 may identify the 'fittest' species"). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Lanihun and Adoni, by modifying Lanihun's nodes with bias as taught by Adoni, and allowing users to select weights as taught by Adoni. One of ordinary skill would have been motivated to make such a combination in order to "utilize a genetic algorithm to generate and train a neural network" (Adoni, [0004]).

Regarding claim 16, Lanihun teaches the method of claim 14. Lanihun does not expressly teach wherein the selecting one or more sets of weights from any plurality includes selecting all sets of weights across all generations that have a fitness score above a cut-off threshold. However, Adoni teaches that selecting one or more sets of weights from any plurality includes selecting all sets of weights across all generations that have a fitness score above a cut-off threshold ([0055], "a particular number of the fittest models, or all models having a fitness that exceeds a threshold, may be selected as the subset of models"). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Lanihun and Adoni, by substituting the model selection in Lanihun with the threshold method as taught by Adoni. One of ordinary skill would have been motivated to make such a combination in order to "utilize a genetic algorithm to generate and train a neural network" (Adoni, [0004]).

Claim 30 recites the system for the method in claim 12. Since Lanihun also teaches a system (page 12, §3.6, "a High-Performance Computing Wales infrastructure"), this claim is also rejected.

Claims 13, 15, and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Lanihun.

Regarding claim 13, Lanihun teaches the method of claim 10. Lanihun does not expressly teach further comprising: selecting a final set of weights from any plurality; saving the final weights to a memory; and subsequently inputting the final weights into a neural network for identifying a feature of an environment of a vehicle. However, this limitation is well known in the art, as evidenced by Lanihun: selecting a final set of weights from any plurality (page 5, right column, "an active vision model ... one controlling the eye movement"); saving the final weights to a memory; and subsequently inputting the final weights into a neural network for identifying a feature of an environment of a vehicle (page 5, right column, "the other for controlling the movement of a simulated car. The output units of the eye controller determined the visual features that were extracted as the car moved through a simulated road").
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to employ Lanihun's eye model in the well-known process. Since each individual element and its function are shown in the prior art, albeit in separate references, the difference between the claimed subject matter and the prior art rests not on any individual element or function but in the very combination itself, that is, in the substitution of Lanihun's eye model in well-known processes. Thus, the simple substitution of one known element for another producing a predictable result renders the claim obvious to one of ordinary skill in the art before the effective filing date of the claimed invention.

Regarding claim 15, Lanihun teaches the method of claim 14. Lanihun does not expressly teach further comprising: selecting a final set of weights from the one or more sets of weights for which the evaluating step has been repeated; saving the final weights to a memory; and subsequently inputting the final weights into a neural network for identifying a feature of an environment of a vehicle. However, this limitation is well known in the art, as evidenced by Lanihun: selecting a final set of weights from the one or more sets of weights for which the evaluating step has been repeated; saving the final weights to a memory; and subsequently inputting the final weights into a neural network for identifying a feature of an environment of a vehicle (page 5, right column, "Subsequent analysis showed that the system used the gaze shifts: (1) to find relevant features that contributed to successful driving").

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to employ Lanihun's eye model in the well-known process. Since each individual element and its function are shown in the prior art, albeit in separate references, the difference between the claimed subject matter and the prior art rests not on any individual element or function but in the very combination itself, that is, in the substitution of Lanihun's eye model in well-known processes. Thus, the simple substitution of one known element for another producing a predictable result renders the claim obvious to one of ordinary skill in the art before the effective filing date of the claimed invention.

Regarding claim 17, Lanihun teaches the method of claim 1. Lanihun does not expressly teach wherein the method is for training a neural network for controlling the movement of a vehicle. However, this limitation is well known in the art, as evidenced by Lanihun: the method is for training a neural network for controlling the movement of a vehicle (page 5, right column, "Subsequent analysis showed that the system used the gaze shifts: (1) to find relevant features that contributed to successful driving").

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to employ Lanihun's eye model in the well-known process. Since each individual element and its function are shown in the prior art, albeit in separate references, the difference between the claimed subject matter and the prior art rests not on any individual element or function but in the very combination itself, that is, in the substitution of Lanihun's eye model in well-known processes. Thus, the simple substitution of one known element for another producing a predictable result renders the claim obvious to one of ordinary skill in the art before the effective filing date of the claimed invention.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JIANGENG SUN, whose telephone number is (571) 272-3712. The examiner can normally be reached 8am to 5pm EST, M-F. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Randolph Vincent, can be reached at (571) 272-8243. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users.
To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Jiangeng Sun/
Examiner, Art Unit 2671
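The training loop recited in claim 1, as the rejection maps it onto Lanihun, can be sketched in code: a population of weight sets is scored concurrently, then a new generation is bred by elitism, recombination, and mutation. This is an illustrative sketch only; the names, sizes, and toy fitness function are assumptions, not taken from the application or from Lanihun.

```python
# Illustrative sketch of the claim 1 training loop; all names and numbers
# are assumptions for illustration, not from the application or Lanihun.
import random
from concurrent.futures import ThreadPoolExecutor  # Lanihun uses a process cluster

N_WEIGHTS = 8   # length of each set of weights (hypothetical)
POP_SIZE = 12   # population size (Lanihun's runs use 100 or 60 genotypes)
ELITE = 2       # best genotypes copied unchanged (elitism)

def fitness(weights):
    """Stand-in fitness: a real one would run the network on training
    data and score its output (cf. Lanihun's equations (10)-(12))."""
    return -sum(w * w for w in weights)  # toy objective: prefer small weights

def make_population():
    """Randomly generate the initial plurality of weight sets (claim 3)."""
    return [[random.uniform(-1.0, 1.0) for _ in range(N_WEIGHTS)]
            for _ in range(POP_SIZE)]

def next_generation(population, scores):
    """Breed the next plurality from the current one and its fitness
    scores (claim 4: elitism, recombination, mutation, truncation)."""
    ranked = [w for _, w in sorted(zip(scores, population),
                                   key=lambda pair: pair[0], reverse=True)]
    children = [list(w) for w in ranked[:ELITE]]           # elitism
    parents = ranked[:POP_SIZE // 2]                       # truncation selection
    while len(children) < POP_SIZE:
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, N_WEIGHTS)               # one-point recombination
        child = a[:cut] + b[cut:]
        child[random.randrange(N_WEIGHTS)] += random.gauss(0.0, 0.1)  # mutation
        children.append(child)
    return children

def train(generations=5):
    population = make_population()
    for _ in range(generations):
        # Two or more sets of weights are evaluated at the same time; a
        # cluster of CPUs or a GPU (claim 2) would replace the thread pool.
        with ThreadPoolExecutor() as pool:
            scores = list(pool.map(fitness, population))
        population = next_generation(population, scores)
    return max(population, key=fitness)
```

Because the elite genotypes carry over unchanged, the best fitness in the population never decreases across generations, which is the sense in which each new plurality is "dependent on ... their respective fitness scores".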

Prosecution Timeline

Mar 23, 2023
Application Filed
May 03, 2025
Non-Final Rejection — §102, §103
Nov 10, 2025
Response Filed
Dec 22, 2025
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591973: Histological Image Analysis
2y 5m to grant; granted Mar 31, 2026
Patent 12561872: METHOD OF TRAINING IMAGE DECOMPOSITION MODEL, METHOD OF DECOMPOSING IMAGE, ELECTRONIC DEVICE, AND STORAGE MEDIUM
2y 5m to grant; granted Feb 24, 2026
Patent 12561757: IMAGE SUPER-RESOLUTION NEURAL NETWORKS
2y 5m to grant; granted Feb 24, 2026
Patent 12548122: METHOD FOR FILTERING PERIODIC NOISE AND FILTER USING THE METHOD
2y 5m to grant; granted Feb 10, 2026
Patent 12524859: System And Method For The Visualization And Characterization Of Objects In Images
2y 5m to grant; granted Jan 13, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 82%
With Interview: 96% (+14.0%)
Median Time to Grant: 2y 9m
PTA Risk: Moderate
Based on 403 resolved cases by this examiner. Grant probability derived from career allow rate.
