Prosecution Insights
Last updated: April 19, 2026
Application No. 17/749,194

ONLINE CONTINUAL LEARNING METHOD AND SYSTEM

Final Rejection — §103, §112
Filed: May 20, 2022
Examiner: RIFKIN, BEN M
Art Unit: 2123
Tech Center: 2100 — Computer Architecture & Software
Assignee: Macronix International Co. Ltd.
OA Round: 2 (Final)
Grant Probability: 44% (Moderate)
Expected OA Rounds: 3-4
To Grant: 4y 12m
With Interview: 59%

Examiner Intelligence

Career Allow Rate: 44% (139 granted / 317 resolved; -11.2% vs TC avg)
Interview Lift: strong, +15.6% for resolved cases with interview
Avg Prosecution: 4y 12m (typical timeline); 38 applications currently pending
Total Applications: 355 across all art units (career history)

Statute-Specific Performance

§101: 21.8% allow rate (-18.2% vs TC avg)
§103: 42.8% allow rate (+2.8% vs TC avg)
§102: 7.8% allow rate (-32.2% vs TC avg)
§112: 18.1% allow rate (-21.9% vs TC avg)

Comparisons are against the Tech Center average estimate. Based on career data from 317 resolved cases.

Office Action

§103, §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

The instant application, Application No. 17/749,194, has a total of 10 claims pending.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1-10 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.

As per claims 1 and 6, these claims call for “applying color-jitter to obtain the plurality of view data.” However, there is no support for this within the specification. First, the term “jitter” is never used in the specification. The only time color is discussed at all is in paragraph 0025, and that is in reference to color distortion, which, as far as the Examiner can tell, simply involves changing the color (“painted by yellow color” or “painted by red color”). Color jitter is a specific technique, and nothing in the specification appears to meet its definition; the current claim amendments are therefore unsupported by the specification and rejected as new matter under 35 U.S.C. 112(a).

As per claims 2-5 and 7-10, these claims are rejected as dependent on a claim rejected under 35 U.S.C. 112(a) for new matter.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claims 1-4 and 6-9 are rejected under 35 U.S.C. 103 as being unpatentable over Graepel et al. (US 20050097068 A1) in view of Caron et al. (“Unsupervised Pre-Training of Image Features on Non-Curated Data”) and Fu et al. (US 10482603 B1).

As per claims 1 and 6, Graepel discloses “an online continual learning method including: receiving a plurality of training data of a class under recognition” (pp. 1-2, particularly paragraph 0019; EN: this denotes receiving training data of different numbers, each different number being a class); “applying a discrete and deterministic augmentation operation on the plurality of training data of the class under recognition to generate a plurality of intermediate classes” (p. 2, particularly paragraph 0023; EN: this denotes transforming the data, including rotation; each intermediate class is the transformed versions of the original class, and the specification explicitly states that rotation is a discrete and deterministic operation); “extracting a plurality of characteristic vectors from the … data” (p. 2, particularly paragraphs 0026-0027; EN: this denotes having the training data features placed into vectors, with the invariant transforms being part of the training objects); and “training a model based on a plurality of feature vectors” (p. 2, particularly paragraph 0024; EN: this denotes using the training data to train a classifier).

However, Graepel fails to explicitly disclose “generating a plurality of view data from the intermediate classes by randomly cropping the intermediate classes and applying color-jitter to obtain the plurality of view data” and “view data.” Caron discloses “generating a plurality of view data from the intermediate classes … to obtain the plurality of view data” and “view data” (p. 2962, particularly section 4.1 and figure 2; EN: this denotes clustering the rotated images, with figure 2 showing visible representations of the rotations, i.e., view data). Fu discloses “by randomly cropping the intermediate classes and applying color-jitter” (C15, particularly L42-51; EN: this denotes augmenting the training data set by color jitter and random cropping).

Graepel and Caron are analogous art because both involve image-based machine learning. Before the effective filing date, it would have been obvious to one skilled in the art of image-based machine learning to combine the work of Graepel and Caron in order to have a view of rotated images during the machine learning process. The motivation for doing so would be to find “good features [that] can be obtained when training a convnet to discriminate between different image rotations” (Caron, p. 2961, C1, Rotation as self-supervision section) or, in the case of Graepel, to allow users of the system to see and understand the rotations that they will be working with.

Fu and Graepel modified by Caron are analogous art because both involve image-based machine learning. Before the effective filing date, it would have been obvious to one skilled in the art of image-based machine learning to combine the work of Fu and Graepel modified by Caron in order to further augment the training data with color jitter and random cropping. The motivation for doing so would be to allow the “datasets used for testing and/or training the neural network [to be] augmented in various ways” (Fu, C15, L42-52) or, in the case of Graepel modified by Caron, to allow the system to manipulate the images in the training data to provide better generalization for the training of the neural network.

As per claims 2 and 7, Graepel discloses “training the model based on the … vectors” (p. 2, particularly paragraph 0024; EN: this denotes using the training data to train a classifier).
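For readers less familiar with the augmentation terms at issue in claims 1 and 6, the recited pipeline (a deterministic rotation step producing intermediate classes, followed by random cropping and stochastic color jitter producing view data) can be sketched in a few lines of NumPy. This is an editorial illustration only; the function names and parameters are invented for the example and are not taken from the application or the cited references.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_intermediate_classes(images):
    """Discrete, deterministic augmentation: rotating every image of a
    class by k * 90 degrees (k = 0..3) yields four intermediate classes."""
    return {k: [np.rot90(img, k) for img in images] for k in range(4)}

def random_crop(img, size):
    """Crop a (H, W, C) image to (size, size, C) at a random offset."""
    h, w = img.shape[:2]
    top = rng.integers(0, h - size + 1)
    left = rng.integers(0, w - size + 1)
    return img[top:top + size, left:left + size]

def color_jitter(img, strength=0.4):
    """Stochastic color jitter: random brightness and contrast scaling.
    Jitter is random by definition, unlike deterministically repainting
    a region with a fixed color."""
    brightness = 1.0 + rng.uniform(-strength, strength)
    contrast = 1.0 + rng.uniform(-strength, strength)
    mean = img.mean()
    return np.clip((img * brightness - mean) * contrast + mean, 0.0, 1.0)

# Toy batch: two 8x8 RGB images of one class, values in [0, 1].
images = [rng.random((8, 8, 3)) for _ in range(2)]
intermediate = make_intermediate_classes(images)   # 4 intermediate classes
views = [color_jitter(random_crop(img, 6))
         for imgs in intermediate.values() for img in imgs]
print(len(views))  # 8 views, each of shape (6, 6, 3)
```

The distinction the §112 rejection turns on is visible here: `make_intermediate_classes` is deterministic (same input, same output), while `random_crop` and `color_jitter` draw fresh random parameters on every call.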
However, Graepel fails to explicitly disclose “projecting the characteristic vectors to generate a plurality of output characteristic vectors” and “based on the output characteristic vectors, wherein the output characteristic vectors from the same intermediate class are attracted to each other, while the output characteristic vectors from the different intermediate class are repelled from each other.” Caron discloses “projecting the characteristic vectors to generate a plurality of output characteristic vectors” (p. 2961, section 3; EN: this denotes finding and selecting features that are relevant, as opposed to all possible features) and “based on the output characteristic vectors, wherein the output characteristic vectors from the same intermediate class are attracted to each other, while the output characteristic vectors from the different intermediate class are repelled from each other” (p. 2962, particularly section 4.1; EN: this denotes clustering the vectors; similar vectors will form a cluster of the intermediate class and be repelled from other clusters).

Graepel and Caron are analogous art because both involve image-based machine learning. Before the effective filing date, it would have been obvious to one skilled in the art of image-based machine learning to combine the work of Graepel and Caron in order to group similar image vectors and repel different image vectors. The motivation for doing so would be that the approach “outperforms both RotNet and DeepCluster” (Caron, p. 2965, C1, Comparison with RotNet and DeepCluster) and provides effective classification (see Caron, p. 2965, Table 3) or, in the case of Graepel, allows users of the system to use the clustering method to improve the training of machine learning algorithms involving images.
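The attract/repel behavior recited in claims 2 and 7 has the shape of a contrastive objective. The sketch below uses a generic supervised-contrastive-style loss; it is a well-known formulation offered for illustration, not the exact method of the application or of Caron.

```python
import numpy as np

def supcon_loss(z, labels, temperature=0.5):
    """Contrastive loss over projected output vectors: vectors sharing
    an intermediate-class label are attracted (high similarity rewarded),
    vectors from different intermediate classes are repelled."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # unit-normalize
    sim = (z @ z.T) / temperature                      # pairwise cosine similarity
    n = len(z)
    eye = np.eye(n, dtype=bool)
    exp_sim = np.exp(sim) * ~eye                       # exclude self-pairs
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))
    pos = (labels[:, None] == labels[None, :]) & ~eye  # same-class pairs
    return float((-(log_prob * pos).sum(axis=1) / pos.sum(axis=1)).mean())

labels = np.array([0, 0, 1, 1])
# Same points, but only "tight" groups same-class vectors together.
tight = np.array([[1.0, 0.0], [0.9, 0.1], [-1.0, 0.0], [-0.9, -0.1]])
mixed = np.array([[1.0, 0.0], [-1.0, 0.0], [0.9, 0.1], [-0.9, -0.1]])
print(supcon_loss(tight, labels) < supcon_loss(mixed, labels))  # True
```

Minimizing this loss pulls same-class projections together on the unit sphere and pushes different-class projections apart, which is exactly the claimed attraction/repulsion of output characteristic vectors.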
As per claims 3 and 8, Caron discloses “projecting the characteristic vectors into another dimension space” (p. 2961, section 3; EN: this denotes finding and selecting features that are relevant, as opposed to all possible features).

As per claims 4 and 9, Graepel discloses “performing either rotation or permutation on the plurality of training data of the class under recognition to generate the plurality of intermediate classes” (p. 2, particularly paragraph 0023; EN: this denotes transforming the data, including rotation; each intermediate class is the transformed versions of the original class, and the specification explicitly states that rotation is a discrete and deterministic operation).

Claim Rejections - 35 USC § 103

Claims 5 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Graepel et al. (US 20050097068 A1) in view of Caron et al. (“Unsupervised Pre-Training of Image Features on Non-Curated Data”) and Fu et al. (US 10482603 B1), and further in view of Luo et al. (US 20140286533 A1).

As per claims 5 and 10, Graepel discloses “performing classification by the model” (p. 1, particularly paragraph 0008; EN: this denotes classifying with the classifier).
Graepel fails to explicitly disclose “performing weight-aware balanced sampling on the characteristic vectors to dynamically adjust a data sampling rate of the class under recognition” and “performing cross entropy on a class result from the model to train the model.” Luo discloses “performing weight-aware balanced sampling on the characteristic vectors to dynamically adjust a data sampling rate of the class under recognition” (p. 4, particularly paragraph 0041; EN: this denotes sampling different classes at different rates based upon the availability of training data, so larger sets will be weighted less, i.e., weight-aware balanced sampling).

Graepel and Luo are analogous art because both involve machine learning. Before the effective filing date, it would have been obvious to one skilled in the art of machine learning to combine the work of Graepel and Luo in order to balance sampling in machine learning. The motivation for doing so would be to “deal with unbalanced training data” when training a machine learning algorithm (Luo, p. 4, paragraph 0041).

While Graepel fails to explicitly disclose “performing cross entropy on a class result from the model to train the model,” the Examiner takes Official Notice that the use of cross entropy to improve machine learning was well known to one of ordinary skill in the art at the time of filing, in order to determine the differences in probability values between the output of the model and potential class labels and to use this data to improve training and generalization of the model. Applicant’s failure to traverse the Official Notice has led to these limitations being applicant-admitted prior art under MPEP 2144.03(C).
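Both limitations addressed here, weight-aware balanced sampling and cross-entropy training, are standard techniques, which is why the latter was amenable to Official Notice. A generic sketch of both, with invented helper names, for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

def balanced_sampling_weights(labels):
    """Weight-aware balanced sampling: each sample's draw probability is
    inversely proportional to its class size, so over-represented classes
    are weighted less and every class is drawn at a similar overall rate."""
    classes, counts = np.unique(labels, return_counts=True)
    inv = dict(zip(classes, 1.0 / counts))
    w = np.array([inv[y] for y in labels])
    return w / w.sum()

def cross_entropy(probs, labels):
    """Mean cross entropy between predicted class probabilities and the
    true labels: the log-space gap between the model's output
    distribution and the correct class."""
    return float(-np.log(probs[np.arange(len(labels)), labels]).mean())

# Imbalanced toy set: 8 samples of class 0, 2 of class 1.
labels = np.array([0] * 8 + [1] * 2)
p = balanced_sampling_weights(labels)
draws = rng.choice(len(labels), size=10_000, p=p)
print((labels[draws] == 1).mean())  # close to 0.5 despite the 4:1 imbalance
```

For the cross-entropy half: `cross_entropy(np.array([[0.9, 0.1]]), np.array([0]))` yields a small penalty that shrinks toward zero as the predicted probability of the true class approaches 1, which is the training signal the Official Notice refers to.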
Response to Arguments

Applicant’s arguments with respect to claims 1-10 have been considered but are moot in view of the new ground(s) of rejection.

Conclusion

Applicant’s amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BEN M RIFKIN, whose telephone number is (571) 272-9768. The examiner can normally be reached Monday-Friday, 9 am - 5 pm. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alexey Shmatov, can be reached at (571) 270-3428. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/BEN M RIFKIN/
Primary Examiner, Art Unit 2123

Prosecution Timeline

May 20, 2022: Application Filed
Jul 08, 2025: Non-Final Rejection — §103, §112
Aug 21, 2025: Examiner Interview Summary
Aug 21, 2025: Applicant Interview (Telephonic)
Sep 29, 2025: Response Filed
Nov 21, 2025: Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12541685: SEMI-SUPERVISED LEARNING OF TRAINING GRADIENTS VIA TASK GENERATION (granted Feb 03, 2026; 2y 5m to grant)
Patent 12455778: SYSTEMS AND METHODS FOR DATA STREAM SIMULATION (granted Oct 28, 2025; 2y 5m to grant)
Patent 12236335: SYSTEM AND METHOD FOR TIME-DEPENDENT MACHINE LEARNING ARCHITECTURE (granted Feb 25, 2025; 2y 5m to grant)
Patent 12223418: COMMUNICATING A NEURAL NETWORK FEATURE VECTOR (NNFV) TO A HOST AND RECEIVING BACK A SET OF WEIGHT VALUES FOR A NEURAL NETWORK (granted Feb 11, 2025; 2y 5m to grant)
Patent 12106207: NEURAL NETWORK COMPRISING SPINTRONIC RESONATORS (granted Oct 01, 2024; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 44% (59% with interview, +15.6%)
Median Time to Grant: 4y 12m
PTA Risk: Moderate

Based on 317 resolved cases by this examiner. Grant probability derived from career allow rate.
