Prosecution Insights
Last updated: April 18, 2026
Application No. 18/207,459

INFORMATION PROCESSING APPARATUS, DETERMINATION METHOD, AND STORAGE MEDIUM

Non-Final OA — §101, §102, §103
Filed
Jun 08, 2023
Examiner
TANK, ANDREW L
Art Unit
2141
Tech Center
2100 — Computer Architecture & Software
Assignee
NEC Corporation
OA Round
1 (Non-Final)
68%
Grant Probability
Favorable
1-2
OA Rounds
4y 0m
To Grant
99%
With Interview

Examiner Intelligence

Grants 68% — above average
68%
Career Allow Rate
366 granted / 538 resolved
+13.0% vs TC avg
Strong +31% interview lift
+31.2%
Interview Lift
allow rate with vs. without interview, among resolved cases
Typical timeline
4y 0m
Avg Prosecution
43 currently pending
Career history
581
Total Applications
across all art units
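
The headline figures above follow from the raw counts. A minimal sketch of the arithmetic (the counts 366/538 and the +31.2-point interview lift are taken from this report; the helper name is mine):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

# Figures from this report: 366 grants out of 538 resolved cases.
career = allow_rate(366, 538)
print(f"Career allow rate: {career:.0f}%")  # ~68%

# The interview lift is reported as an additive bump in allow rate
# for resolved cases with an interview vs. without one.
with_interview = min(career + 31.2, 100.0)
print(f"Estimated with interview: {with_interview:.0f}%")  # ~99%
```

This treats the lift as a simple additive offset capped at 100%, which is how the report's 68% → 99% figures reconcile.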

Statute-Specific Performance

§101: 12.0% (-28.0% vs TC avg)
§102: 28.6% (-11.4% vs TC avg)
§103: 37.5% (-2.5% vs TC avg)
§112: 13.5% (-26.5% vs TC avg)
Tech Center averages are estimates • Based on career data from 538 resolved cases
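
Each statute row pairs the examiner's allowance rate with its offset from the Tech Center average; working backwards, all four offsets imply the same TC baseline of about 40%. An illustrative sketch (the percentages are from the table above; the dictionary structure is mine):

```python
# Examiner allowance rate (%) by rejection statute, from the table above.
examiner = {"101": 12.0, "102": 28.6, "103": 37.5, "112": 13.5}
# Reported offsets vs. the Tech Center average (%).
delta = {"101": -28.0, "102": -11.4, "103": -2.5, "112": -26.5}

# Recover the implied TC baseline for each statute: rate minus offset.
for statute in examiner:
    tc_avg = examiner[statute] - delta[statute]
    print(f"§{statute}: examiner {examiner[statute]}% vs TC avg {tc_avg:.1f}%")
```

Every row recovers a 40.0% baseline, which is consistent with a single TC-wide average being used for all four comparisons.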

Office Action

§101 §102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. The following action is in response to the original filing of 06/08/2023. Claims 1-9 are pending and have been considered below.

Specification

The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed. The following title is suggested: “Inference model selection based on determined inference difficulty of input time series data”.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-9 are rejected under 35 U.S.C. 101 because the claimed invention is directed to abstract ideas without significantly more.

Regarding claims 1, 3 and 7:

Step 1, MPEP 2106.03: These limitations have been determined, under Step 1, to be statutory categories of invention:
An information processing apparatus comprising at least one processor [..] (claim 1)
An information processing apparatus comprising at least one processor [..] (claim 3)
A determination method [..] (claim 7)

Step 2A Prong One, MPEP 2106.04, 2106.04(a): These limitations represent, under Step 2A Prong One, mathematical concepts such as mathematical relationships, mathematical formulas or equations, or mathematical calculations, MPEP 2106.04(a)(2)(I):
calculating, on the basis of input data constituting a time series, difficulty in inference carried out by inputting at least one piece of the input data to a first-stage inference model [..] (claims 1, 7)
calculating, on the basis of training data constituting a time series, difficulty in inference carried out by inputting at least one piece of the training data to a first-stage inference model [..] (claim 3)

Step 2A Prong Two, MPEP 2106.04(d): These limitations represent, under Step 2A Prong Two, mere instructions to implement the abstract idea using generic computing tools, MPEP 2106.05(f):
[..] the at least one processor carrying out: [..] (claims 1 and 3)
[..] (a) and (b) each being carried out by at least one processor [..] (claim 7)
[..] a first difficulty calculation process for calculating [..] (claims 1, 3)
[..] a first determination process for determining [..] (claims 1, 3)

These limitations represent, under Step 2A Prong Two, mere instructions to apply at a high level of generality, MPEP 2106.05:
[..] multiple-stage inference models which are configured such that use of a later-stage inference model achieves higher inference accuracy [..] (claims 1, 3 and 7)
[..] determining, on the basis of the difficulty, whether a second- or later-stage inference model will be used for inference with use of the input data [..] (claims 1 and 7)
[..] determining, on the basis of the difficulty, whether the training data will be used for learning of a second- or later-stage inference model [..] (claim 3)

Step 2B, MPEP 2106.05: These limitations are considered, under Step 2B, insignificant extra-solution activity as being recited at a high level of generality, MPEP 2106.05(d):
[..] the at least one processor carrying out: [..] (claims 1 and 3)
[..] (a) and (b) each being carried out by at least one processor [..] (claim 7)
[..] a first difficulty calculation process for calculating [..] (claims 1, 3)
[..] a first determination process for determining [..] (claims 1, 3)

These limitations are considered, under Step 2B, mere instructions to apply to obtain a solution/outcome, MPEP 2106.05(f):
[..] determining, on the basis of the difficulty, whether a second- or later-stage inference model will be used for inference with use of the input data [..] (claims 1 and 7)
[..] determining, on the basis of the difficulty, whether the training data will be used for learning of a second- or later-stage inference model [..] (claim 3)
[..] multiple-stage inference models which are configured such that use of a later-stage inference model achieves higher inference accuracy [..] (claims 1, 3 and 7)

Regarding claim 2:

Step 1, MPEP 2106.03: Analysis of respective parent is incorporated.

Step 2A Prong One, MPEP 2106.04, 2106.04(a): Analysis of respective parent is incorporated. These limitations represent, under Step 2A Prong One, mathematical concepts such as mathematical relationships, mathematical formulas or equations, or mathematical calculations, MPEP 2106.04(a)(2)(I):
[..] and in the first difficulty calculation process, the at least one processor calculates, as a value indicative of the difficulty in inference, a prediction error in the first data prediction process [..]

Step 2A Prong Two, MPEP 2106.04(d): These limitations represent, under Step 2A Prong Two, mere instructions to apply at a high level of generality, MPEP 2106.05:
[..] wherein the at least one processor carries out a first data prediction process for predicting the input data that is input to the first-stage inference model [..]
[..] the input data being predicted from past input data that is chronologically earlier than the input data [..]

Step 2B, MPEP 2106.05: These limitations are considered, under Step 2B, mere instructions to apply to obtain a solution/outcome, MPEP 2106.05(f):
[..] wherein the at least one processor carries out a first data prediction process for predicting the input data that is input to the first-stage inference model [..]
[..] the input data being predicted from past input data that is chronologically earlier than the input data [..]
Regarding claim 4:

Step 1, MPEP 2106.03: Analysis of respective parent is incorporated.

Step 2A Prong One, MPEP 2106.04, 2106.04(a): Analysis of respective parent is incorporated.

Step 2A Prong Two, MPEP 2106.04(d): These limitations represent, under Step 2A Prong Two, mere instructions to implement the abstract idea using generic computing tools, MPEP 2106.05(f):
[..] the multiple-stage inference models are generated on the basis of a single multilayer neural network model [..]
[..] the at least one processor carries out, in learning of the second- or later-stage inference model, an inference model learning process [..]

These limitations represent, under Step 2A Prong Two, mere instructions to apply at a high level of generality, MPEP 2106.05:
[..] an inference model learning process for also updating a weighting value of an earlier-stage inference model [..]

Step 2B, MPEP 2106.05: These limitations are considered, under Step 2B, insignificant extra-solution activity as being recited at a high level of generality, MPEP 2106.05(d):
[..] the multiple-stage inference models are generated on the basis of a single multilayer neural network model [..]
[..] the at least one processor carries out, in learning of the second- or later-stage inference model, an inference model learning process [..]

These limitations are considered, under Step 2B, mere instructions to apply to obtain a solution/outcome, MPEP 2106.05(f):
[..] an inference model learning process for also updating a weighting value of an earlier-stage inference model [..]

Regarding claim 5:

Step 1, MPEP 2106.03: Analysis of respective parent is incorporated.

Step 2A Prong One, MPEP 2106.04, 2106.04(a): Analysis of respective parent is incorporated. These limitations represent, under Step 2A Prong One, mathematical concepts such as mathematical relationships, mathematical formulas or equations, or mathematical calculations, MPEP 2106.04(a)(2)(I):
[..] and in the first difficulty calculation process, the at least one processor calculates, as a value indicative of the difficulty in inference with use of the training data at the certain time point, a prediction error in the first data prediction process [..]

Step 2A Prong Two, MPEP 2106.04(d): These limitations represent, under Step 2A Prong Two, mere instructions to apply at a high level of generality, MPEP 2106.05:
[..] wherein the at least one processor carries out a first data prediction process for using a first prediction model to predict training data at a certain time point among the training data constituting the time series [..]
[..] the training data at the certain time point being predicted from training data at a time point chronologically earlier than the certain time point [..]
[..] a first prediction model learning process, the first prediction model learning process being a process, carried out by learning with use of the training data constituting the time series, for updating the first prediction model so that the prediction error is decreased [..]

Step 2B, MPEP 2106.05: These limitations are considered, under Step 2B, mere instructions to apply to obtain a solution/outcome, MPEP 2106.05(f):
[..] wherein the at least one processor carries out a first data prediction process for using a first prediction model to predict training data at a certain time point among the training data constituting the time series [..]
[..] the training data at the certain time point being predicted from training data at a time point chronologically earlier than the certain time point [..]
[..] a first prediction model learning process, the first prediction model learning process being a process, carried out by learning with use of the training data constituting the time series, for updating the first prediction model so that the prediction error is decreased [..]

Regarding claim 6:

Step 1, MPEP 2106.03: Analysis of respective parent is incorporated.
Step 2A Prong One, MPEP 2106.04, 2106.04(a): Analysis of respective parent is incorporated.

Step 2A Prong Two, MPEP 2106.04(d): These limitations represent, under Step 2A Prong Two, mere instructions to apply at a high level of generality, MPEP 2106.05:
[..] determines, in the first determination process, that the training data in which the difficulty exceeds a first threshold will be used for learning of the second- or later-stage inference model [..]
[..] carries out a first threshold updating process for updating the first threshold on the basis of a plurality of results of inference that are obtained by inputting, to the first-stage inference model, the training data constituting the time series [..]

Step 2B, MPEP 2106.05: These limitations are considered, under Step 2B, mere instructions to apply to obtain a solution/outcome, MPEP 2106.05(f):
[..] determines, in the first determination process, that the training data in which the difficulty exceeds a first threshold will be used for learning of the second- or later-stage inference model [..]
[..] carries out a first threshold updating process for updating the first threshold on the basis of a plurality of results of inference that are obtained by inputting, to the first-stage inference model, the training data constituting the time series [..]

Regarding claims 8 and 9:

Step 1, MPEP 2106.03: Analysis of respective parent is incorporated.

Step 2A Prong One, MPEP 2106.04, 2106.04(a): Analysis of respective parent is incorporated.

Step 2A Prong Two, MPEP 2106.04(d): These limitations represent, under Step 2A Prong Two, mere instructions to implement the abstract idea using generic computing tools, MPEP 2106.05(f):
[..] A computer-readable non-transitory storage medium storing therein a determination program for causing a computer to function as an information processing apparatus [..] the determination program causing the computer to carry out the first difficulty calculation process and the first determination process [..] (claims 8 and 9)

Step 2B, MPEP 2106.05: These limitations are considered, under Step 2B, insignificant extra-solution activity as being recited at a high level of generality, MPEP 2106.05(d):
[..] A computer-readable non-transitory storage medium storing therein a determination program for causing a computer to function as an information processing apparatus [..] the determination program causing the computer to carry out the first difficulty calculation process and the first determination process [..] (claims 8 and 9)

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-4 and 6-9 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Zhang et al., US 2023/0297852 A1, effective filing date of 07/29/2020 [“ZHANG”].
Regarding claim 1, ZHANG discloses an information processing apparatus comprising at least one processor (¶7), the at least one processor carrying out: a first difficulty calculation process for calculating, on the basis of input data constituting a time series (¶79-80: time series data input from, e.g., sensors, audio or video), difficulty in inference carried out by inputting at least one piece of the input data to a first-stage inference model among multiple-stage inference models which are configured such that use of a later-stage inference model achieves higher inference accuracy (¶38: calculating whether input data is easier to process or complicated by a lightweight first-stage model of the multi-stage models; a later-stage synthesized specialist combined model generates a final prediction; ¶34: accuracy of final predictions by the combined model is higher); and a first determination process for determining, on the basis of the difficulty, whether a second- or later-stage inference model will be used for inference with use of the input data (¶38: when the input is deemed complicated, the later-stage synthesized specialist combined model is used to make the final inference prediction).

Regarding claim 2, ZHANG discloses the information processing apparatus according to claim 1, wherein the at least one processor carries out a first data prediction process for predicting the input data that is input to the first-stage inference model (¶29, ¶35, ¶94: initial prediction of the lightweight model), the input data being predicted from past input data that is chronologically earlier than the input data (¶108-110: training data for training the machine-learned prediction lightweight model, i.e. earlier data than the received input to be predicted), and in the first difficulty calculation process, the at least one processor calculates, as a value indicative of the difficulty in inference, a prediction error in the first data prediction process (¶29, ¶35, ¶94: determined confidence value of the initial prediction).

Regarding claim 3, ZHANG discloses an information processing apparatus comprising at least one processor, the at least one processor carrying out: a first difficulty calculation process for calculating, on the basis of training data constituting a time series (¶79-80: time series data input from, e.g., sensors, audio or video), difficulty in inference carried out by inputting at least one piece of the training data to a first-stage inference model among multiple-stage inference models which are configured such that use of a later-stage inference model achieves higher inference accuracy (¶109-110: machine-learned training prediction model is trained using the training input and generates an optional initial prediction based on the input; ¶38: initial prediction; ¶34: accuracy of the combined model is higher); and a first determination process for determining, on the basis of the difficulty, whether the training data will be used for learning of a second- or later-stage inference model (¶110: machine-learned prediction model generates a plurality of combination values, based on the input training data, for synthesizing the combined model).

Regarding claim 4, ZHANG discloses the information processing apparatus according to claim 3, wherein the multiple-stage inference models are generated on the basis of a single multilayer neural network model (¶33), and the at least one processor carries out, in learning of the second- or later-stage inference model, an inference model learning process for also updating a weighting value of an earlier-stage inference model (¶68-70, ¶121).
Regarding claim 6, ZHANG discloses the information processing apparatus according to claim 3, wherein the at least one processor determines, in the first determination process, that the training data in which the difficulty exceeds a first threshold will be used for learning of the second- or later-stage inference model (¶24, ¶29), and the at least one processor carries out a first threshold updating process for updating the first threshold on the basis of a plurality of results of inference that are obtained by inputting, to the first-stage inference model, the training data constituting the time series (¶24, ¶120: single hyperparameter confidence threshold is adjusted/modified).

Regarding claims 7 and 8, claims 7 and 8 recite limitations similar to claim 1 and are similarly rejected.

Regarding claim 9, claim 9 recites limitations similar to claim 3 and is similarly rejected.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over ZHANG in view of Sim, US 2019/0385055 A1, published 12/19/2019 [“SIM”].
Regarding claim 5, ZHANG discloses the information processing apparatus according to claim 3, wherein the at least one processor carries out a first data prediction process for using a first prediction model to predict training data at a certain time point among the training data constituting the time series (¶120: initial prediction of the machine-learned prediction model for input data, by comparing the initial prediction to ground truth), in the first difficulty calculation process, the at least one processor calculates, as a value indicative of difficulty in inference with use of the training data at the certain time point, a prediction error in the first data prediction process (¶120: loss term value), and the at least one processor carries out a first prediction model learning process, the first prediction model learning process being a process, carried out by learning with use of the training data constituting the time series, for updating the first prediction model so that the prediction error is decreased (¶121: performing backpropagation with the loss term on the model).

ZHANG fails to explicitly disclose wherein the training data at the certain time point is predicted from training data at a time point chronologically earlier than the certain time point.

SIM discloses methods for learning machine models for data prediction. In particular, SIM discloses that it is well known that, in time series data sets, future time points are predicted from past time points in the data (¶4). Accordingly, it would have been obvious to one having ordinary skill in the art, with the teachings of ZHANG and SIM before them, to apply the known feature of predicting future time points of time series data from past time series data, as suggested by SIM, when applying the prediction process to the certain point of time series data of ZHANG. One would have been motivated to make this application to enhance prediction accuracy of the models, as suggested by SIM (¶9).
Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

Padmanabha Iyer, Anand et al., US 12511219 B2, DEEP NEURAL NETWORKS (DNN) INFERENCE USING PRACTICAL EARLY EXIT NETWORKS
Ezrielev, Ofir et al., US 12412108 B2, SYSTEM AND METHOD FOR INFERENCE GENERATION VIA OPTIMIZATION OF INFERENCE MODEL PORTIONS
Li, Zhaoyue et al., US 12236362 B2, INFERENCE COMPUTING APPARATUS, MODEL TRAINING APPARATUS, INFERENCE COMPUTING SYSTEM
Schuster, Tal et al., US 11886976 B1, EFFICIENT DECODING OF OUTPUT SEQUENCES USING ADAPTIVE EARLY EXITING
Shelhamer, Evan et al., US 11763545 B2, GENERATING CONFIDENCE-ADAPTIVE PIXEL-LEVEL PREDICTIONS UTILIZING A MULTI-EXIT PIXEL-LEVEL PREDICTION NEURAL NETWORK
Achin, Jeremy et al., US 10496927 B2, TIME-SERIES PREDICTIVE DATA ANALYTICS
Nicotera, Paul et al., US 20230142161 A1, RESPONSE ABSTRACTION AND MODEL SIMPLIFICATION TO IDENTIFY INTERESTING DATA
Kouris, Alexandros et al., US 20230128637 A1, METHOD AND APPARATUS FOR IMAGE SEGMENTATION
Nakata, Yohei et al., US 20230117180 A1, INFORMATION PROCESSING METHOD AND INFORMATION PROCESSING SYSTEM
Akbari, Mohammad et al., US 20230110925 A1, SYSTEM AND METHOD FOR UNSUPERVISED MULTI-MODEL JOINT REASONING
Chen, Hsing-Yu et al., US 20230085518 A1, VIDEO PROCESSING METHOD FOR DETECTING LOCATION, PIXELS, AND SKELETON OF OBJECT, AND ASSOCIATED VIDEO PROCESSING CIRCUIT
Nitta, Shuhei et al., US 20230056947 A1, LEARNING APPARATUS, METHOD, AND STORAGE MEDIUM
He, Runxin et al., US 20230052255 A1, OPTIMIZING MACHINE LEARNING MODELS
Guleryuz, Onar G. et al., US 20220405569 A1, OBJECT TRACKING IN IMAGES USING ENCODER-DECODER MODELS
Yonetani, Ryo et al., US 20220358749 A1, INFERENCE APPARATUS, INFERENCE METHOD, AND COMPUTER-READABLE STORAGE MEDIUM STORING AN INFERENCE PROGRAM
Chavoshi, Nikan et al., US 20220121955 A1, AUTOMATED MACHINE LEARNING PIPELINE FOR TIMESERIES DATASETS UTILIZING POINT-BASED ALGORITHMS
Matsumoto, Daisaku et al., US 20210209468 A1, LEARNING DEVICE, INFERENCE DEVICE, METHOD, AND PROGRAM
Itsumi, Hayato et al., US 20210201503 A1, IMAGE ANALYSIS APPARATUS, IMAGE ANALYSIS METHOD AND PROGRAM RECORDING MEDIUM
Murugesan, Sugumar et al., US 20210056386 A1, SELECTIVE USE OF DATA FOR PROBABILISTIC FORECASTING
Yaguchi, Atsushi et al., US 20210012228 A1, INFERENCE APPARATUS, LEARNING APPARATUS, INFERENCE METHOD, AND LEARNING METHOD
Malaya, Nicholas, US 20190005377 A1, ARTIFICIAL NEURAL NETWORK REDUCTION TO REDUCE INFERENCE COMPUTATION TIME
Beye, Florian, WO 2020250451 A1, TRANSFER LEARNING APPARATUS, TRANSFER LEARNING SYSTEM, METHOD OF TRANSFER LEARNING, AND STORAGE MEDIUM
Hirai, Riu et al., WO 2019116985 A1, CALCULATION SYSTEM, SERVER, AND IN-VEHICLE DEVICE
Taylor, Ben, et al. "Adaptive deep learning model selection on embedded systems." ACM SIGPLAN Notices 53.6 (2018): 31-43.
Yokoo, Shuhei, Satoshi Iizuka, and Kazuhiro Fukui. "MLSNet: Resource-efficient adaptive inference with multi-level segmentation networks." 2019 IEEE International Conference on Image Processing (ICIP). IEEE, 2019.
Marco, Vicent Sanz, et al. "Optimizing deep learning inference on embedded systems through adaptive model selection." ACM Transactions on Embedded Computing Systems (TECS) 19.1 (2020): 1-28.
Passalis, Nikolaos, et al. "Efficient adaptive inference for deep convolutional neural networks using hierarchical early exits." Pattern Recognition 105 (2020): 107346.
Wu, Zuxuan, et al. "A coarse-to-fine framework for resource efficient video recognition." International Journal of Computer Vision 129.11 (2021): 2965-2977.
Ghodrati, Amir, Babak Ehteshami Bejnordi, and Amirhossein Habibian. "FrameExit: Conditional early exiting for efficient video recognition." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2021.
Park, Junyong, Jong-Ryul Lee, and Yong-Hyuk Moon. "Improved early exiting activation to accelerate edge inference." 2021 International Conference on Information and Communication Technology Convergence (ICTC). IEEE, 2021.
Lahiany, Assaf, and Yehudit Aperstein. "PTEENet: Post-trained early-exit neural networks augmentation for inference cost optimization." IEEE Access 10 (2022): 69680-69687.
Gómez-Carmona, Oihane, et al. "Optimizing computational resources for edge intelligence through model cascade strategies." IEEE Internet of Things Journal 9.10 (2021): 7404-7417.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANDREW L TANK, whose telephone number is (571) 270-1692. The examiner can normally be reached Monday-Thursday, 9a-6p.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Matthew Ell, can be reached at 571-270-3264. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ANDREW L TANK/
Primary Examiner, Art Unit 2141

/Mariela Reyes/
Supervisory Patent Examiner, Art Unit 2142

Prosecution Timeline

Jun 08, 2023
Application Filed
Mar 30, 2026
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12585381
ADVANCED KEYBOARD BASED SEARCH
2y 5m to grant Granted Mar 24, 2026
Patent 12585730
MANAGING MACHINE LEARNING MODELS
2y 5m to grant Granted Mar 24, 2026
Patent 12579479
COUNTERFACTUAL SAMPLES FOR MAINTAINING CONSISTENCY BETWEEN MACHINE LEARNING MODELS
2y 5m to grant Granted Mar 17, 2026
Patent 12566998
SYSTEM, METHODS, AND PROCESSES FOR MODEL PERFORMANCE AGGREGATION
2y 5m to grant Granted Mar 03, 2026
Patent 12555037
MODEL MANAGEMENT DEVICE AND MODEL MANAGING METHOD
2y 5m to grant Granted Feb 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
68%
Grant Probability
99%
With Interview (+31.2%)
4y 0m
Median Time to Grant
Low
PTA Risk
Based on 538 resolved cases by this examiner. Grant probability derived from career allow rate.
