Prosecution Insights
Last updated: April 19, 2026
Application No. 18/201,999

PREDICTION MODEL GENERATION APPARATUS, PREDICTION APPARATUS, PREDICTION MODEL GENERATION METHOD, PREDICTION METHOD, AND PROGRAM

Non-Final OA: §101, §102
Filed: May 25, 2023
Examiner: GRUSZKA, DANIEL PATRICK
Art Unit: 2121
Tech Center: 2100 — Computer Architecture & Software
Assignee: NEC Corporation
OA Round: 1 (Non-Final)
Grant Probability: Favorable
OA Rounds: 1-2
To Grant: 3y 3m

Examiner Intelligence

Career Allow Rate: 0% (grants only 0% of cases; 0 granted / 0 resolved; -55.0% vs TC avg)
Interview Lift: +0.0% (minimal lift; resolved cases with interview)
Avg Prosecution: 3y 3m (typical timeline)
Career History: 32 total applications across all art units; 32 currently pending

Statute-Specific Performance

§101: 38.3% (-1.7% vs TC avg)
§103: 42.3% (+2.3% vs TC avg)
§102: 12.0% (-28.0% vs TC avg)
§112: 7.4% (-32.6% vs TC avg)
Tech Center average shown as an estimate • Based on career data from 0 resolved cases

Office Action

§101 §102
Notice of Pre-AIA or AIA Status This Non-Final communication is in response to application No. 18/201,999 filed on 5/25/2023. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA . Priority Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119 (a)-(d). The certified copy has been filed in parent Application No. 18/201,999, filed on 5/25/2023. Claim Rejections - 35 USC § 101 35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title. Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. 101 Subject Matter Eligibility analysis Step 1: Claims 1-20 are within the four statutory categories (a process, machine, manufacture or composition of matter.) Claims 1-11 describe a process, and claims 12-20 describe a machine. With respect to claim 1: Step 2A Prong 1: The claim recites an abstract idea enumerated in the 2019 PEG. a contribution degree calculation process of calculating, with use of a test data set different from a training data set used in training of a prediction model to be tested, a degree of contribution of each of a plurality of features to a prediction result, a value of the each of the plurality of features being inputted to the prediction model to be tested; (this is an abstract idea of a “mathematical concept”. The recited “calculating” represents a mathematical operation that would fall under the “mathematical concepts” grouping.) a feature selection process of selecting, on the basis of the degree of contribution of the each of the plurality of features, at least one feature from among the plurality of features; (This is an abstract idea of a "Mental Process." The "selecting" step under its broadest reasonable interpretation, covers concepts that can be practically performed in the human mind. The selection could be made manually by an individual.) Step 2A Prong 2: The judicial exception is not integrated into a practical application. Additional elements: a prediction model generation process of generating a new prediction model which, upon receiving input of a value of the at least one feature selected, outputs a prediction result. (This amounts to no more than mere instructions to “apply” the exception using a generic computer component.) Step 2B: the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. The additional element is recited in a generic level and they represent generic computer components to apply the abstract idea. Mere instructions to apply an exception cannot provide an inventive concept (MPEP 2106.05(f)). Therefore, claim 1 is ineligible. With respect to claim 2: Step 2A Prong 1: claim 2, which incorporates the rejection of claim 1, does not recite an abstract idea. Step 2A Prong 2: The judicial exception is not integrated into a practical application. 
in a case where the at least one feature selected is more than one feature, the at least one processor carries out the contribution degree calculation process, the feature selection process, and the prediction model generation process again with use of the new prediction model as a prediction model to be tested. (This amounts to no more than mere instructions to “apply” the exception using a generic computer component.) Step 2B: the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception The additional element is recited in a generic level and they represent generic computer components to apply the abstract idea. Mere instructions to apply an exception cannot provide an inventive concept (MPEP 2106.05(f)). Therefore, claim 2 is ineligible. With respect to claim 3: Step 2A Prong 1: claim 3, which incorporates the rejection of claim 1, recites an additional abstract idea: in the contribution degree calculation process, the degree of contribution of the each of the plurality of features is calculated on the basis of a difference between (i) an evaluation value of the prediction model to be tested corresponding to a case in which the value of the each of the plurality of features is changed in the test data set and (ii) an evaluation value of the prediction model to be tested corresponding to a case in which the value of the each of the plurality of features is not changed in the test data set. (this is an abstract idea of a “mathematical concept”. The recited “difference” represents a mathematical operation that would fall under the “mathematical concepts” grouping.) Step 2A Prong 2: claim 3 does not recite any additional elements and thus cannot be integrated into a practical application. Step 2B: claim 3 does not recite an additional element. Therefore, claim 3 is ineligible. With respect to claim 4: Step 2A Prong 1: claim 4, which incorporates the rejection of claim 3, recites an additional abstract idea: in the contribution degree calculation process, the value of the each of the plurality of features is changed such that values of the each of the plurality of features are randomly replaced with each other among a plurality of pieces of data included in the test data set. (this is an abstract idea of a “mathematical concept”. The recited “calculation process” represents mathematical operations that would fall under the “mathematical concepts” grouping.) Step 2A Prong 2: claim 4 does not recite any additional elements and thus cannot be integrated into a practical application. Step 2B: claim 4 does not recite an additional element. Therefore, claim 4 is ineligible. With respect to claim 5: Step 2A Prong 1: claim 5, which incorporates the rejection of claim 1, recites an additional abstract idea: in the contribution degree calculation process, a plurality of the degrees of contribution are calculated for the each of the plurality of features with use of a plurality of the test data sets; and (this is an abstract idea of a “mathematical concept”. The recited “calculation process” represents a mathematical operation that would fall under the “mathematical concepts” grouping.) in the feature selection process, a feature with respect to which a statistic obtained from the plurality of the degrees of contribution satisfies a predetermined condition is selected. (This is an abstract idea of a "Mental Process." 
The "selection process" step under its broadest reasonable interpretation, covers concepts that can be practically performed in the human mind. The selection could be made manually by an individual.) Step 2A Prong 2: claim 5 does not recite any additional elements and thus cannot be integrated into a practical application. Step 2B: claim 5 does not recite an additional element. Therefore, claim 5 is ineligible. With respect to claim 6: Step 2A Prong 1: claim 6, which incorporates the rejection of claim 5, recites an additional abstract idea: in the feature selection process, the following condition is applied as the predetermined condition: a value obtained by subtracting a standard deviation of the plurality of the degrees of contribution from an average value of the plurality of the degrees of contribution is not less than a threshold. (this is an abstract idea of a “mathematical concept”. The recited “subtracting” represents a mathematical operation that would fall under the “mathematical concepts” grouping.) Step 2A Prong 2: claim 6 does not recite any additional elements and thus cannot be integrated into a practical application. Step 2B: claim 6 does not recite an additional element. Therefore, claim 6 is ineligible. With respect to claim 7: Step 2A Prong 1: claim 7, which incorporates the rejection of claim 1, recites an additional abstract idea: a feature value calculation process of calculating, on the basis of information obtained from a target of prediction, a value of the at least one feature selected; (this is an abstract idea of a “mathematical concept”. The recited “calculation process” represents mathematical operations that would fall under the “mathematical concepts” grouping.) Step 2A Prong 2: The judicial exception is not integrated into a practical application. a prediction process of inputting, to the new prediction model, the calculated value of the at least one feature to thereby output a prediction result related to the target of prediction. (This amounts to no more than mere instructions to “apply” the exception using a generic computer component.) Step 2B: the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception The additional element is recited in a generic level and they represent generic computer components to apply the abstract idea. Mere instructions to apply an exception cannot provide an inventive concept (MPEP 2106.05(f)). Therefore, claim 7 is ineligible. With respect to claim 8: The claim recites similar limitations as corresponding to claim 1. Therefore, the same subject matter analysis that was utilized for claim 1, as described above, is equally applicable to claim 8. Therefore, claim 8 is ineligible. With respect to claim 9: The claim recites similar limitations as corresponding to claim 7. Therefore, the same subject matter analysis that was utilized for claim 7, as described above, is equally applicable to claim 9. Therefore, claim 9 is ineligible. With respect to claim 10: The claim recites similar limitations as corresponding to claim 1. Therefore, the same subject matter analysis that was utilized for claim 1, as described above, is equally applicable to claim 10. Therefore, claim 10 is ineligible. With respect to claim 11: The claim recites similar limitations as corresponding to claim 7. Therefore, the same subject matter analysis that was utilized for claim 7, as described above, is equally applicable to claim 11. Therefore, claim 11 is ineligible. 
Claim Rejections - 35 USC § 102 In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA ) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention. Claims 1-11 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Kormilitsin (US 2023/0334360 A1) Regarding claim 1, Kormilitsin teaches: A prediction model generation apparatus, comprising at least one processor, the at least one processor carrying out: ([0078] “The feature-selection system 1300 includes at least one I/O block 1304 that outputs some or all of the updated ranking 310 to a peripheral device (not shown). For example, the I/O block 1304 may output the one or more highest-ranked candidate features 102 of the updated candidate-feature ranking 310, thereby implementing the block 428 of the methods 400 and 500. The I/O block 1304 is connected to the system bus 1306 and therefore communicates with the processor 1302 and the memory 1308.”) a contribution degree calculation process of calculating, with use of a test data set different from a training data set used in training of a prediction model to be tested, a degree of contribution of each of a plurality of features to a prediction result, a value of the each of the plurality of features being inputted to the prediction model to be tested; ([0060] “FIGS. 7 and 8 illustrate a permutation-based method 700 for calculating score updates Δs. Advantageously, the method 700 can be used with any type of prediction model 208. In FIG. 7, test data 704 associated with each of the p.sub.1 candidate features 102 in the first bucket 110(1) are inputted to a trained prediction model 708. Target data 702 associated with the target feature f.sub.T are also inputted to the prediction model 708, which outputs a first performance measure 710 that quantifies how well the prediction model 208 can recreate the target data 702. The test data 704 and target data 702 may be the same training data 206 used to train the prediction model 208 (e.g., as shown in FIG. 2). 
Alternatively, the test data 704 and target data 702 may be obtained from a holdout data set, as is commonly used for cross-validation.” The scores are used to demonstrate how each feature contributes to the model see [0028] “Each candidate feature f.sub.i has a corresponding score s.sub.i.sup.(0) (where 1≤i≤n.sub.f) that quantifies the relevance, or predictive ability, of the candidate feature f.sub.i with regards to a prediction model”) a feature selection process of selecting, on the basis of the degree of contribution of the each of the plurality of features, at least one feature from among the plurality of features; ([0041] “In the block 428, one or more highest-ranked features of the updated bucket ranking 418 are outputted. The scores of these highest-ranked features may also be outputted. For example, a highest-ranked updated bucket may be outputted. However, additional highest-ranked updated buckets (e.g., a second-highest updated bucket, a third-highest updated bucket, etc.) may also be outputted without departing from the scope hereof. A subset, or portion, of an updated bucket may also be outputted in the block 428.”) a prediction model generation process of generating a new prediction model which, upon receiving input of a value of the at least one feature selected, outputs a prediction result. ([0065] “FIG. 9 is a flow chart of a method 900 for constructing a multivariate prediction model 918.” And [0075] “In embodiments, the method 900 includes using, after training, the multivariate model 908 to generate a prediction. In these embodiments, the method 900 may include receiving a data object to input to the multivariate model 908. The method 900 may then output the prediction.”) Regarding claim 2, Kormilitsin teaches all of claim 1 as outlined above. Kormilitsin further teaches: the at least one feature selected is more than one feature, the at least one processor carries out the contribution degree calculation process, the feature selection process, and the prediction model generation process again with use of the new prediction model as a prediction model to be tested. ([0110] “the feature-selection method may further include repeating the training, updating, sorting, and partitioning with the truncated bucket ranking as the initial bucket ranking.”) Regarding claim 3, Kormilitsin teaches all of claim 1 as outlined above. Kormilitsin further teaches: the contribution degree calculation process, the degree of contribution of the each of the plurality of features is calculated on the basis of a difference between (i) an evaluation value of the prediction model to be tested corresponding to a case in which the value of the each of the plurality of features is changed in the test data set and (ii) an evaluation value of the prediction model to be tested corresponding to a case in which the value of the each of the plurality of features is not changed in the test data set. ([0062] “The performance measures 710 and 810(1) may be compared to determine the score update Δs.sub.1. For example, the score update Δs.sub.1 may be selected to be the difference between the performance measures 710 and 810(1). In general, the more relevant a feature, the greater the difference and therefore the greater the score update Δs.” Here 710 is a metric to evaluate model performance (see [0060]) and 810 is the same as 710 but a given features impact has been excluded (see [0061])). Regarding claim 4, Kormilitsin teaches all of claim 3 as outlined above. 
Kormilitsin further teaches: in the contribution degree calculation process, the value of the each of the plurality of features is changed such that values of the each of the plurality of features are randomly replaced with each other among a plurality of pieces of data included in the test data set. ([0061] “The test data 704(1) may be randomized by replacing each data point therein with a randomly-generated value. Alternatively, the test data 704(1) may be randomized by randomly permuting the data points.”) Regarding claim 5, Kormilitsin teaches all of claim 1 as outlined above. Kormilitsin further teaches: in the contribution degree calculation process, a plurality of the degrees of contribution are calculated for the each of the plurality of features with use of a plurality of the test data sets; ([0060] “In FIG. 7, test data 704 associated with each of the p.sub.1 candidate features 102 in the first bucket 110(1) are inputted to a trained prediction model 708.” Each feature has a different test data set). in the feature selection process, a feature with respect to which a statistic obtained from the plurality of the degrees of contribution satisfies a predetermined condition is selected. ([0088] “For each pruning step, the mean μ and standard deviation σ was calculated from the candidate ranking. All candidate features with a score less than μ−ασ were then removed (or truncated) from the ranking, where α is a pruning parameter.”) Regarding claim 6, Kormilitsin teaches all of claim 5 as outlined above. Kormilitsin further teaches: in the feature selection process, the following condition is applied as the predetermined condition: a value obtained by subtracting a standard deviation of the plurality of the degrees of contribution from an average value of the plurality of the degrees of contribution is not less than a threshold. ([0088] “For each pruning step, the mean μ and standard deviation σ was calculated from the candidate ranking. All candidate features with a score less than μ−ασ were then removed (or truncated) from the ranking, where α is a pruning parameter.”) Regarding claim 7, Kormilitsin teaches all of claim 1 as outlined above. Kormilitsin further teaches: A prediction apparatus which uses the new prediction model generated by the prediction model generation apparatus recited in claim 1 ([0075] “In embodiments, the method 900 includes using, after training, the multivariate model 908 to generate a prediction.”) the prediction apparatus comprising at least one processor that carries out ([0077] “The feature-selection system 1300 is a computing device having a processor 1302”) a feature value calculation process of calculating, on the basis of information obtained from a target of prediction, a value of the at least one feature selected; ([0077] “In these embodiments, the method 900 may include receiving a data object to input to the multivariate model 908. The method 900 may then output the prediction. In other embodiments, the method 900 includes outputting, after training, the multivariate model 918, which may include a list of which candidate features are used as inputs for the multivariate model 918 (i.e., the candidate features of the top-features ranking R″). For example, the multivariate model 918 may be transmitted to another computer system that uses the multivariate model 918 for prediction and classification. Other data generated by the method 900 (e.g., single-target rankings, final candidate scores, accuracy against a hold-out data set, etc.)
may also be outputted with the multivariate model 918.”) a prediction process of inputting, to the new prediction model, the calculated value of the at least one feature to thereby output a prediction result related to the target of prediction. ([0077] “In these embodiments, the method 900 may include receiving a data object to input to the multivariate model 908. The method 900 may then output the prediction. In other embodiments, the method 900 includes outputting, after training, the multivariate model 918, which may include a list of which candidate features are used as inputs for the multivariate model 918 (i.e., the candidate features of the top-features ranking R″). For example, the multivariate model 918 may be transmitted to another computer system that uses the multivariate model 918 for prediction and classification. Other data generated by the method 900 (e.g., single-target rankings, final candidate scores, accuracy against a hold-out data set, etc.) may also be outputted with the multivariate model 918.”) Regarding claim 8, A prediction model generation method, comprising: ([0006] “The present embodiments include systems and methods that rank a set of candidate features to identify those that are most valuable for constructing a predictor or model.”) calculating, with use of a test data set different from a training data set used in training of a prediction model to be tested, a degree of contribution of each of a plurality of features to a prediction result, a value of the each of the plurality of features being inputted to the prediction model to be tested; ([0060] “FIGS. 7 and 8 illustrate a permutation-based method 700 for calculating score updates Δs. Advantageously, the method 700 can be used with any type of prediction model 208. In FIG. 7, test data 704 associated with each of the p.sub.1 candidate features 102 in the first bucket 110(1) are inputted to a trained prediction model 708. Target data 702 associated with the target feature f.sub.T are also inputted to the prediction model 708, which outputs a first performance measure 710 that quantifies how well the prediction model 208 can recreate the target data 702. The test data 704 and target data 702 may be the same training data 206 used to train the prediction model 208 (e.g., as shown in FIG. 2). Alternatively, the test data 704 and target data 702 may be obtained from a holdout data set, as is commonly used for cross-validation.” The scores are used to demonstrate how each feature contributes to the model see [0028] “Each candidate feature f.sub.i has a corresponding score s.sub.i.sup.(0) (where 1≤i≤n.sub.f) that quantifies the relevance, or predictive ability, of the candidate feature f.sub.i with regards to a prediction model”) selecting, on the basis of the degree of contribution of the each of the plurality of features, at least one feature from among the plurality of features; ([0041] “In the block 428, one or more highest-ranked features of the updated bucket ranking 418 are outputted. The scores of these highest-ranked features may also be outputted. For example, a highest-ranked updated bucket may be outputted. However, additional highest-ranked updated buckets (e.g., a second-highest updated bucket, a third-highest updated bucket, etc.) may also be outputted without departing from the scope hereof. 
A subset, or portion, of an updated bucket may also be outputted in the block 428.”) generating a new prediction model which, upon receiving input of a value of the at least one feature selected, outputs a prediction result, the calculating, the selecting and the generating each being carried out by a computer. ([0065] “FIG. 9 is a flow chart of a method 900 for constructing a multivariate prediction model 918.” And [0075] “In embodiments, the method 900 includes using, after training, the multivariate model 908 to generate a prediction. In these embodiments, the method 900 may include receiving a data object to input to the multivariate model 908. The method 900 may then output the prediction.”) Regarding claim 9, Kormilitsin teaches all of claim 1 as outlined above. Kormilitsin further teaches: A prediction method carried out by a computer with use of the new prediction model generated by the prediction model generation apparatus recited in claim 1, the prediction method comprising: ([0075] “In embodiments, the method 900 includes using, after training, the multivariate model 908 to generate a prediction.”) calculating, on the basis of information obtained from a target of prediction, a value of the at least one feature selected; and ([0077] “In these embodiments, the method 900 may include receiving a data object to input to the multivariate model 908. The method 900 may then output the prediction. In other embodiments, the method 900 includes outputting, after training, the multivariate model 918, which may include a list of which candidate features are used as inputs for the multivariate model 918 (i.e., the candidate features of the top-features ranking R″). For example, the multivariate model 918 may be transmitted to another computer system that uses the multivariate model 918 for prediction and classification. Other data generated by the method 900 (e.g., single-target rankings, final candidate scores, accuracy against a hold-out data set, etc.) may also be outputted with the multivariate model 918.”) inputting, to the new prediction model, the calculated value of the at least one feature to thereby output a prediction result related to the target of prediction. ([0077] “In these embodiments, the method 900 may include receiving a data object to input to the multivariate model 908. The method 900 may then output the prediction. In other embodiments, the method 900 includes outputting, after training, the multivariate model 918, which may include a list of which candidate features are used as inputs for the multivariate model 918 (i.e., the candidate features of the top-features ranking R″). For example, the multivariate model 918 may be transmitted to another computer system that uses the multivariate model 918 for prediction and classification. Other data generated by the method 900 (e.g., single-target rankings, final candidate scores, accuracy against a hold-out data set, etc.) may also be outputted with the multivariate model 918.”) Regarding claim 10, A non-transitory storage medium storing therein a program for causing a computer to carry out: ([0080] “The memory 1308 stores machine-readable instructions 1312 that, when executed by the processor 1302, control the feature-selection system 1300 to implement the functionality and methods described herein. 
The memory 1308 also stores data 1314 used by the processor 1302 when executing the machine-readable instructions 1312.”) a contribution degree calculation process of calculating, with use of a test data set different from a training data set used in training of a prediction model to be tested, a degree of contribution of each of a plurality of features to a prediction result, a value of the each of the plurality of features being inputted to the prediction model to be tested; ([0060] “FIGS. 7 and 8 illustrate a permutation-based method 700 for calculating score updates Δs. Advantageously, the method 700 can be used with any type of prediction model 208. In FIG. 7, test data 704 associated with each of the p.sub.1 candidate features 102 in the first bucket 110(1) are inputted to a trained prediction model 708. Target data 702 associated with the target feature f.sub.T are also inputted to the prediction model 708, which outputs a first performance measure 710 that quantifies how well the prediction model 208 can recreate the target data 702. The test data 704 and target data 702 may be the same training data 206 used to train the prediction model 208 (e.g., as shown in FIG. 2). Alternatively, the test data 704 and target data 702 may be obtained from a holdout data set, as is commonly used for cross-validation.” The scores are used to demonstrate how each feature contributes to the model see [0028] “Each candidate feature f.sub.i has a corresponding score s.sub.i.sup.(0) (where 1≤i≤n.sub.f) that quantifies the relevance, or predictive ability, of the candidate feature f.sub.i with regards to a prediction model”) a feature selection process of selecting, on the basis of the degree of contribution of the each of the plurality of features, at least one feature from among the plurality of features; ([0041] “In the block 428, one or more highest-ranked features of the updated bucket ranking 418 are outputted. The scores of these highest-ranked features may also be outputted. For example, a highest-ranked updated bucket may be outputted. However, additional highest-ranked updated buckets (e.g., a second-highest updated bucket, a third-highest updated bucket, etc.) may also be outputted without departing from the scope hereof. A subset, or portion, of an updated bucket may also be outputted in the block 428.”) a prediction model generation process of generating a new prediction model which, upon receiving input of a value of the at least one feature selected, outputs a prediction result. ([0065] “FIG. 9 is a flow chart of a method 900 for constructing a multivariate prediction model 918.” And [0075] “In embodiments, the method 900 includes using, after training, the multivariate model 908 to generate a prediction. In these embodiments, the method 900 may include receiving a data object to input to the multivariate model 908. The method 900 may then output the prediction.”) Regarding claim 11, Kormilitsin teaches all of claim 1 as outlined above. 
Kormilitsin further teaches: A non-transitory storage medium storing therein a program for causing a computer to function with use of the new prediction model generated by the prediction model generation apparatus recited in claim 1, the program causing the computer to carry out: ([0075] “In embodiments, the method 900 includes using, after training, the multivariate model 908 to generate a prediction.”) a feature value calculation process of calculating, on the basis of information obtained from a target of prediction, a value of the at least one feature selected; ([0077] “In these embodiments, the method 900 may include receiving a data object to input to the multivariate model 908. The method 900 may then output the prediction. In other embodiments, the method 900 includes outputting, after training, the multivariate model 918, which may include a list of which candidate features are used as inputs for the multivariate model 918 (i.e., the candidate features of the top-features ranking R″). For example, the multivariate model 918 may be transmitted to another computer system that uses the multivariate model 918 for prediction and classification. Other data generated by the method 900 (e.g., single-target rankings, final candidate scores, accuracy against a hold-out data set, etc.) may also be outputted with the multivariate model 918.”) a prediction process of inputting, to the new prediction model, the calculated value of the at least one feature to thereby output a prediction result related to the target of prediction. ([0077] “In these embodiments, the method 900 may include receiving a data object to input to the multivariate model 908. The method 900 may then output the prediction. In other embodiments, the method 900 includes outputting, after training, the multivariate model 918, which may include a list of which candidate features are used as inputs for the multivariate model 918 (i.e., the candidate features of the top-features ranking R″). For example, the multivariate model 918 may be transmitted to another computer system that uses the multivariate model 918 for prediction and classification. Other data generated by the method 900 (e.g., single-target rankings, final candidate scores, accuracy against a hold-out data set, etc.) may also be outputted with the multivariate model 918.”) Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANIEL PATRICK GRUSZKA whose telephone number is (571)272-5259. The examiner can normally be reached M-F 9:00 AM - 6:00 PM ET. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Li Zhen can be reached at (571) 272-3768. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. 
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /DANIEL GRUSZKA/Examiner, Art Unit 2121 /Li B. Zhen/Supervisory Patent Examiner, Art Unit 2121
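The §102 mapping of claims 5 and 6 rests on Kormilitsin's paragraph [0088], which truncates a candidate-feature ranking at a cutoff of the mean score minus α times the standard deviation of the scores. A minimal sketch of that pruning rule as quoted; the function name and example scores are illustrative assumptions, not drawn from the reference:

```python
# Sketch of the mu - alpha*sigma pruning rule quoted from Kormilitsin [0088];
# names and example values are illustrative.
import numpy as np


def prune_ranking(scores, alpha=1.0):
    """Keep indices of candidate features whose score is not less than
    mean(scores) - alpha * std(scores)."""
    scores = np.asarray(scores, dtype=float)
    cutoff = scores.mean() - alpha * scores.std()
    return np.flatnonzero(scores >= cutoff)


scores = [0.42, 0.35, 0.31, 0.05, 0.02]
print(prune_ranking(scores))  # [0 1 2]: the two lowest-scoring features are truncated
```

Note that in this rule the mean and standard deviation are computed across the scores of the different candidate features in the ranking, whereas claim 6 computes them across a plurality of contribution degrees obtained for a single feature from a plurality of test data sets.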

Prosecution Timeline

May 25, 2023
Application Filed
Feb 24, 2026
Non-Final Rejection — §101, §102 (current)

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: Favorable
Median Time to Grant: 3y 3m
PTA Risk: Low
Based on 0 resolved cases by this examiner. Grant probability derived from career allow rate.
