Prosecution Insights
Last updated: April 19, 2026
Application No. 17/465,343

Multi-Phase Training Techniques for Machine Learning Models Using Weighted Training Data

Status: Final Rejection §101
Filed: Sep 02, 2021
Examiner: BEAN, GRIFFIN TANNER
Art Unit: 2121
Tech Center: 2100 — Computer Architecture & Software
Assignee: PayPal Inc.
OA Round: 4 (Final)
Grant Probability: 21% (At Risk)
Expected OA Rounds: 5-6
Time to Grant: 4y 4m
With Interview: 50%

Examiner Intelligence

Career Allow Rate: 21% (4 granted / 19 resolved; -33.9% vs TC avg). Grants only 21% of cases.
Interview Lift: +28.4% among resolved cases with interview. Strong interview lift.
Avg Prosecution: 4y 4m typical timeline; 45 applications currently pending.
Total Applications: 64 across all art units (career history).

Statute-Specific Performance

§101: 37.7% (-2.3% vs TC avg)
§103: 40.4% (+0.4% vs TC avg)
§102: 11.2% (-28.8% vs TC avg)
§112: 9.7% (-30.3% vs TC avg)
Deltas are measured against an estimated Tech Center average • Based on career data from 19 resolved cases
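As a quick sanity check on the figures above, subtracting each statute's reported delta from its rate should recover a single, consistent Tech Center average. A minimal sketch (figures copied from the panel above; variable names are ours):

```python
# Statute-specific rates and deltas vs. the Tech Center average, as reported above.
examiner_rates = {"101": 37.7, "103": 40.4, "102": 11.2, "112": 9.7}
deltas_vs_tc = {"101": -2.3, "103": 0.4, "102": -28.8, "112": -30.3}

# Each statute's implied Tech Center average is rate - delta.
implied_tc_avgs = {
    statute: round(rate - deltas_vs_tc[statute], 1)
    for statute, rate in examiner_rates.items()
}
print(implied_tc_avgs)  # each statute implies the same ~40.0% TC average
```

All four statutes imply the same ~40.0% estimate, which suggests the deltas were computed against one pooled Tech Center figure rather than per-statute averages.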

Office Action

§101
DETAILED ACTION

This Action is responsive to Claims filed 09/30/2025.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of the Claims

Claims 1, 3, 5, 11, and 16-17 have been amended. Claims 1-20 are pending.

Response to Arguments

Applicant's arguments filed 09/30/2025 regarding the 35 U.S.C. 101 Rejection of Claims 1-20 have been fully considered but they are not persuasive. The Applicant argues the newly amended limitations recite patent-eligible subject matter. The Examiner respectfully disagrees with the Applicant.

Taken as a whole, the independent claims recite a series of abstract idea mental process steps, or a generic computer performing generic computing functions in association with the abstract idea mental process steps, directly resulting in the alleged improvement to the functioning of a computer or other technical field, including the limitations cited by the Applicant on Page 14 of the Arguments. The first “training…” step, as presently drafted, is mere pre-solution activity, given that the pretraining of the model is recited highly generally. The subsequent “generating…”, “performing…”, “performing…”, “normalizing…”, “generating…”, and “generating…” steps are all practically performed within the human mind or with the aid of pen and paper. There is no specific structure or implementation, save for the recitation of a generic computer, precluding the aforementioned limitations from being performed mentally. The newly amended “determining…” and “updating…” steps, performed as a result of the preceding “generating…” step, also lack specific structure or implementation, save for the recitation of a generic computer, precluding the aforementioned limitations from being performed mentally.
Therefore, the “second training…” step, itself defined by a series of abstract idea mental process steps, is merely instructions to apply those abstract ideas. The Examiner points the Applicant to MPEP 2106(I), 2106.04(I), and 2106.05(b)(I) regarding the eligibility of claims made of otherwise ineligible steps or algorithms. The Examiner also contends that, given the “training…” (or second training) steps of the model are recited to such a generic degree, under the broadest reasonable interpretation of the claims, the alleged improvement is a result of the series of abstract idea mental process steps, and per MPEP 2106.05(a), the improvement cannot come from the abstract idea. See the updated 35 U.S.C. 101 Rejection below.

Claim Rejections - 35 USC § 101

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more; and because the claims as a whole, considering all claim elements both individually and in combination, do not amount to significantly more than the abstract idea. See Alice Corporation Pty. Ltd. v. CLS Bank International, et al., 573 U.S. (2014). In determining whether the claims are subject matter eligible, the Examiner applies the 2019 USPTO Patent Eligibility Guidelines (2019 Revised Patent Subject Matter Eligibility Guidance, 84 Fed. Reg. 50, Jan. 7, 2019).

Step 1: Claims 1-10 recite a method for training an initial version of a machine learning classification model, which falls under the statutory category of a process. Claims 11-15 recite a non-transitory, computer-readable medium having instructions stored thereon that are executable by a computer system for training an initial version of a machine learning classification model, which falls under the statutory category of a manufacture.
Claims 16-20 recite a system comprising: at least one processor; a non-transitory, computer-readable medium having instructions stored thereon that are executable by the at least one processor to cause the system to train a machine learning classification model, which falls under the statutory category of a machine.

Step 2A – Prong 1: Claim 1 recites an abstract idea, law of nature, or natural phenomenon. The limitations of “performing, by the computer system, one or more transformations based on the plurality of model scores to generate, for the plurality of training samples, a corresponding plurality of weighting values…”, “performing a logarithmic function on the first corresponding model score to generate a first logarithmic value;”, “normalizing the first logarithmic value based on a highest one and a lowest one of a plurality of logarithmic values generated based on the plurality of model scores;”, “and generating a first weighting value for the first training sample based on the normalized first logarithmic value;”, “determining, using the plurality of weighting values, loss values for the plurality of model scores generated using the initial version of the machine learning classification model;”, and “and updating, based on the loss values and according to a second, lower learning rate, one or more parameters of the initial version of the machine learning classification model.”, under the broadest reasonable interpretation, cover a mental process including an observation, evaluation, judgment or opinion that could be performed in the human mind or with the aid of pencil and paper. These limitations therefore fall within the mental process group.

Step 2A – Prong 2: The additional elements of claim 1 do not integrate the abstract idea into a practical application. The additional elements “a computer system” and “transformations” recited in claim 1 are recognized as generic computer components recited at a high level of generality.
Although the computer system has and executes instructions to perform the abstract idea itself, this also does not serve to integrate the abstract idea into a practical application, as it merely amounts to instructions to "apply it" (see MPEP 2106.04(d)(2), indicating mere instructions to apply an abstract idea do not amount to integrating the abstract idea into a practical application). The additional elements recited in the limitations “training phase”, “a machine learning classification model”, “a training dataset”, “training samples”, “model scores”, “a plurality of classes”, and “weighting values” are recognized as non-generic computer components; however, they are found to generally link the abstract idea to a particular technological environment or field of use (see MPEP 2106.05(h)). The additional elements recited in the limitation “and generating, by the computer system based on the training dataset, an updated version of the machine learning classification model, including, during a second training phase:” merely amount to instructions to "apply” the preceding abstract idea mental process steps (see MPEP 2106.04(f), indicating mere instructions to apply an abstract idea do not amount to integrating the abstract idea into a practical application).
The additional elements recited in the limitations “training, by a computer system, in a first training phase using a first learning rate, an initial version of a machine learning classification model based on a training dataset, wherein, during the first training phase, equal weight is applied to a plurality of training samples in the training dataset;” and “generating, by the computer system using the initial version of the machine learning classification model, a plurality of model scores corresponding to the plurality of training samples in the training dataset, wherein, for a given one of the plurality of training samples, a corresponding given model score from the initial version of the machine learning classification model indicates a probability that the given training sample belongs to a particular one of a plurality of classes;” are found to be mere pre- or post-solution activity such as data output or data transmittal (see MPEP 2106.05(g)).

Step 2B: The only limitations on the performance of the described method are the recitations of “a computer system” and “transformations”. These elements are insufficient to transform a judicial exception into a patentable invention because the recited elements are considered insignificant extra-solution activity (a generic computer system and processing resources that link the judicial exception to a particular, respective, technological environment). The claim thus recites computing components only at a high level of generality, such that it amounts to no more than mere instructions to apply the exception using generic computer components; mere instructions to apply an exception using a generic computer component cannot provide an inventive concept (see MPEP 2106.05(f)).
The additional elements recited in the limitations “training phase”, “a machine learning classification model”, “a training dataset”, “training samples”, “model scores”, “a plurality of classes”, and “weighting values” are recognized as non-generic computer components; however, they are found to generally link the abstract idea to a particular technological environment or field of use (see MPEP 2106.05(h)). The additional element recited in the limitation “and generating, by the computer system based on the training dataset, an updated version of the machine learning classification model, including, during a second training phase:” merely amounts to instructions to "apply it" (see MPEP 2106.04(f), indicating mere instructions to apply an abstract idea do not amount to integrating the abstract idea into a practical application). The additional elements recited in the limitations “training, by a computer system, in a first training phase using a first learning rate, an initial version of a machine learning classification model based on a training dataset, wherein, during the first training phase, equal weight is applied to a plurality of training samples in the training dataset;” and “generating, by the computer system using the initial version of the machine learning classification model, a plurality of model scores corresponding to the plurality of training samples in the training dataset, wherein, for a given one of the plurality of training samples, a corresponding given model score from the initial version of the machine learning classification model indicates a probability that the given training sample belongs to a particular one of a plurality of classes;” are found to be well-understood, routine, or conventional activity but for the recitation of generic computer components or additional elements generally linking the abstract idea to a particular technology or field of use (see the WURC examples at MPEP 2106.05(d)(II)(i)).
Taken alone or in ordered combination, these additional elements do not amount to significantly more than the above-identified abstract idea. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely provide conventional computer implementation. For the reasons above, claim 1 is rejected as being directed to non-patentable subject matter under §101. This rejection applies equally to independent claims 11 and 16.

Claim 11 recites “A non-transitory, computer-readable medium having instructions stored thereon that are executable by a computer system to perform operations comprising…” (generic computer components); “…performing a first training phase using a first learning rate to generate an initial version of a machine learning classification model, wherein, during the first training phase, equal weighting is applied to a plurality of training samples in a training dataset;” (extra-solution activity); “generating, for the plurality of training samples, a corresponding plurality of weighting values…” (abstract idea, mental process); “…wherein, for a given one of the plurality of training samples, generating a corresponding weighting value includes: generating a model score for the given training sample using the initial version of the machine learning classification model;” (abstract idea, mental process); “…and generating the corresponding weighting value, for the given training sample, based on the model score;” (abstract idea, mental process); “performing a logarithmic function…” (abstract idea, mental process); “normalizing the first logarithmic value…” (abstract idea, mental process); “generating a first weighting value…” (abstract idea, mental process); “…and based on the training dataset, performing a second training phase to generate an updated version of the machine learning classification model, including by: using values for one or more parameters of the initial version of the machine learning classification model as initial values for one or more parameters of the updated version of the machine learning classification model;” (instructions to apply); and “applying an optimization algorithm to the plurality of weighting values; and modifying, based on output of the optimization algorithm and according to a second learning rate, the initial values for the one or more parameters of the updated version of the machine learning classification model;” (abstract idea, mental process). The additional elements found in claim 1 and repeated in claim 11 are recognized as non-generic computer components; however, they are found to generally link the abstract idea to a particular technological environment or field of use (see MPEP 2106.05(h)).

Claim 16 recites “A system, comprising: at least one processor; a non-transitory, computer-readable medium having instructions stored thereon that are executable by the at least one processor to cause the system to:” (generic computer components); “…access information corresponding to an initial version of a machine learning classification model that was trained, during an initial training phase using a first learning rate, with equal weighting applied to a plurality of training samples in a training dataset;” (data transmittal or manipulation); “…generate, for the plurality of training samples, a plurality of model scores using the initial version of the machine learning classification model, wherein, for a given one of the plurality of training samples, a corresponding model score indicates a probability that the given training sample corresponds to a particular one of a plurality of classes;” (abstract idea, mental process); “…based on the plurality of model scores, determine a plurality of weighting values corresponding to the plurality of training samples…” (abstract idea, mental process); “performing a logarithmic function…” (abstract idea, mental process); “normalizing the first logarithmic value…” (abstract idea, mental process); “generating a first weighting value…” (abstract idea, mental process); “…and generate, based on the training dataset, an updated version of the machine learning classification model…” (instructions to apply); and “determining, based on the plurality of weight values, loss values for the plurality of model scores generated using the initial version of the machine learning classification model; and updating, based on the loss values and according to a second, lower learning rate, one or more parameters of the initial version of the machine learning classification model.” (abstract idea mental process steps).

Dependent Claims: Claim 2 recites a mental process abstract idea step (“…generated such that a first training sample with a first model score is given a higher weighting value than a second training sample with a second, lower model score.”); assigning weight values in an order is practically performed within the human mind or with the aid of pencil and paper. Claim 3 (claim 17) recites the abstract ideas “applying an optimization algorithm to modify one or more parameters of the machine learning classification model, wherein the optimization algorithm uses a particular loss function to evaluate a performance of the machine learning classification model for a given one of the plurality of training samples,” (mathematical relationship or calculation) and “for the given training sample, a corresponding loss value generated using the particular loss function is weighted based on a given weighting value associated with the given training sample.” (mental process). Claim 4 merely recites a refinement of a loss function additional element. The additional elements recited in claim 4 have been found to generally link the abstract idea to a particular technological environment or field of use (see MPEP 2106.05(h)). Claim 5 recites refinements to the “updating…” mental process step of Claim 1.
Claim 6 (claim 14 and claim 18) recites the abstract idea “performing one or more of a logit transformation function or a Box-Cox transformation” (mathematical calculations). Claim 7 recites refinements to the additional elements. Claim 8 (claim 15) recites refinements to the additional elements. Claim 9 (claim 15 and claim 19) recites refinements to the additional elements. Claim 10 (claim 20) recites data transmittal and extra-solution activity steps (“receiving, by the computer system, an authorization request corresponding to a second electronic transaction, wherein the authorization request specifies one or more attributes associated with the second electronic transaction;” and “applying, by the computer system, information corresponding to the one or more attributes associated with the second electronic transaction as input to the updated version of the machine learning classification model to generate a predicted classification for the second electronic transaction;”) as well as an abstract idea, mental process step (“determining, by the computer system, whether to authorize the second electronic transaction based on the predicted classification.”). Claim 12 recites refinements to the additional elements and abstract idea mental process steps (“a corresponding loss value generated using the particular loss function is weighted based on a given weighting value associated with the given training sample.”). Claim 13 recites refinements to the additional elements and abstract idea mental process steps (“and wherein the corresponding plurality of weighting values are generated such that a first training sample with a first model score is given a higher weighting value than a second training sample with a second, lower model score.”).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a).
Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to GRIFFIN T BEAN, whose telephone number is (703) 756-1473. The examiner can normally be reached M - F, 7:30 - 4:30. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Li Zhen, can be reached at (571) 272-3768. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/GRIFFIN TANNER BEAN/
Examiner, Art Unit 2121

/Li B. Zhen/
Supervisory Patent Examiner, Art Unit 2121
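For readers untangling the claim language at issue, the weighting scheme the Examiner characterizes as a mental process can be sketched in a few lines: log-transform the phase-1 model scores, min-max normalize across the batch, then weight each sample's loss during the second, lower-learning-rate phase. This is a minimal illustration of only the recited steps; the function names, the binary cross-entropy loss, and the concrete learning rates are our assumptions, not taken from the application.

```python
import math

def weighting_values(model_scores):
    """Recited weighting: apply a logarithmic function to each phase-1 model
    score, then normalize each log value by the batch's highest and lowest."""
    logs = [math.log(s) for s in model_scores]
    lo, hi = min(logs), max(logs)
    return [(v - lo) / (hi - lo) for v in logs]

def weighted_loss(model_scores, labels, weights):
    """Per-sample loss (binary cross-entropy assumed) scaled by the
    corresponding weighting value, per the 'determining ... loss values' step."""
    per_sample = [
        -w * (y * math.log(p) + (1 - y) * math.log(1 - p))
        for p, y, w in zip(model_scores, labels, weights)
    ]
    return sum(per_sample) / len(per_sample)

# Phase-1 model scores for four hypothetical training samples.
scores = [0.92, 0.55, 0.08, 0.71]
labels = [1, 1, 0, 1]

weights = weighting_values(scores)  # higher score -> higher weight (cf. claim 2)
loss = weighted_loss(scores, labels, weights)

# Phase 2 would then update the phase-1 parameters using this loss at a
# "second, lower learning rate", e.g. 1e-3 after a 1e-2 first phase (assumed).
```

Note that the claimed second phase reuses the phase-1 parameters as initial values and only lowers the learning rate; the weighting values change nothing except how much each sample contributes to the loss.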

Prosecution Timeline

Sep 02, 2021: Application Filed
Oct 19, 2024: Non-Final Rejection — §101
Jan 06, 2025: Interview Requested
Jan 16, 2025: Examiner Interview Summary
Jan 16, 2025: Applicant Interview (Telephonic)
Jan 21, 2025: Response Filed
Apr 17, 2025: Final Rejection — §101
May 28, 2025: Interview Requested
Jun 11, 2025: Applicant Interview (Telephonic)
Jun 11, 2025: Examiner Interview Summary
Jun 18, 2025: Request for Continued Examination
Jun 23, 2025: Response after Non-Final Action
Jun 24, 2025: Non-Final Rejection — §101
Sep 12, 2025: Interview Requested
Sep 22, 2025: Examiner Interview Summary
Sep 22, 2025: Applicant Interview (Telephonic)
Sep 30, 2025: Response Filed
Jan 12, 2026: Final Rejection — §101 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12424302
ACCELERATED MOLECULAR DYNAMICS SIMULATION METHOD ON A QUANTUM-CLASSICAL HYBRID COMPUTING SYSTEM
Granted Sep 23, 2025 (2y 5m to grant)
Patent 12314861
SYSTEMS AND METHODS FOR SEMI-SUPERVISED LEARNING WITH CONTRASTIVE GRAPH REGULARIZATION
Granted May 27, 2025 (2y 5m to grant)
Patent 12261947
LEARNING SYSTEM, LEARNING METHOD, AND COMPUTER PROGRAM PRODUCT
Granted Mar 25, 2025 (2y 5m to grant)
Study what changed to get past this examiner. Based on 3 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 21%
With Interview: 50% (+28.4%)
Median Time to Grant: 4y 4m
PTA Risk: High
Based on 19 resolved cases by this examiner. Grant probability derived from career allow rate.
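The headline figures follow from the examiner's career record by simple arithmetic. A sketch of the stated derivation (the dashboard's 50% with-interview figure presumably reflects rounding or a directly measured with-interview rate rather than baseline plus lift; variable names are ours):

```python
granted, resolved = 4, 19          # career record shown in the panels above
career_allow = granted / resolved  # baseline grant probability, shown as 21%

interview_lift_pts = 28.4          # percentage-point lift reported for interviews
with_interview = career_allow * 100 + interview_lift_pts  # ~49.5, shown as 50%

print(f"baseline {career_allow:.0%}, with interview ~{with_interview:.0f}%")
```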

Free tier: 3 strategy analyses per month