Prosecution Insights
Last updated: April 19, 2026
Application No. 17/388,919

AUTOMATICALLY REDUCING MACHINE LEARNING MODEL INPUTS

Final Rejection — §101, §112
Filed
Jul 29, 2021
Examiner
MENGISTU, TEWODROS E
Art Unit
2127
Tech Center
2100 — Computer Architecture & Software
Assignee
Capital One Services LLC
OA Round
4 (Final)
Grant Probability: 49% (Moderate)
Expected OA Rounds: 5-6
Time to Grant: 4y 5m
With Interview: 77%

Examiner Intelligence

Grants 49% of resolved cases
Career Allow Rate: 49% (62 granted / 127 resolved; -6.2% vs TC avg)
Interview Lift: +28.2% for resolved cases with an interview
Typical Timeline: 4y 5m avg prosecution; 34 applications currently pending
Career History: 161 total applications across all art units

Statute-Specific Performance

§101: 27.9% allowance (-12.1% vs TC avg)
§103: 44.5% allowance (+4.5% vs TC avg)
§102: 9.6% allowance (-30.4% vs TC avg)
§112: 14.7% allowance (-25.3% vs TC avg)
Based on career data from 127 resolved cases; Tech Center average estimates shown for comparison.

Office Action

§101 §112
Detailed Action

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA. Claims 1, 3-5, 7-8, 10-12, 14-15, 17-19, and 21-26 are pending for examination. Claims 1, 8, and 15 are independent.

Response to Amendment

This office action is responsive to the amendments filed on 10/20/2025. As directed by the amendments, claims 1, 3-5, 8, 10-12, 15, 17-19, and 21-26 are amended. Claims 2, 6, 9, 13, 16, and 20 are canceled.

Response to Arguments

Applicant's arguments filed 10/20/2025 have been fully considered but are not fully persuasive.

Applicant's arguments regarding 35 U.S.C. § 101: Applicant submits that the claimed subject matter is directed to a reduced-input machine learning model (MLM) that is less resource intensive and more versatile in the event of data drift, without the loss of performance that may result from reducing the input training data features. The recited implementation uses a distinct dataset to perform an integrated variant analysis, scores each input feature based on all the samples in that distinct dataset, and prunes the input data features that are of least importance to the performance of the MLM. Important features of the claimed subject matter correspond to a streamlined, efficient way of identifying the least important data features with respect to the performance of the model, and an efficient way of removing such input features from a large dataset comprising a very large number of features. The former is implemented by using a specific set of "withheld" samples (e.g., a second dataset) distinct from the training data, classifying each of these withheld (or out-of-sample) samples, and using an integrated variant analysis to rate each input feature for its contribution to classifying the individual withheld sample.
The contributions (associated with each of the input data features) from all withheld samples (e.g., the second dataset distinct from the first training dataset) may then be summed and normalized. The latter is implemented by calculating a probability distribution of the input features based on the probability that each feature is the most important feature for the classification model. This probability distribution is then used to rank (order) features by relative importance to the performance of the MLM. Accordingly, the claimed subject matter (e.g., as recited by independent claim 1) recites: […]

On page 4, the Non-Final Office Action ("NFOA") maintains that the claims are directed to an abstract idea of pruning a dataset based on a ranking of inputs generated through mathematical calculations and mental processes implemented with generic computing features. Applicant respectfully disagrees and submits that the claims (e.g., independent claim 1) recite a specific implementation for ranking input data features based on a relative importance parameter with respect to the performance of the MLM, the relative importance parameter being derived through a variance analysis of a backpropagation operation to generate a multimodal Gaussian distribution, the peaks of which are used as a measure of importance, with the subsequent ranking based on the proximity of the various input data features to those peaks. […]

Examiner response: Examiner respectfully disagrees. The claims describe reducing input to a machine learning model based on an importance ranking, which is directed to an abstract idea without significantly more. An improvement to the abstract idea itself is not considered an improvement in the functioning of a computer or an improvement to any other technology, and is not sufficient to integrate the abstract idea into a practical application.
The claimed limitations do not provide improvements to the functioning of a computer, or to any other technology or technical field, as described in MPEP 2106.05(a). The claim limitations addressed in the previous action have been almost entirely amended, and the 35 U.S.C. § 101 rejection has therefore been updated (see the § 101 rejection below). As stated in the rejection, various steps describe performing mental steps to determine importance and to select inputs to remove or reduce.

Applicant further argues: Applicant respectfully disagrees and submits that the claims recite a combination of technical features for implementing importance-based pruning of data while minimizing performance loss (e.g., determining the least important input data to prune by using a specific out-of-sample dataset, distinct from the training and testing datasets, to perform a backpropagation summation of node weight values, and using this parameter as an importance index to prune data around the peaks of a resulting distribution graph corresponding to importance values in relation to model inputs). Furthermore, the recited implementation recites a specific set of implementation steps that cannot reasonably be interpreted as a generalized mathematical algorithm and/or a series of mental steps. Applicant further submits that the claimed subject matter describes a performance-optimizing way of implementing a reduced MLM based on a specific implementation for determining the least important input data to prune. Applicant submits that the claimed system and method for implementing a high-performance reduced-input MLM is tantamount to an improvement to a technical field and is not an abstract idea or an improved abstract idea. […] Applicant respectfully submits that the above characterization cannot reasonably be applied to the claims as currently presented.
Although individual features of the claims, when viewed in isolation, may be interpreted as generic computer/mathematical functions, an unconventional and novel arrangement of generic computer/mathematical functions that results in performance optimization of a reduced-input MLM qualifies as eligible subject matter under § 101. Accordingly, the claimed subject matter amounts to an unconventional combination of computing/technical features that achieves a new and useful outcome. Further in support of subject-matter eligibility, Applicant refers to BASCOM Global Internet Servs., Inc. v. AT&T Mobility LLC, 827 F.3d 1341, 1347-48 (Fed. Cir. 2016), which held that an unconventional combination of known features (i.e., known computing and networking components and/or operations) that achieves a new and useful outcome is patent-eligible subject matter. Furthermore, Applicant submits that the claims recite computational features, such as using peak-value proximity in a multimodal Gaussian distribution that results from specific processing of the MLM using a distinct (out-of-sample) dataset for variant analysis and for determining the relative importance and ranking of input data features; such features cannot reasonably be performed in the human mind any more than the filtering operation in BASCOM could be interpreted as being performed by a human capable of filtering content based on specified individual user preferences.
With respect to the features corresponding to mathematical/computing operations recited in the claims (e.g., statistical data processing features such as generating a probability distribution graph for a plurality of generated importance parameters associated with data elements of a large dataset), Applicant submits that such features are integrated with the aforementioned computational steps in a specific arrangement to enable pruning of the least important data when implementing a reduced MLM in a streamlined and efficient manner. Even if the Examiner interprets pruning the least important data elements from a large dataset associated with an MLM as an abstract idea, the specifically recited optimization of such a process, based on an unconventional arrangement of technical features, would correspond to patent-eligible subject matter. Accordingly, Applicant submits that the claimed subject matter, as recited, for example, in the independent claims, constitutes eligible subject matter, and respectfully requests withdrawal of the rejection under 35 U.S.C. § 101.

Examiner response: Examiner respectfully disagrees. Applicant describes features not present within the claims, such as "perform backpropagation summation of node weight values and uses this parameter as an importance index to prune data around peaks of a resulting distribution graph corresponding to an importance value with relation to model inputs". Claim 1 does not recite node weight values or peaks; although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. Further, Applicant argues that the claims recite a specific set of implementation steps that cannot reasonably be interpreted as a generalized mathematical algorithm and/or a series of mental steps, without specifically pointing out how or which steps of the claims cannot be abstract ideas.
Applicant also broadly states that the claims are an improvement to a technical field and not an abstract idea. Examiner respectfully disagrees; the claimed limitations do not provide improvements to the functioning of a computer, or to any other technology or technical field, as described in MPEP 2106.05(a). Examiner also respectfully disagrees that "an unconventional and novel arrangement of generic computer/mathematical function, that results in performance optimization of a reduced-input MLM qualified as eligible subject-matter under 101". Being unconventional and having a novel arrangement does not, by itself, result in eligible subject matter under § 101. BASCOM describes an entirely different application, and it is unclear how Applicant's invention corresponds to the features described in BASCOM. Overall, the claim limitations are a combination of mental steps under Step 2A Prong 1 and additional elements under Steps 2A Prong 2 and 2B, as detailed in the updated § 101 rejection below.

Applicant's arguments regarding 35 U.S.C. § 112(a): Applicant's arguments, filed 10/20/2025, with respect to 35 U.S.C. § 112(a) have been fully considered and are persuasive. The 35 U.S.C. § 112(a) rejection has been withdrawn.

Applicant's arguments regarding 35 U.S.C. § 112(b): Applicant's arguments, filed 10/20/2025, with respect to 35 U.S.C. § 112(b) for claims 21, 23, and 25 have been fully considered and are persuasive. The 35 U.S.C. § 112(b) rejection for claims 21, 23, and 25 has been withdrawn.

Applicant's arguments regarding 35 U.S.C. § 103: Applicant's arguments, filed 10/20/2025, with respect to 35 U.S.C. § 103 have been fully considered and are persuasive. The 35 U.S.C. § 103 rejection has been withdrawn.

Claim Objections

Claim 1 is objected to because of the following informalities: Claim 1 recites "MLM" without spelling out what MLM stands for. Claims 8 and 15 recite similar informalities. Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C.
112(b): (b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph: The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1, 3-5, 7-8, 10-12, 14-15, 17-19, and 21-26 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 1 recites in lines 21-22 "applying the second dataset at an output of the original MLM". It is unclear how exactly a second dataset is applied at an output of a machine learning model. Paragraph 0034 of the specification states "apply a backpropagation technique at the output of MLM 112A using applied dataset 122 in order to assess the importance of each input of the MLM 112A", which describes applying backpropagation at an output, not applying the dataset at the output. Other sections of the specification (e.g., paragraph 0051) reiterate applying a dataset at an output but do not clarify how exactly a dataset is applied to the output of the model. For purposes of examination, Examiner interprets the limitation as applying a backpropagation technique at the output of the MLM after applying a second dataset. Independent claims 8 and 15 recite similar limitations and are also rejected under 112(b) for the same reasons as claim 1. Dependent claims 3-5, 7, 10-12, 14, 17-19, and 21-26 do not resolve the 112(b) rejection of independent claims 1, 8, and 15 and are also rejected under 112(b).

Claim 1 recites the limitation "the second dataset" in line 21.
There is insufficient antecedent basis for this limitation in the claim. Independent claims 8 and 15 recite similar limitations and are also rejected under 112(b) for the same reasons as claim 1. Dependent claims 3-5, 7, 10-12, 14, 17-19, and 21-26 do not resolve the 112(b) rejection of independent claims 1, 8, and 15 and are also rejected under 112(b).

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1, 3-5, 7-8, 10-12, 14-15, 17-19, and 21-26 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1: In the instant case, claims 1, 3-5, 7, 21-22, and 26 are directed to a method; claims 8, 10-12, 14, and 23-24 are directed to a non-transitory computer-readable medium; and claims 15, 17-19, and 25 are directed to an apparatus. Thus, each of the claims falls within one of the four statutory categories (i.e., process, machine, manufacture, or composition of matter).
Regarding Claim 1:

2A Prong 1: pruning an original MLM to generate a reduced input MLM, wherein pruning comprises: determining an importance value of each input of the plurality of inputs associated with the original MLM by applying the second dataset at an output of the original MLM and performing a backpropagation operation with respect to each input of the plurality of inputs (this step for determining importance values is practically implementable in the human mind and is understood to be a mental process, i.e., evaluation), the backpropagation operation comprising: summing values of each weight along a path associated with each input, by starting at an output of the original MLM and tracing backward to each of the plurality of inputs, wherein an importance value, for each input of the plurality of inputs, is determined based on a summation of weight values (this step for summing values is practically implementable in the human mind and is understood to be a mental process, i.e., evaluation); and performing a variant analysis by generating a relative importance ranking for each input value based on a distribution plot of importance values in relation to each of the plurality of inputs (this step for performing a variant analysis is practically implementable in the human mind and is understood to be a mental process, i.e., judgment/evaluation); reducing the number of inputs based on the relative importance ranking of each input value to generate a first reduced input MLM (this step for reducing inputs based on importance is practically implementable in the human mind and is understood to be a mental process, i.e., judgment/evaluation); and performing a second pruning of the reduced input MLM, if the size of the reduced input MLM is greater than the memory threshold of the target device, to generate a second reduced input MLM capable of being stored on the memory of the target device.
(This step for performing further pruning based on a memory threshold is practically implementable in the human mind and is understood to be a mental process, i.e., judgment/evaluation.)

2A Prong 2: This judicial exception is not integrated into a practical application. Additional elements: "A method, comprising, via at least one processor of a computing device" (the processor and computing device are understood to be generic computer equipment; see MPEP 2106.05(f)); and "capable of being stored on a memory of a target device, the original MLM corresponding to a fully trained MLM associated with a plurality of weights generated along a path associated with each of a plurality of inputs based on a first training dataset" (the specification of data to be stored is understood to be a field-of-use limitation that further specifies the MLM; see MPEP 2106.05(h)). These additional elements, alone or in combination, do not integrate the judicial exception into a practical application, as they are generic computer functions combined with a field of use and are used merely to perform the abstract idea identified above.

2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. The additional elements are the same processor/computing-device and data-storage limitations identified under 2A Prong 2 above (see MPEP 2106.05(f) and (h)).
The additional elements disclosed above, in combination with the abstract idea, are not sufficient to amount to significantly more than the judicial exception, as they are generic computer functions combined with a field of use and are used merely to perform the abstract idea identified above.

Regarding Claim 8: see the rejection of claim 1 above; the same rationale applies. 2A Prong 2 & 2B: The claim recites another additional element, "A non-transitory computer-readable storage medium, the computer-readable storage medium including instructions that when executed by a processor, cause the processor to:" (mere instructions to apply the exception using a generic computer component; see MPEP 2106.05(f)).

Regarding Claim 15: see the rejection of claim 1 above; the same rationale applies. 2A Prong 2 & 2B: The claim recites another additional element, "A computing apparatus comprising: a processor; and a memory storing instructions that, when executed by the processor, cause the processor to:" (mere instructions to apply the exception using a generic computer component; see MPEP 2106.05(f)).

Regarding Claims 3, 10, and 17: 2A Prong 1: wherein the distribution plot of importance values corresponds to a Gaussian distribution, wherein the Gaussian distribution comprises a multimodal Gaussian distribution that includes a plurality of peaks, determined based on an importance (I) plotted with respect to each of the plurality of inputs of the MLM, the importance based on the plurality of weights (this step is practically implementable in the human mind with the aid of pen and paper and is understood to be a mental process, i.e., evaluation). 2A Prong 2 & 2B: The claims do not recite any additional elements.
Regarding Claims 4, 11, and 18: 2A Prong 1: wherein reducing the number of inputs is based on importance thresholds set around the plurality of peaks of the Gaussian distribution (this step is practically implementable in the human mind and is understood to be a mental process, i.e., evaluation). 2A Prong 2 & 2B: The claims do not recite any additional elements.

Regarding Claims 5, 12, and 19: 2A Prong 1: wherein the variant analysis further comprises normalizing each summed weight (this step is practically implementable in the human mind and is understood to be a mental process, i.e., evaluation). 2A Prong 2 & 2B: The claims do not recite any additional elements.

Regarding Claims 7 and 14: 2A Prong 1: The claims do not recite any mental steps. 2A Prong 2 & 2B: wherein the memory threshold comprises a memory capacity of the target device (the specification of data to be stored is understood to be a field-of-use limitation that further specifies the memory threshold; see MPEP 2106.05(h)).

Regarding Claims 21, 23, and 25: 2A Prong 1: further comprising, (this step for determining gaps and removing inputs is practically implementable in the human mind and is understood to be a mental process, i.e., judgment). 2A Prong 2 & 2B: via the at least one processor (the processor is understood to be generic computer equipment; see MPEP 2106.05(f)).

Regarding Claims 22, 24, and 26: 2A Prong 1: wherein the Gaussian distribution comprises a single peak determined based on an importance (I) plotted with respect to each of the plurality of inputs of the MLM, the importance based on the plurality of weights, the single peak representing an input of the plurality of inputs with the highest importance (I), wherein the at least one input is removed via eliminating at least one input following a threshold set after the single peak.
(This step for determining the Gaussian distribution and removing inputs is practically implementable in the human mind and is understood to be a mental process, i.e., evaluation.) 2A Prong 2 & 2B: The claims do not recite any additional elements.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: Yakovlev et al. (US 20200327448 A1) describes ranking features based on importance.

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension-of-time policy set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to TEWODROS E MENGISTU, whose telephone number is (571) 270-7714. The examiner can normally be reached Mon-Fri, 9:30-5:30. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, ABDULLAH KAWSAR, can be reached at (571) 270-3169.
The fax number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/TEWODROS E MENGISTU/
Examiner, Art Unit 2127
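For readers trying to follow the claim language at issue, the pruning flow the Office Action paraphrases (sum the weight values along each path from the output back to an input, normalize the per-input totals into an importance distribution, rank, and drop the lowest-ranked inputs) can be sketched roughly as below. This is only an illustrative approximation of the recited steps on a toy two-layer network; the function names, the absolute-value treatment of weights, and the network shape are assumptions, not taken from the application.

```python
# Illustrative sketch of importance-based input pruning: trace every
# output->hidden->input path, sum the weight values along each path,
# normalize the per-input totals, rank, and keep the most important inputs.
# All names and the toy 2-layer network are hypothetical, not from the filing.

def input_importance(w_in, w_out):
    """w_in[i][h]: weight from input i to hidden unit h; w_out[h]: hidden h to output."""
    raw = []
    for row in w_in:
        # Sum the weight values along every path associated with this input.
        total = sum(abs(w_ih) + abs(w_out[h]) for h, w_ih in enumerate(row))
        raw.append(total)
    norm = sum(raw) or 1.0
    return [v / norm for v in raw]  # normalized importance per input

def prune_inputs(w_in, w_out, keep):
    """Return indices of the `keep` highest-importance inputs."""
    scores = input_importance(w_in, w_out)
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return sorted(ranked[:keep])

w_in = [[0.9, 0.8], [0.1, 0.05], [0.7, 0.6]]  # 3 inputs, 2 hidden units
w_out = [0.5, 0.4]
print(prune_inputs(w_in, w_out, keep=2))       # input 1 has the smallest weights
```

Note how this maps onto the §101 dispute: each step is a short arithmetic pass over the weights, which is why the examiner characterizes the limitations as mental steps, while the applicant points to the distinct out-of-sample dataset and the peak-based thresholding as the technical arrangement.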

Prosecution Timeline

Jul 29, 2021
Application Filed
Jan 12, 2024
Non-Final Rejection — §101, §112
Mar 27, 2024
Interview Requested
Apr 12, 2024
Examiner Interview Summary
Apr 12, 2024
Applicant Interview (Telephonic)
Apr 17, 2024
Response Filed
Jun 20, 2024
Final Rejection — §101, §112
Sep 12, 2024
Response after Non-Final Action
Sep 18, 2024
Applicant Interview (Telephonic)
Sep 18, 2024
Response after Non-Final Action
Sep 25, 2024
Request for Continued Examination
Oct 07, 2024
Response after Non-Final Action
Apr 15, 2025
Non-Final Rejection — §101, §112
Oct 20, 2025
Response Filed
Jan 16, 2026
Final Rejection — §101, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12566817: AUTOMATIC MACHINE LEARNING MODEL EVALUATION (granted Mar 03, 2026; 2y 5m to grant)
Patent 12482032: Selective Data Rejection for Computationally Efficient Distributed Analytics Platform (granted Nov 25, 2025; 2y 5m to grant)
Patent 12450465: NEURAL NETWORK SYSTEM, NEURAL NETWORK METHOD, AND PROGRAM (granted Oct 21, 2025; 2y 5m to grant)
Patent 12400252: ARTIFICIAL INTELLIGENCE BASED TRANSACTIONS CONTEXTUALIZATION PLATFORM (granted Aug 26, 2025; 2y 5m to grant)
Patent 12380369: HYPERPARAMETER TUNING IN AUTOREGRESSIVE INTEGRATED MOVING AVERAGE (ARIMA) MODELS (granted Aug 05, 2025; 2y 5m to grant)
Study what changed to get these applications past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 49%
With Interview: 77% (+28.2%)
Median Time to Grant: 4y 5m
PTA Risk: High
Based on 127 resolved cases by this examiner. Grant probability is derived from the career allow rate.
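The headline figures above appear to be simple arithmetic on the examiner's career record; a minimal sketch, assuming the grant probability is just the career allow rate (62/127) and the with-interview figure adds the observed +28.2% lift:

```python
# Assumed derivation of the dashboard's projection figures (the page only
# states that grant probability is "derived from career allow rate").
granted, resolved = 62, 127
allow_rate = granted / resolved          # career allow rate ~= 0.488
interview_lift = 0.282                   # observed lift with an interview
with_interview = allow_rate + interview_lift
print(f"{allow_rate:.0%} base, {with_interview:.0%} with interview")
```

The printed percentages round to the 49% and 77% shown on the page, consistent with a plain additive model; the actual tool may weight art-unit or statute-specific data differently.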
