Prosecution Insights
Last updated: April 19, 2026
Application No. 17/410,387

USING DATA REDUCTION TO ACCELERATE MACHINE LEARNING FOR NETWORKING

Non-Final OA: §101, §103, §112
Filed: Aug 24, 2021
Examiner: HEFFINGTON, JOHN M
Art Unit: 2145
Tech Center: 2100 — Computer Architecture & Software
Assignee: Microsoft Technology Licensing, LLC
OA Round: 3 (Non-Final)
Grant Probability: 40% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 5y 6m
With Interview: 70%

Examiner Intelligence

Career Allow Rate: 40% (grants 40% of resolved cases; 172 granted / 429 resolved; -14.9% vs TC avg)
Interview Lift: +30.0% (strong lift; resolved cases with an interview vs. without)
Avg Prosecution: 5y 6m typical timeline; 42 applications currently pending
Total Applications: 471 across all art units (career history)

Statute-Specific Performance

§101: 10.2% (-29.8% vs TC avg)
§103: 64.1% (+24.1% vs TC avg)
§102: 16.1% (-23.9% vs TC avg)
§112: 6.4% (-33.6% vs TC avg)
"vs TC avg" figures compare against a Tech Center average estimate • Based on career data from 429 resolved cases
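
A quick way to read these rows: subtracting each delta from the examiner's rate backs out the Tech Center baseline, and here the implied baseline is 40.0% for every statute, which suggests the deltas are computed against a single estimate rather than per-statute averages. A minimal sketch of that arithmetic, using only the figures shown above:

```python
# Back out the implied Tech Center baseline from each row: baseline = rate - delta.
rows = {
    "101": (10.2, -29.8),
    "103": (64.1, +24.1),
    "102": (16.1, -23.9),
    "112": (6.4, -33.6),
}
for statute, (rate, delta) in rows.items():
    print(f"§{statute}: implied TC baseline = {round(rate - delta, 1)}%")  # 40.0% in every case
```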

Office Action

Rejections: §101, §103, §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This action is in response to the Request for Continued Examination filed 10/9/2025. Claims 1, 11, and 18 have been amended. Claim 10 has been canceled. Claims 1-9 and 11-21 are pending. Claims 1, 11, and 18 are independent claims.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 10/9/2025 has been entered.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: "Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title."

Claims 1-9 and 11-17 are rejected under 35 U.S.C. 101 as directed to an abstract idea without significantly more. With respect to independent claims 1 and 11, claim 1 specifically recites "a method for reducing the volume of input data for machine learning exploration for computer networking related problems; obtaining a network topology; selecting a structured search of a plurality of reduction functions based on a machine learning task; identifying a subset of reduction functions in response to the structured search, wherein the subset of reduction functions identifies a minimum subset of data that includes information used by a machine learning model in learning an outcome of interest for a machine learning task at a predetermined level of accuracy; generating transformed data by applying the subset of reduction functions to the input data, wherein the subset of reduction functions and an order of applying the subset of reduction function changes based on the machine learning task; receiving input data related to a network; performing a structured search of a plurality of reduction functions based on a grammar to identify a subset of reduction functions, wherein the grammar is based on the network topology and other domain knowledge; determining whether the transformed data achieves a threshold (data collection), wherein the threshold is a minimum acceptable accuracy for a given computer networking related problem; continuing the structured search by identifying a second set of reduction functions that comply with the rules of the grammar in response to the transformed data exceeding the threshold; generating the transformed data by applying the second set of reduction functions to the input data; returning to a previous transformation of the data if the transformed data does not exceed the threshold; wherein the previous transformation use a previous set of reduction functions that exceed the threshold; outputting the transformed data in response to the transformed data exceeding the threshold, wherein the transformed data is a transformation of the input data resulting in a smaller size than the input data; and training, using the transformed data, the machine learning model for the machine learning task, wherein the transformed data minimizes a volume of training data used in training the machine learning model at the predetermined level of accuracy."

These limitations could be reasonably and practically performed by the human mind: reviewing a message, a user can determine an emotional level and, based on observation of the review, determine if/how the message should be sent; these are observation/evaluation steps. Furthermore, the added limitations of selecting a structured search and identifying a subset of reduction functions are steps that can be performed by a user. The steps of identifying a minimum subset of data used by a machine learning model for a learning task and generating transformed data are likewise steps that can be performed mentally. Accordingly, the claim recites a mental process, which can be done utilizing pen and paper; the claim therefore recites an abstract idea.

This judicial exception is not integrated into a practical application. At step 2A, prong two, claims 1 and 11 recite the additional elements of "instructions stored in memory; one or more processors; receiving input data related to a network; outputting the transformed data in response to the transformed data exceeding the threshold" at such a high degree of generality that they represent no more than mere instructions to apply the judicial exception on a computer. These limitations can also be viewed as nothing more than an attempt to generally link the use of the judicial exception to the technological environment of a computer. It should be noted that because the courts have made it clear that mere physicality or tangibility of an additional element or elements is not a relevant consideration in the eligibility analysis, the physical nature of these computer components does not affect this analysis. See MPEP 2106.05(f) and 2106.05(h)(iv). Furthermore, the claims recite the additional elements of "network topology; reduction functions; transformed data; and domain knowledge," an insignificant extra-solution activity of obtaining, analyzing, and displaying results to the user. Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.

At step 2B, claims 1 and 11 do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As explained with respect to Step 2A Prong Two, the "instructions stored in memory; one or more processors; receiving input data related to a network; outputting the transformed data in response to the transformed data exceeding the threshold" are at best the equivalent of merely adding the words "apply it" to the judicial exception. Mere instructions to apply an exception cannot provide an inventive concept. Under the 2019 PEG, however, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B. 2019 PEG Section III(B), 84 Fed. Reg. at 56. At Step 2B, the evaluation of the insignificant extra-solution activity consideration takes into account whether or not the extra-solution activity is well-understood, routine, and conventional. See MPEP 2106.05(g) and 2106.05(d)(II). Here, the recited "instructions stored in memory; one or more processors; receiving input data related to a network; outputting the transformed data in response to the transformed data exceeding the threshold" are recited at a high level of generality and are well-understood, routine, and conventional.
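
For readers parsing the long claim recitation above, the claimed method reads as a grammar-guided search over reduction functions with an accuracy-gated backtracking step. The sketch below is one minimal illustration of that reading, not the application's actual implementation; every name in it (ReductionFn, allowed, evaluate_accuracy, and so on) is a hypothetical stand-in.

```python
# Hypothetical sketch of the claimed grammar-guided data-reduction search.
# All names are illustrative assumptions, not the application's actual code.
from typing import Callable, Sequence

ReductionFn = Callable[[list], list]   # a reduction maps input data to smaller data

def structured_search(
    input_data: list,
    candidates: Sequence[ReductionFn],                 # the "plurality of reduction functions"
    allowed: Callable[[Sequence[ReductionFn]], bool],  # grammar rules (topology/domain knowledge)
    evaluate_accuracy: Callable[[list], float],        # emulated ML-model accuracy on reduced data
    threshold: float,                                  # minimum acceptable accuracy
) -> list:
    kept: list[ReductionFn] = []   # last set of reductions that passed the threshold
    kept_data = input_data
    for fn in candidates:
        trial = kept + [fn]
        if not allowed(trial):     # grammar forbids this combination; skip it
            continue
        data = input_data
        for f in trial:            # apply the reductions in order
            data = f(data)
        if evaluate_accuracy(data) >= threshold:
            kept, kept_data = trial, data   # continue the search from this set
        # otherwise: "return to the previous transformation" (kept / kept_data)
    return kept_data               # transformed data, smaller than the input
```

On this framing, the eligibility dispute above turns on whether the accuracy gate (an emulated model evaluation) and the downstream training step can practically be performed mentally.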
And the data gathering, manipulation, and providing recitations "network topology," "reduction functions," "transformed data," and "domain knowledge" are insignificant extra-solution activity because they are well-known, routine, and conventional activity reciting mere data gathering and providing the manipulated data (see MPEP 2106.05(d)(II)(i)). Claim 11 is similarly rejected. The amendments to Claims 1 and 11 are simply additional steps similar to steps already claimed and have been determined to be capable of being performed by a human; therefore, the amendments do not add significantly more or integrate the claims into a practical application.

Claims 2-5, 7, and 9 are dependent claims and do not recite any additional elements that would amount to significantly more than the abstract idea. Specifically:

Claim 2, with respect to step 2A prong 1: "combining the input data or combining different reduction functions of the plurality of reduction functions" recites the abstract idea of mental steps (observation and evaluation); a person can choose or determine algorithms based on search goals.

Claim 3, with respect to step 2A prong 2: "wherein the subset of reduction functions satisfy the one or more rules of the grammar" recites additional elements of insignificant extra-solution activity. With respect to step 2B, the recited insignificant extra-solution activity is recited at a high level of generality and is well-understood, routine, and conventional as taught by the prior art of record.

Claim 4 ("wherein identifying the subset of reduction functions further includes: selecting at least two reduction functions from the plurality of reduction functions; determining whether the one or more rules of the grammar allow combining the at least two reduction functions; if the one or more rules are satisfied, adding the at least two reduction functions to the subset of reduction functions") recites mental steps.

Claim 5 ("receiving a search budget… performing the structured search…") recites mental steps and math.

Claim 7 ("if the transformed data is below the threshold, applying additional reduction functions to the transformed data until the transformed data exceeds the threshold") recites mental steps and math.

Claim 9 ("wherein the network topology includes a structure of the network and network dependencies") recites a mental step.

Claims 11-15 and 17 are similar to claims 6 and 8 and are likewise deficient. Claim 16 can be performed mentally and with math.

Claim 21 ("wherein the structured search emulates the machine learning model applying different combinations of reduction functions and selects the subset of reduction functions in response to evaluating an accuracy of the emulation") does not add significantly more, since it is well known that a search applies filters (different combinations of reduction functions) and returns filtered results (selecting the subset of reduction functions).

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C.
112 (pre-AIA), second paragraph: The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1 and 11 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention. Claim 1 recites the limitation "the rules of the grammar"; there is insufficient antecedent basis for this limitation in the claim. Claim 11 recites the limitation "the rules of the grammar"; there is insufficient antecedent basis for this limitation in the claim.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: "A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made."

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-17 are rejected under 35 U.S.C. 103 as being unpatentable over Jezewski (2022/0414492) in view of Greene (2019/0245754) and further in view of Matthews et al. (11,328,222). To encapsulate the prior art: Greene teaches a network monitoring system that receives one or more inputs related to computer network configuration, a term synonymous with network topology. Jezewski teaches applying various methods to structures such as "mapping functions between components input-output connecting sequences/networks (Page 1, Paragraphs 42-43)" to solve problems with automation workflow.
Regarding Claim 1, Jezewski teaches a method for reducing the volume of input data for machine learning exploration for computer networking related problems (Page 3, Paragraph 117), comprising: selecting a structured search of a plurality of reduction functions based on a machine learning task; performing the structured search of the plurality of reduction functions based on a grammar (Page 11, Paragraph 378); and wherein the transformed data is a transformation of the input data resulting in a smaller size than the input data (Page 3, Paragraphs 122-123).

However, Jezewski fails to teach wherein the grammar is based on the network topology and other domain knowledge; Jezewski teaches finding 'reduce difference' functions (Page 11, Paragraph 378), but not based on network topology and other domain knowledge. Jezewski teaches generating transformed data by applying the subset of reduction functions to the input data (Page 3, Paragraph 117); determining whether the transformed data achieves a threshold, wherein the threshold is a minimum acceptable accuracy; continuing the structured search by identifying a second set of reduction functions that comply with the rules of the grammar in response to the transformed data exceeding the threshold; and generating the transformed data by applying the second set of reduction functions to the input data: applying problem-solution interaction structures to solution automation workflows (P 0317); optimizing interactions between problem/solution pairs, where the problem is the difference between them that should be reduced (P 0319); using 'reduce the solution' (P 0323) as a primary interface query intent (P 0324); identifying a solution (P 0374) by identifying structures to reduce the solution set to search for specific problems (P 0375); generating alternative solution automation workflow insight paths with an interface query to find aligning functions like 'reduce differences' (P 0378); and sorting the problems into a sequence that would solve the problems (P 0173).

Moreover, Jezewski fails to teach a given computer networking related problem. Jezewski does teach finding and testing for a data accuracy threshold (Page 26, Paragraphs 897-901), but not for a computer networking problem (Page 52, Paragraphs 1873-1874 & 1879-1881). Jezewski teaches returning to a previous transformation of the data if the transformed data does not exceed the threshold, wherein the previous transformation uses a previous set of reduction functions that exceed the threshold: generate search strategies (P 1393); use patterns of prior searches with similar inputs (P 1400); use patterns of previous searches for similar topics, keywords, or intents (P 1401). Jezewski also teaches outputting the transformed data in response to the transformed data exceeding the threshold (Page 26, Paragraphs 897-901).

Jezewski does not teach identifying a subset of reduction functions in response to the structured search, wherein the subset of reduction functions identifies a minimum subset of data that includes information used by a machine learning model in learning an outcome of interest for a machine learning task at a predetermined level of accuracy; wherein the subset of reduction functions and an order of applying the subset of reduction functions changes based on the machine learning task; or training, using the transformed data, the machine learning model for the machine learning task, wherein the transformed data minimizes a volume of training data used in training the machine learning model at the predetermined level of accuracy.
Jezewski teaches all elements of the current invention as stated above except receiving input data related to a network and obtaining a network topology. Jezewski does not teach wherein the grammar is based on network topology and other domain knowledge, nor does Jezewski teach a given computer networking related problem. However, Greene teaches receiving input data related to a network (Page 1, Paragraph 6) and obtaining a network topology (Page 1, Paragraph 6). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Jezewski to incorporate the teachings of Greene to provide the input obtaining a network topology. Doing so would have enabled the reduction of input data based on domain knowledge.

Jezewski does not teach identifying a subset of reduction functions in response to the structured search, wherein the subset of reduction functions identifies a minimum subset of data that includes information used by a machine learning model in learning an outcome of interest for a machine learning task at a predetermined level of accuracy; wherein the subset of reduction functions and an order of applying the subset of reduction functions changes based on the machine learning task; or training, using the transformed data, the machine learning model for the machine learning task, wherein the transformed data minimizes a volume of training data used in training the machine learning model at the predetermined level of accuracy. However, Jezewski teaches that the 'reduce' interaction function of the workflow reduces the relevant network problem inputs (variables) to the 'important variable' structures, or reduces them to a 'category prediction' structure (Page 3, Paragraph 122), and that reduce/expand the solution is a subset/superset of the problem variables (Page 17, Paragraph 0587). Greene teaches that by grouping related features into common training domains, the computational requirements of the NMLM 120 system are reduced while still providing the necessary level of granularity to identify individual features that require remediation (Page 5, Paragraph 49), mapping to training, using the transformed data, the machine learning model for the machine learning task, wherein the transformed data minimizes a volume of training data used in training the machine learning model at the predetermined level of accuracy.

Matthews teaches that specific reduction operations are identified (Column 9, Lines 20-25); that complex collective actions comprise multiple sub-actions to be performed on the compute data and the order in which sub-actions are performed (Column 10, Lines 3-8); and that configuration information is accessed indicating what type of reduction operation should be performed on the new compute data set, based on a data type identifier specified in the compute data (Column 49, Lines 45-57), mapping to identifying a subset of reduction functions in response to the structured search, wherein the subset of reduction functions identifies a minimum subset of data that includes information used by a machine learning model in learning an outcome of interest for a machine learning task at a predetermined level of accuracy. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Greene to incorporate the teachings of Matthews to provide a more efficient system for computationally intense applications (Matthews: Column 1, Lines 39-43) by applying a reduction phase (Column 2, Lines 50-58).
Regarding Claim 2, Jezewski teaches combining the input data or combining different reduction functions of the plurality of reduction functions (Page 11, Paragraph 378).

Regarding Claim 3, Jezewski teaches wherein the subset of reduction functions satisfy the one or more rules of the grammar (Page 11, Paragraph 378).

Regarding Claim 4, Jezewski teaches wherein identifying the subset of reduction functions further includes (Page 11, Paragraph 373): selecting at least two reduction functions from the plurality of reduction functions; determining whether the one or more rules of the grammar allow combining the at least two reduction functions; if the one or more rules are satisfied, adding the at least two reduction functions to the subset of reduction functions; and if the one or more rules are not satisfied, selecting different reduction functions for the subset of reduction functions (Page 11, Paragraph 373).

Regarding Claim 5, Jezewski teaches receiving a search budget that provides constraints on a time for performing the structured search or bandwidth limits for performing the structured search, and performing the structured search within the search budget (Page 11, Paragraph 378).

Regarding Claim 6, Jezewski teaches wherein the threshold is a baseline level of accuracy of a machine learning model using the transformed data (Page 26, Paragraphs 897-901).

Regarding Claim 7, Jezewski teaches, if the transformed data is below the threshold, applying additional reduction functions to the transformed data until the transformed data exceeds the threshold (Page 26, Paragraphs 897-901).

Regarding Claim 8, Jezewski teaches wherein an auto machine learning model determines whether the transformed data exceeds the threshold by emulating an application of a machine learning model to the transformed data (Page 26, Paragraphs 897-901).

Regarding Claim 9, Jezewski teaches wherein the network topology includes a structure of the network and network dependencies (Page 1, Paragraph 6).

Regarding Claim 10, Jezewski teaches wherein the transformed data is used in training a machine learning model for the machine learning task (Page 52, Paragraphs 1873-1874 & 1879-1881).

Regarding Claim 11, the claim recites a data reduction engine similar to the method of Claim 1 and is rejected with the same rationale.

Regarding Claim 12, Jezewski teaches a data reduction engine wherein the grammar includes one or more rules for combining the input data or combining different reduction functions of the plurality of reduction functions (Page 11, Paragraph 373).

Regarding Claim 13, Jezewski teaches a data reduction engine wherein the subset of reduction functions satisfy the one or more rules of the grammar (Page 11, Paragraph 373).

Regarding Claim 14, Jezewski teaches a data reduction engine wherein the one or more processors are further operable to receive a search budget that provides constraints on a time for performing the structured search or bandwidth limits for performing the structured search, and to perform the structured search within the search budget (Page 11, Paragraph 378).
Regarding Claim 15, Jezewski teaches a data reduction engine wherein the threshold is a baseline level of accuracy of a machine learning model using the transformed data, and an auto machine learning model determines whether the transformed data exceeds the threshold by emulating an application of the machine learning model to the transformed data (Page 26, Paragraphs 897-901).

Regarding Claim 16, Jezewski teaches a data reduction engine wherein the one or more processors are further operable to apply additional reduction functions to the transformed data until the transformed data exceeds the threshold, if the transformed data is below the threshold (Page 26, Paragraphs 897-901 & Page 54, Paragraph 1926).

Regarding Claim 17, Greene teaches a data reduction engine wherein the network topology includes a structure of the network and network dependencies (Page 1, Paragraph 6).

Regarding Claim 21, Jezewski teaches wherein the structured search (Page 11, Paragraphs 0376-378) emulates the machine learning model applying different combinations of reduction functions (Page 11, Paragraph 377); Matthews teaches selecting the subset of reduction functions (Column 9, Lines 20-25; Column 49, Lines 45-57); and Jezewski teaches doing so in response to evaluating an accuracy of the emulation (Page 26, Paragraphs 897-901). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Jezewski to incorporate the teachings of Matthews to provide a more efficient system for computationally intense applications (Matthews: Column 1, Lines 39-43) by applying a reduction phase (Column 2, Lines 50-58).

Claims 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Greene (2019/0245754) in view of Matthews et al. (11,328,222) and further in view of Jezewski (2022/0414492).

Regarding Claim 18, Greene teaches monitoring one or more target networks with various configurations, such as tree, spoke-and-hub, or ring configurations (Page 6, Paragraph 54 & Fig. 4), mapping to obtaining a network topology for a network, wherein the network topology provides network dependency rules for combining data. Greene teaches "network configuration" and a "structure configured to route data and information to and from the plurality of endpoints," which equate to network topology and dependency rules, respectively. Additionally, Greene teaches coordinating data transfers throughout, and reconfiguring the entire computer network to achieve a steady-state network configuration; features and raw data from each node in each subset of each network are obtained and correlated into a training domain (Page 6, Paragraphs 55, 57, 58), mapping to defining a set of rules for combining the data from different data sources within the network based on the network topology, and generating a grammar based on the set of rules. Greene teaches that network configuration information and network performance metrics are acquired using querying techniques (Page 3, Paragraph 37) and that a determination is made whether the data included in the response is in a format recognized by the feature gathering system (Page 7, Paragraph 0063), mapping to using the grammar in guiding a structured search.
Greene teaches a correlation/relationship module configured to utilize various feature selection and reduction techniques; by grouping related features into common training domains, the computational requirements of the NMLM system are reduced while still providing the necessary level of granularity (Page 5, Paragraph 49). Defining the first data set includes performing feature reduction analysis on the first set of features and the second set of features, where feature reduction analysis is a process in which the number of features under consideration is reduced by obtaining a set of principal features (Page 8, Paragraph 69). This maps to a plurality of reduction functions, and to the subset of reduction functions reducing the data for the machine learning task.

All elements of the current invention are taught as stated above except receiving input data related to a network and obtaining a network topology. Greene does not teach a search of a plurality of reduction functions based on a machine learning task or selecting a subset of reduction functions in response to the structured search, as disclosed in the claims. However, Matthews teaches that specific reduction operations are identified (Column 9, Lines 20-25) and that configuration information is accessed indicating what type of reduction operation should be performed on the new compute data set, based on a data type identifier specified in the compute data (Column 49, Lines 45-57). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Greene to incorporate the teachings of Matthews to provide a more efficient system for computationally intense applications (Matthews: Column 1, Lines 39-43) by applying a reduction phase (Column 2, Lines 50-58).

Greene does not teach wherein the structured search identifies a combination of reduction functions providing transformed data that maintains an acceptable accuracy for the machine learning task. However, Matthews teaches that specific reduction operations are identified (Column 9, Lines 20-25); that complex collective actions comprise multiple sub-actions to be performed on the compute data and the order in which sub-actions are performed (Column 10, Lines 3-8); and that configuration information is accessed indicating what type of reduction operation should be performed on the new compute data set, based on a data type identifier specified in the compute data (Column 49, Lines 45-57), mapping to identifying a subset of reduction functions in response to the structured search, wherein the subset of reduction functions identifies a minimum subset of data that includes information used by a machine learning model in learning an outcome of interest for a machine learning task at a predetermined level of accuracy. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Greene to incorporate the teachings of Matthews for the same reasons. Jezewski teaches wherein the structured search (Page 11, Paragraphs 0376-378) emulates the machine learning model applying different combinations of reduction functions (Page 11, Paragraph 377).
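
Claim 18's mapping above centers on generating a grammar from network dependency rules. As a concrete, purely hypothetical illustration of what one such topology-derived combination rule could look like (node names and the dependency convention are invented for the example):

```python
# Hypothetical sketch: deriving a data-combination rule (one piece of a "grammar")
# from a network topology's dependency edges. All names are illustrative.
topology = {                       # node -> nodes it depends on
    "edge-router": ["core-switch"],
    "core-switch": ["spine-1", "spine-2"],
}

def may_combine(a: str, b: str) -> bool:
    """Allow combining data from two sources only if one depends on the other."""
    return b in topology.get(a, []) or a in topology.get(b, [])

assert may_combine("edge-router", "core-switch")   # dependent pair: combinable
assert not may_combine("spine-1", "spine-2")       # siblings: not directly combinable
```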
Regarding Claim 19, Greene teaches "…machine learning techniques correlate and train the various networks and components that constitute the distributed computer network into a training domain…" (Paragraphs 57-58), which refers to a grammar that is globally defined.

Regarding Claim 20, Greene fails to explicitly teach that the grammar restricts use of reduction functions on the data by defining policies for combining the data or combining different reduction functions. However, Jezewski teaches a method wherein the grammar restricts use of reduction functions on the data by defining policies for combining the data or combining different reduction functions (Page 11, Paragraph 378). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Greene to incorporate the teachings of Jezewski to provide a grammar for the use of reduction functions. Doing so would have enabled the reduction of input data based on domain knowledge.

Response to Arguments

Applicant's arguments filed 9/10/2025 have been fully considered but they are not persuasive. The applicant argues:

The Office Action rejected claims 1-9 and 11-17 under 35 U.S.C. § 101 because the present claims are directed to an abstract idea without significantly more. The Applicant respectfully submits that the amended independent claims are directed towards statutory subject matter. The Office Action on page 3 alleges that the features of the independent claims "could be reasonably and practically performed by the human mind." The Applicant respectfully disagrees. For example, amended independent claim 1 recites "training, using the transformed data, the machine learning model for the machine learning task, wherein the transformed data minimizes a volume of training data used in training the machine learning model at the predetermined level of accuracy." The Applicant respectfully submits that training the machine learning model cannot practically be performed in the human mind. In addition, the Applicant submits that the amended independent claims recite elements that, both individually and as a whole, are directed to a practical application. For example, as described in paragraph [0023] of the Specification, "[o]ne technical advantage of the present disclosure is using the data reduction to reduce the volume and complexity of data used during the machine learning development workflow and speeding up the machine learning development workflow." Applicant submits that the claims, as a whole, are directed to a practical application. The claims provide an improvement to techniques for transforming "the initial input data while maintaining an acceptable accuracy for the machine learning models for the machine learning tasks." See paragraph [0024] of the Specification. For at least the above reasons, the Applicant respectfully submits that the amended independent claims are patent eligible and requests withdrawal of the 35 U.S.C. § 101 rejection of the claims.

The examiner respectfully disagrees. The amendments to Claims 1 and 11 are simply additional steps similar to steps already claimed and have been determined to be capable of being performed by a human; therefore, the amendments do not add significantly more or integrate the claims into a practical application.
While the transformed data may facilitate minimizing a volume of training data, the transformed data itself can be determined by a user. That is, the transformation is not required to be processed by an algorithm or computing device to transform the data. In order for the transformation to be more than an abstract idea, the transformation process would need to be described in a way that is beyond the capability of the human mind. Furthermore, even if the effect of the amended claims is to speed up the machine learning development workflow, the process, algorithm, or steps producing that effect would need to be claimed in such a way that they could not be performed by the human mind. That is, it is not enough to claim a process that can be performed by the human mind even if the output of that process has some identifiable advantage.

Claims 1 and 11 have been amended as follows: "continuing the structured search by identifying a second set of reduction functions that comply with the rules of the grammar in response to the transformed data exceeding the threshold; generating the transformed data by applying the second set of reduction functions to the input data; returning to a previous transformation of the data if the transformed data does not exceed the threshold; wherein the previous transformation use a previous set of reduction functions that exceed the threshold."

Jezewski teaches finding 'reduce difference' functions (Page 11, Paragraph 378), but not based on network topology and other domain knowledge; generating transformed data by applying the subset of reduction functions to the input data (Page 3, Paragraph 117); determining whether the transformed data achieves a threshold, wherein the threshold is a minimum acceptable accuracy; continuing the structured search by identifying a second set of reduction functions that comply with the rules of the grammar in response to the transformed data exceeding the threshold; and generating the transformed data by applying the second set of reduction functions to the input data: applying problem-solution interaction structures to solution automation workflows (P 0317); optimizing interactions between problem/solution pairs, where the problem is the difference between them that should be reduced (P 0319); using 'reduce the solution' (P 0323) as a primary interface query intent (P 0324); identifying a solution (P 0374) by identifying structures to reduce the solution set to search for specific problems (P 0375); generating alternative solution automation workflow insight paths with an interface query to find aligning functions like 'reduce differences' (P 0378); and sorting the problems into a sequence that would solve the problems (P 0173).

Jezewski does teach finding and testing for a data accuracy threshold (Page 26, Paragraphs 897-901), but not for a computer networking problem (Page 52, Paragraphs 1873-1874 & 1879-1881). Jezewski teaches returning to a previous transformation of the data if the transformed data does not exceed the threshold, wherein the previous transformation uses a previous set of reduction functions that exceed the threshold: generate search strategies (P 1393); use patterns of prior searches with similar inputs (P 1400); use patterns of previous searches for similar topics, keywords, or intents (P 1401). Jezewski also teaches outputting the transformed data in response to the transformed data exceeding the threshold (Page 26, Paragraphs 897-901).
Jezewski does not teach identifying a subset of reduction functions in response to the structured search, wherein the subset of reduction functions identifies a minimum subset of data that includes information used by a machine learning model in learning an outcome of interest for a machine learning task at a predetermined level of accuracy; wherein the subset of reduction functions and an order of applying the subset of reduction functions changes based on the machine learning task; or training, using the transformed data, the machine learning model for the machine learning task, wherein the transformed data minimizes a volume of training data used in training the machine learning model at the predetermined level of accuracy.

With respect to the amendments to Claim 18, Greene does not teach wherein the structured search identifies a combination of reduction functions providing transformed data that maintains an acceptable accuracy for the machine learning task. However, Matthews teaches that specific reduction operations are identified (Column 9, Lines 20-25); that complex collective actions comprise multiple sub-actions to be performed on the compute data and the order in which sub-actions are performed (Column 10, Lines 3-8); and that configuration information is accessed indicating what type of reduction operation should be performed on the new compute data set, based on a data type identifier specified in the compute data (Column 49, Lines 45-57), mapping to identifying a subset of reduction functions in response to the structured search, wherein the subset of reduction functions identifies a minimum subset of data that includes information used by a machine learning model in learning an outcome of interest for a machine learning task at a predetermined level of accuracy. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Greene to incorporate the teachings of Matthews to provide a more efficient system for computationally intense applications (Matthews: Column 1, Lines 39-43) by applying a reduction phase (Column 2, Lines 50-58). Jezewski teaches wherein the structured search (Page 11, Paragraphs 0376-378) emulates the machine learning model applying different combinations of reduction functions (Page 11, Paragraph 377).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOHN M HEFFINGTON, whose telephone number is (571) 270-1696. The examiner can normally be reached Monday through Friday, 9:30 am to 5:30 pm Eastern. Examiner interviews are available via a variety of formats; see MPEP § 713.01. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) form at https://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Cesar B Paula, can be reached at (571) 272-4128. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of an application may be obtained from Patent Center; status information for published applications is available to the public, while status information for unpublished applications is available through Patent Center to authorized users only.
Should you have questions about access to the USPTO patent electronic filing system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

/J.M.H/
Examiner, Art Unit 2145
1/10/2026

/CESAR B PAULA/
Supervisory Patent Examiner, Art Unit 2145

Prosecution Timeline

Aug 24, 2021
Application Filed
Jan 15, 2025
Non-Final Rejection — §101, §103, §112
Apr 10, 2025
Interview Requested
Apr 22, 2025
Response Filed
Apr 22, 2025
Applicant Interview (Telephonic)
Jun 23, 2025
Examiner Interview Summary
Jul 09, 2025
Final Rejection — §101, §103, §112
Aug 26, 2025
Interview Requested
Sep 10, 2025
Response after Non-Final Action
Oct 09, 2025
Request for Continued Examination
Oct 15, 2025
Response after Non-Final Action
Jan 10, 2026
Non-Final Rejection — §101, §103, §112
Mar 23, 2026
Interview Requested

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12554999
INLINE VALIDATION OF MACHINE LEARNING MODELS
2y 5m to grant; granted Feb 17, 2026
Patent 12455545
SYSTEM AND METHOD FOR SMART SELECTION AND BUILDING OF INDUSTRIAL AUTOMATION CONTROL SYSTEMS FROM INDUSTRIAL AUTOMATION CONTROL LIBRARIES AND OBJECTS
2y 5m to grant; granted Oct 28, 2025
Patent 12299541
MODEL INSIGHTS FRAMEWORK FOR PROVIDING INSIGHT BASED ON MODEL EVALUATIONS TO OPTIMIZE MACHINE LEARNING MODELS
2y 5m to grant; granted May 13, 2025
Patent 12277427
GRAPHICAL USER INTERFACES FOR EXPLORING AND INTERACTING WITH DISTRIBUTED SOFTWARE APPLICATIONS
2y 5m to grant; granted Apr 15, 2025
Patent 12124554
IMAGE RECOGNITION REVERSE TUNING TEST SYSTEM
2y 5m to grant; granted Oct 22, 2024
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 40%
With Interview: 70% (+30.0%)
Median Time to Grant: 5y 6m
PTA Risk: High
Based on 429 resolved cases by this examiner. Grant probability derived from career allow rate.
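
The note above says the grant probability is derived from the career allow rate; under that stated relationship, the projection reduces to simple arithmetic using figures from the Examiner Intelligence section:

```python
granted, resolved = 172, 429
base = granted / resolved        # career allow rate: ~0.401, displayed as 40%
lift = 0.30                      # stated +30.0% interview lift (percentage points)
print(f"base {base:.0%}, with interview {base + lift:.0%}")  # base 40%, with interview 70%
```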

Free tier: 3 strategy analyses per month