Prosecution Insights
Last updated: April 19, 2026
Application No. 17/691,825

GENERATION SUPPORT APPARATUS, GENERATION SUPPORT METHOD, AND GENERATION SUPPORT PROGRAM

Non-Final OA (§101, §103)

Filed: Mar 10, 2022
Examiner: MEYER, JACQUELINE CHRISTINE
Art Unit: 2144
Tech Center: 2100 — Computer Architecture & Software
Assignee: Hitachi, Ltd.
OA Round: 3 (Non-Final)
Grant Probability: 62% (Moderate)
Expected OA Rounds: 3-4
Expected Time to Grant: 4y 3m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 62% (grants 62% of resolved cases; 8 granted / 13 resolved; +6.5% vs TC avg)
Interview Lift: +67.5% (strong), comparing resolved cases with an interview against those without
Avg Prosecution: 4y 3m (typical timeline); 24 applications currently pending
Total Applications: 37 across all art units (career history)
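The headline figures above are internally consistent with simple counting. A minimal sketch of how they could be derived (variable names and the rounding convention are illustrative assumptions; the dashboard's exact methodology is not stated here):

```python
# Reconstructing the headline examiner stats from the raw counts shown
# above. Field names and rounding are illustrative assumptions.
granted = 8      # granted cases among resolved
resolved = 13    # total resolved cases
pending = 24     # currently pending applications

career_allow_rate = granted / resolved        # 8/13 = 0.615...
total_applications = resolved + pending       # 13 + 24

print(f"Career Allow Rate: {career_allow_rate:.0%}")   # -> 62%
print(f"Total Applications: {total_applications}")     # -> 37
```

8/13 rounds to the displayed 62%, and resolved plus pending accounts for all 37 career applications.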

Statute-Specific Performance

§101: 30.1% (-9.9% vs TC avg)
§103: 44.5% (+4.5% vs TC avg)
§102: 9.9% (-30.1% vs TC avg)
§112: 12.7% (-27.3% vs TC avg)

Deltas are relative to the Tech Center average estimate. Based on career data from 13 resolved cases.
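All four deltas above are consistent with a single Tech Center average estimate of 40.0% per statute (an inference from the numbers shown, not a stated methodology). A quick check, assuming each delta is simply the rate minus that average:

```python
# Verifying the "vs TC avg" deltas: each statute's rate minus an assumed
# 40.0% Tech Center average reproduces the delta displayed above.
TC_AVG = 40.0  # assumed Tech Center average estimate, in percent

rates = {"101": 30.1, "103": 44.5, "102": 9.9, "112": 12.7}
for statute, rate in rates.items():
    print(f"§{statute}: {rate}% ({rate - TC_AVG:+.1f}% vs TC avg)")
```

Each printed delta matches the dashboard figure to one decimal place.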

Office Action

Rejections: §101, §103
DETAILED ACTION

This nonfinal office action is responsive to the amendment filed on November 7, 2025. Claims 1-2 and 4-9 are pending. Claims 1, 8, and 9 are independent.

Claim rejections under 35 USC §101 are maintained. See the sections Claim Rejections - 35 USC §101 and Response to Arguments below. Claim rejections under 35 USC §103 are maintained. See the sections Claim Rejections - 35 USC §103 and Response to Arguments below.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on November 7, 2025 has been entered.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim 9 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claim does not fall within at least one of the four categories of patent eligible subject matter because it recites a program, which would include software, and software does not fall under one of the statutory categories for patent eligibility (Microsoft Corp. v. AT&T Corp., 550 U.S. 437, 449, 82 USPQ2d 1400, 1407); see MPEP 2106.03.

Claims 1-2 and 4-9 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Regarding claim 1:

Claim 1 recites an apparatus and therefore falls within the statutory category of a machine. The claim also recites adding change history to the component group, recording a version of an updated component group, and generating second configuration information including a component corresponding to the version of the updated component group. The foregoing can practically be performed in the human mind. For instance, a person is capable of mentally keeping track of changes; keeping a running total of the money spent on groceries as items are added to the cart is, in effect, adding change history to the component group. Likewise, a person is capable of mentally keeping track of revisions and a version number for each revision, as well as, with the aid of pencil and paper, writing (generating) configuration information including a component corresponding to the version of the updated component group. Therefore, these limitations fall within the "mental processes" grouping of abstract ideas.

The judicial exception is not integrated into a practical application. The claim recites an apparatus, one or more processors coupled with memory, inputting the configuration information, and configuring the second machine learning model based on the second configuration information, which is nothing more than mere instructions to apply the judicial exception using a generic computer. The claim also recites storing a plurality of pieces of component information and component management information, which is nothing more than insignificant extra-solution activity. Additionally, the limitation of "wherein the plurality of component groups include…" generally links the judicial exception to a particular field of use. The claim as a whole, looking at the additional elements individually and in combination, does not integrate the judicial exception into a practical application. Therefore, the claim is directed to an abstract idea.
The claim does not recite additional elements that amount to significantly more than the judicial exception. In particular, claim 1 recites an apparatus, one or more processors coupled with memory, inputting the configuration information, and configuring the second machine learning model based on the second configuration information, which is nothing more than mere instructions to apply the judicial exception using a generic computer. The claim also recites storing a plurality of pieces of component information and component management information, which is nothing more than insignificant extra-solution activity. Storing information in memory is a well-understood, routine, and conventional activity (Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93); see MPEP 2106.05(d). Additionally, the limitation of "wherein the plurality of component groups include…" generally links the judicial exception to a particular field of use. The additional elements, taken individually and in combination, do not result in the claim, as a whole, amounting to significantly more than the identified judicial exception. The claim is not patent eligible.

Regarding claim 2:

Claim 2 further elaborates on the apparatus of claim 1. The claim also recites inputting a learning element, adding the change history to the learning element information, recording a version of a learning element, specifying a version of an updated learning element, and including a learning element of the specified version. The foregoing can practically be performed in the human mind; see claim 1 above for examples. Therefore, these limitations fall within the "mental processes" grouping of abstract ideas.

The judicial exception is not integrated into a practical application. The claim recites a memory, which is nothing more than mere instructions to apply the judicial exception using a generic computer.
The claim also recites storing learning element information, storing a version of the learning element, and outputting a version of an updated learning element, which is nothing more than insignificant extra-solution activity. The claim as a whole, looking at the additional elements individually and in combination, does not integrate the judicial exception into a practical application. Therefore, the claim is directed to an abstract idea.

The claim does not recite additional elements that amount to significantly more than the judicial exception. In particular, claim 2 recites a memory, which is nothing more than mere instructions to apply the judicial exception using a generic computer. The claim also recites storing learning element information, storing a version of the learning element, and outputting a version of an updated learning element, which is nothing more than insignificant extra-solution activity. Storing and outputting information are well-understood, routine, and conventional activities (Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93); see MPEP 2106.05(d). The additional elements, taken individually and in combination, do not result in the claim, as a whole, amounting to significantly more than the identified judicial exception. The claim is not patent eligible.

Regarding claim 4:

Claim 4 further elaborates on the component groups of claim 1 and therefore also falls under the "mental processes" grouping of abstract ideas. The judicial exception is not integrated into a practical application. The claim recites that the first group and the second group include different information for a learning algorithm and machine learning model, which is nothing more than generally linking the judicial exception to a particular field of use.
The claim as a whole, looking at the additional elements individually and in combination, does not integrate the judicial exception into a practical application. Therefore, the claim is directed to an abstract idea. The claim does not recite additional elements that amount to significantly more than the judicial exception. In particular, claim 4 recites that the first group and the second group include different information for a learning algorithm and machine learning model, which is nothing more than generally linking the judicial exception to a particular field of use. The additional elements, taken individually and in combination, do not result in the claim, as a whole, amounting to significantly more than the identified judicial exception. The claim is not patent eligible.

Regarding claim 5:

Claim 5 further elaborates on the apparatus of claim 1 and therefore also falls under the "mental processes" grouping of abstract ideas. The judicial exception is not integrated into a practical application. The claim recites generating a machine learning model based on the configuration information. This limitation is recited at such a high level of generality that it is nothing more than mere instructions to apply the judicial exception using a generic computer. The claim as a whole, looking at the additional elements individually and in combination, does not integrate the judicial exception into a practical application. Therefore, the claim is directed to an abstract idea. The claim does not recite additional elements that amount to significantly more than the judicial exception. In particular, claim 5 recites generating a machine learning model based on the configuration information. This limitation is recited at such a high level of generality that it is nothing more than mere instructions to apply the judicial exception using a generic computer.
The additional elements, taken individually and in combination, do not result in the claim, as a whole, amounting to significantly more than the identified judicial exception. The claim is not patent eligible.

Regarding claim 6:

Claim 6 further elaborates on the apparatus of claim 2 and therefore also falls under the "mental processes" grouping of abstract ideas. The judicial exception is not integrated into a practical application. The claim recites generating a machine learning model based on the configuration information. This limitation is recited at such a high level of generality that it is nothing more than mere instructions to apply the judicial exception using a generic computer. The claim as a whole, looking at the additional elements individually and in combination, does not integrate the judicial exception into a practical application. Therefore, the claim is directed to an abstract idea. The claim does not recite additional elements that amount to significantly more than the judicial exception. In particular, claim 6 recites generating a machine learning model based on the configuration information. This limitation is recited at such a high level of generality that it is nothing more than mere instructions to apply the judicial exception using a generic computer. The additional elements, taken individually and in combination, do not result in the claim, as a whole, amounting to significantly more than the identified judicial exception. The claim is not patent eligible.

Regarding claim 7:

Claim 7 further elaborates on the apparatus of claim 1. The claim also recites determining the component information for storing a component. The foregoing can practically be performed in the human mind. For instance, a person is capable of mentally determining whether something is in development or production (application state) and keeping track of a list of different categories (component information).
Therefore, these limitations fall within the "mental processes" grouping of abstract ideas.

The judicial exception is not integrated into a practical application. The claim recites that the component groups include the same components, which is nothing more than merely linking the use of the judicial exception to a particular field of use. The claim also recites storing state management information and outputting an application state of the first machine learning model, which is nothing more than insignificant extra-solution activity. The claim as a whole, looking at the additional elements individually and in combination, does not integrate the judicial exception into a practical application. Therefore, the claim is directed to an abstract idea.

The claim does not recite additional elements that amount to significantly more than the judicial exception. In particular, claim 7 recites that the component groups include the same components, which is nothing more than merely linking the use of the judicial exception to a particular field of use. The claim also recites storing state management information and outputting an application state of the first machine learning model, which is nothing more than insignificant extra-solution activity. Storing and outputting information are well-understood, routine, and conventional activities (Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93); see MPEP 2106.05(d). The additional elements, taken individually and in combination, do not result in the claim, as a whole, amounting to significantly more than the identified judicial exception. The claim is not patent eligible.

Regarding claim 8:

Claim 8 recites a method and therefore falls within the statutory category of a process.
The claim also recites adding change history to the component group, recording a version of an updated component group, and generating second configuration information including a component corresponding to the version of the updated component group. The foregoing can practically be performed in the human mind. For instance, a person is capable of mentally keeping track of changes; keeping a running total of the money spent on groceries as items are added to the cart is, in effect, adding change history to the component group. Likewise, a person is capable of mentally keeping track of revisions and a version number for each revision, as well as, with the aid of pencil and paper, writing (generating) configuration information including a component corresponding to the version of the updated component group. Therefore, these limitations fall within the "mental processes" grouping of abstract ideas.

The judicial exception is not integrated into a practical application. The claim recites one or more processors coupled with memory, inputting the configuration information, and configuring the second machine learning model based on the second configuration information, which is nothing more than mere instructions to apply the judicial exception using a generic computer. The claim also recites storing a plurality of pieces of component information and component management information, which is nothing more than insignificant extra-solution activity. Additionally, the limitation of "wherein the plurality of component groups include…" generally links the judicial exception to a particular field of use. The claim as a whole, looking at the additional elements individually and in combination, does not integrate the judicial exception into a practical application. Therefore, the claim is directed to an abstract idea. The claim does not recite additional elements that amount to significantly more than the judicial exception.
In particular, claim 8 recites one or more processors coupled with memory, inputting the configuration information, and configuring the second machine learning model based on the second configuration information, which is nothing more than mere instructions to apply the judicial exception using a generic computer. The claim also recites storing a plurality of pieces of component information and component management information, which is nothing more than insignificant extra-solution activity. Storing information in memory is a well-understood, routine, and conventional activity (Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93); see MPEP 2106.05(d). Additionally, the limitation of "wherein the plurality of component groups include…" generally links the judicial exception to a particular field of use. The additional elements, taken individually and in combination, do not result in the claim, as a whole, amounting to significantly more than the identified judicial exception. The claim is not patent eligible.

Regarding claim 9:

Claim 9 does not fall within at least one of the four categories of patent eligible subject matter, as cited above. Even if the applicant is able to amend the claim to fall within one of the four statutory categories for patent eligibility, the claim also recites adding change history to the component group, recording a version of an updated component group, and generating second configuration information including a component corresponding to the version of the updated component group. The foregoing can practically be performed in the human mind. For instance, a person is capable of mentally keeping track of changes; keeping a running total of the money spent on groceries as items are added to the cart is, in effect, adding change history to the component group.
Likewise, a person is capable of mentally keeping track of revisions and a version number for each revision, as well as, with the aid of pencil and paper, writing (generating) configuration information including a component corresponding to the version of the updated component group. Therefore, these limitations fall within the "mental processes" grouping of abstract ideas.

The judicial exception is not integrated into a practical application. The claim recites a computer, inputting the configuration information, and configuring the second machine learning model based on the second configuration information, which is nothing more than mere instructions to apply the judicial exception using a generic computer. The claim also recites storing a plurality of pieces of component information and component management information, which is nothing more than insignificant extra-solution activity. Additionally, the limitation of "wherein the plurality of component groups include…" generally links the judicial exception to a particular field of use. The claim as a whole, looking at the additional elements individually and in combination, does not integrate the judicial exception into a practical application. Therefore, the claim is directed to an abstract idea. The claim does not recite additional elements that amount to significantly more than the judicial exception. In particular, claim 9 recites a computer, inputting the configuration information, and configuring the second machine learning model based on the second configuration information, which is nothing more than mere instructions to apply the judicial exception using a generic computer. The claim also recites storing a plurality of pieces of component information and component management information, which is nothing more than insignificant extra-solution activity. Storing information in memory is a well-understood, routine, and conventional activity (Versata Dev. Group, Inc. v.
SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93); see MPEP 2106.05(d). Additionally, the limitation of "wherein the plurality of component groups include…" generally links the judicial exception to a particular field of use. The additional elements, taken individually and in combination, do not result in the claim, as a whole, amounting to significantly more than the identified judicial exception. The claim is not patent eligible.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-2 and 4-9 are rejected under 35 U.S.C. 103 as being unpatentable over Bowers et al. (US9996804), hereinafter Bowers, in view of Miao et al. (US20170091651), hereinafter Miao.
Regarding claim 1, Bowers teaches:

An apparatus that generates configuration information for configuring a machine learning model, the apparatus comprising one or more processors coupled with memory, configured to: (Bowers, column 18, lines 52-54: "The code 670 stored in memory 620 may be implemented as software and/or firmware to program the processor(s) 610 to carry out actions described above." – The code stored in memory, implemented as software and/or firmware, is analogous to an apparatus configured to carry out the steps below.)

store, in the memory, (i) a plurality of pieces of component information comprising a change history for each of a plurality of component groups, the plurality of component groups being groups of a plurality of components associated with functional elements of the machine learning model, and (Bowers, column 4, lines 1-4: "In some embodiments, the model tracking service can store a version history of the latent model in the model tracker database 118. The version history can include a provenance chain of the latent model." And column 4, lines 43-48: "Each model in the model tracker database 118 can be associated with one or more developer/analyst users, one or more application services, one or more training configurations, experimental metadata associated with testing the model, a version history of the model, evaluative metrics of the model, or any combination thereof." And Fig. 4 – The model tracking service storing the version history is analogous to the change history. The model tracker database including the application services, training configurations, metadata, etc. is analogous to the plurality of component groups being associated with functional elements of the machine learning model.)
(ii) component management information comprising versions for the plurality of component groups constituting the machine learning model, (Bowers, column 4, lines 4-8: "Tracking the version history can include tracking one or more modifications from a previous machine learning model to a subsequent machine learning model. For example, the version history can include the production copy as a parent model." – Tracking one or more modifications in the version history is analogous to storing versions for the plurality of component groups.)

input the configuration information generated for configuration of the machine learning model on an updated first machine learning model; (Bowers, column 2, lines 45-47: "In turn, the machine learner system can train the new latent model based on the configurations." – Training a latent model based on the configurations is analogous to inputting the configuration information on an updated first machine learning model.)

add the change history, for each component group, to the component information as part of the plurality of pieces of the component information based on the configuration information of the updated first machine learning model; (Bowers, column 4, lines 11-19: "In some embodiments, the model tracking service can track and record one or more differences in training configurations of the latent model as compared to its parent model (e.g., the production copy) in the model tracker database 118. The tracked differences in the training configurations can include differences in one or more sources of training datasets, one or more training datasets, one or more data features, or any combination thereof that were used to train the latent model." – The latent model is analogous to the updated first machine learning model; tracking the differences in training configurations is analogous to adding the change history for the component group to the component information.
The training datasets, sources of training datasets, and features are all pieces of component information.)

record, in the component management information, a version of an updated component group of the updated first machine learning model; (Bowers, column 6, lines 3-6: "The model tracker database 234 is configured to record data and metadata associated with the machine learning models tracked by the model tracking engine 214." – The model tracker database serves as the recorder.)

Bowers does not explicitly teach: wherein the plurality of component groups include a first group including a component of the plurality of components determined in a development stage of the machine learning model, and a second group including a component of the plurality of components determined in an individualization stage of individualizing the machine learning model after the development stage according to an application target; generate second configuration information including a component corresponding to the version of the updated component group of the updated first machine learning model, for a second machine learning model including a component group of a same version as a component group before updating of the first machine learning model; and configure the second machine learning model based on the second configuration information.

However, Miao teaches: wherein the plurality of component groups include a first group including a component of the plurality of components determined in a development stage of the machine learning model, (Miao, paragraph 0068: "Initially, the server is matched to a set of clients for distribution of a statistical model based on attributes of the server and clients (operation 402). For example, the speed, amount of memory, and/or network bandwidth of the server may be matched to the network traffic, popularity, and/or cost associated with the clients.
Next, the server transmits a first global version of the statistical model to the set of clients (operation 404). For example, the server may merge a number of local versions of the statistical model from the clients into the first global version of the statistical model. Alternatively, the server and/or another component may generate the global version from a predefined set of training data." – The local versions and the set of training data are each a component used to create a machine learning model, and since the model generated is the global version, this is analogous to the development stage of the machine learning model.)

and a second group including a component of the plurality of components determined in an individualization stage of individualizing the machine learning model after the development stage according to an application target; (Miao, paragraph 0022: "In addition, statistical model 108 may be trained and/or adapted to new data received on the clients. For example, the clients may be electronic devices (e.g., personal computers, laptop computers, mobile phones, tablet computers, portable media players, digital cameras, etc.) that produce updates 114-116 to statistical model 108 based on user feedback from users of the clients." – The new data and user feedback are each components used to determine an individualized model, as they are used to adapt the model, which individualizes it for a particular application target, i.e., the feedback of the clients.)

generate second configuration information including a component corresponding to the version of the updated component group of the updated first machine learning model, for a second machine learning model including a component group of a same version as a component group before updating of the first machine learning model; and (Miao, paragraph 0071: "Similarly, the server associates the first set of updates with the second global version (operation 410).
For example, the server may obtain and/or generate update identifiers for the first set of updates and/or the global versions used to produce the updates, and combine the update identifiers into a version identifier for the second global version." – The first set of updates and the global versions used to produce the updates are analogous to the second configuration information corresponding to the updated component group of the updated first machine learning model. Their association with the second global version is analogous to the second configuration information for a second machine learning model.)

configure the second machine learning model based on the second configuration information. (Miao, paragraph 0072: "The server may then transmit the second global version to the clients asynchronously from receiving a second set of updates to one or both global versions from another subset of the clients (operation 412). For example, the server may generate and broadcast the second global version without using any iteration barriers or locks to synchronize the updates from the clients." – Generating the second global version is analogous to configuring the second machine learning model, and the second set of updates is analogous to the second configuration information, as noted above.)

Miao is considered analogous art to the claimed invention, as it is in the same field of endeavor, machine learning.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to have modified Bowers, which already teaches storing a change history for each of a plurality of component groups but does not explicitly teach that the plurality of component groups include a first group determined during a development stage and a second group determined during an individualization stage, generating second configuration information, or configuring the second machine learning model based on the second configuration information, to include the teachings of Miao, which does teach those features, in order to facilitate "effectively collecting, storing, managing, compressing, transferring, sharing, analyzing, and/or visualizing large data sets." (Miao, paragraph 0007)

Regarding claim 2, Bowers and Miao teach the apparatus of claim 1, as cited above. Bowers further teaches:

store, in the memory, learning element information for storing a change history of a learning element that is used for learning of the machine learning model; (Bowers, column 3, lines 61-63: "The model tracking service can also record the training configurations used to generate the latent model in the model tracker database 118." – The training configurations used to generate the latent model are analogous to the change history of a learning element used for learning the model.)
input a learning element that is used for learning of the updated first machine learning model, adds the change history to the learning element information based on the learning element, and records a version of a learning element of the updated first machine learning model in the component management information; and (Bowers, column 3, line 63 through column 4 line 1: “In some embodiments, the model tracking service can index the latent model in the model tracker database 118 based on the training data source used, the training dataset used, the data features used, or any combination thereof, in creating and training the latent model.” – The training data source, training dataset, and/or the data features is analogous to the change history of the learning element information. Indexing this information is analogous to inputting and recording it.) Bowers does not explicitly teach: store a version of the learning element of the machine learning model; output a version of an updated learning element of the updated first machine learning model, for the second machine learning model using a learning element of a same version as the component group before updating of the first machine learning model, and encode a learning element of the version in the configuration information. However, Miao further teaches: store a version of the learning element of the machine learning model; (Miao, paragraph 0033: “In synchronous ADMM, Equations 8, 9, and 10 are performed in the clients, while Equation 11 is performed by server 102. Server 102 may wait for each round of updates 114-116 to be submitted by the clients before merging updates 114-116 into a new global version of statistical model 108.” – The equations 8, 9, and 10 being analogous to learning elements. Since the server waits for each round of updates to be submitted by the clients then it is storing a version of the learning element.) 
output a version of an updated learning element of the updated first machine learning model, for the second machine learning model using a learning element of a same version as the component group before updating of the first machine learning model, and encode a learning element of the version in the configuration information. (Miao, paragraph 0051: “In particular, server 102 may begin with a first global version 210 of the statistical model and assign it a version identifier of “1.0”. Server 102 may transmit global version 210 to clients A 202, B 204, and C 206. In turn, client A 202 may generate an update 220 to global version 210, and client C 206 may generate a separate update 222 to global version 210. Updates 220 and 222 may be transmitted to server 102 and merged into a second global version 212 with a version identifier of “2.0A1C1.” To generate the version identifier of global version 212, server 102 may increment the version number (e.g., “2.0”) of the statistical model, identify updates 220 and 222 as changes to global version 210 from clients A 202 and C 206, and append representations of the updates and the global version 212 to clients C 206 and D 208.” – Appending representations of the updates to the global version is analogous to encoding a learning element of the specified version in the configuration information while the server transmitting the version is analogous to outputting the version of the updated learning element.)

Regarding claim 4, Bowers and Miao teach the apparatus of claim 1, as cited above.
Bowers further teaches: the first group includes information for identifying a learning algorithm of the machine learning model, information for identifying logic used for preprocessing of the learning algorithm, information for identifying logic used for post-processing of the learning algorithm, and a parameter used in the machine learning model, and (Bowers, column 2, lines 27-41: “In several embodiments, the user interface enables an application operator (e.g., a developer/analyst user or an analyst user) to edit configurations of the production copy. For example, the application operator can edit the configurations by adding or removing one or more training datasets or sources of the training datasets, one or more features of interest to use, one or more parameters a model training algorithm (e.g., Gaussian Mixture Model (GMM) algorithm, Support Vector Machine (SVM) algorithm, neural network algorithm, Hidden Markov Model (HMM) algorithm, etc.), or any combination thereof. Based on the edits received via the user interface, the machine learner system can generate a new latent model (e.g., by the training model based on the specified configurations).”) Bowers does not explicitly teach: the second group includes the information for identifying logic used for preprocessing of the learning algorithm, the information for identifying logic used for post-processing of the learning algorithm, the parameter used in the machine learning model, and information for identifying the application target of the machine learning model. However, Miao further teaches: the second group includes the information for identifying logic used for preprocessing of the learning algorithm, the information for identifying logic used for post-processing of the learning algorithm, the parameter used in the machine learning model, and information for identifying the application target of the machine learning model. 
(Miao, paragraph 0026: “As each piece of feedback is received from the user, the client may provide the feedback as training data for statistical model 108 to customize the output of statistical model 108 to the user’s current job search activity. Consequently, the client may generate recommendations of job listings based on aggregated training data used to produce the global version, as well as the user’s input during the current session with the job search tool.”)

Regarding claim 5, Bowers and Miao teach the apparatus of claim 1, as cited above. Bowers further teaches: generate, for the second machine learning model, the machine learning model based on the configuration information including the component corresponding to the version of the updated component group of the updated first machine learning model. (Bowers, column 2, lines 41-45: “The machine learner system can also generate a new latent model by enabling a user to specify configurations (e.g., training datasets, sources of training sets, features of interest, parameters, or any combination thereof) of the latent model from scratch.” – Generating a new latent model is analogous to generating a second machine learning model.)

Regarding claim 6, Bowers and Miao teach the apparatus of claim 2, as cited above. Bowers does not explicitly teach: generate, for the second machine learning model, the machine learning model based on the configuration information including the component corresponding to the version of the updated component group of the updated first machine learning model, and performs learning on the generated machine learning model using the learning element.
However, Miao further teaches: generate, for the second machine learning model, the machine learning model based on the configuration information including the component corresponding to the version of the updated component group of the updated first machine learning model, and performs learning on the generated machine learning model using the learning element. (Miao, paragraph 0071: “Similarly, the server associates the first set of updates with the second global version (operation 410). For example, the server may obtain and/or generate update identifiers for the first set of updates and/or the global versions used to produce the updates, and combine the update identifiers into a version identifier for the second global version.” – Configuration information would include update identifiers which are used to produce updates.)

Regarding claim 7, Bowers and Miao teach the apparatus of claim 1, as cited above. Bowers does not explicitly teach: wherein the plurality of component groups include same components, and wherein the one or more processors are further configured to: store, in the memory, state management information for managing an application state of the machine learning model, and output an application state of the first machine learning model from the state management information, and determines the component information for storing a component included in updated configuration information for the first machine learning model based on the application state.
However, Miao further teaches: wherein the plurality of component groups include same components, (Miao, paragraph 0029: “Version-management apparatus 118 may concatenate, hash, and/or otherwise combine update identifiers for a given subset of updates 114-116 into the version identifier for the global version that will be produced from the updates.” – Update identifiers for a given subset of updates being concatenated, hashed, or otherwise combined is analogous to the component groups including same components.) and wherein the one or more processors are further configured to: store, in the memory, state management information for managing an application state of the machine learning model, and (Miao, paragraph 0029: “For example, each update identifier may specify the client from which the corresponding update was received, as well as the global version of statistical model 108 used to produce the update.” – The client from which the update was received is analogous to application state of the model.) output an application state of the first machine learning model from the state management information, and determines the component information for storing a component included in updated configuration information for the first machine learning model based on the application state. (Miao, paragraph 0054: “By tracking the merging of updates 220-232 into global versions 210-216 using version identifiers of global versions 210-216, server 102 may generate new global versions of the statistical model independently from receiving updates 220-232 to previous global versions from clients A 202, B 204, C 206, and D 208. 
For example, server 102 may use the version identifiers to ensure that updates that were previously merged into previous global versions are not merged into a new global version of statistical model 108, and that updates that have been received from the clients but not merged into the previous global versions are included in the new global version.” – The version identifiers are analogous to the application state of the machine learning model; this is sent to the clients and is therefore being output. The updates are analogous to component information for updating the configuration information of the machine learning model, which are based on the version identifiers, i.e., the application state.)

Regarding claim 8, claim 8 recites all of the same limitations as claim 1, which are taught by Bowers and Miao – see claim 1 above.

Regarding claim 9, claim 9 recites all of the same limitations as claim 1, which are taught by Bowers and Miao – see claim 1 above. Bowers additionally teaches: A program that causes a computer to execute a processing to generate configuration information for configuring a machine learning model, the program causing the computer to perform steps of: (Bowers, column 18, lines 52-58: “The code 670 stored in memory 620 may be implemented as software and/or firmware to program the processor(s) 610 to carry out actions described above. In certain embodiments, such software or firmware may be initially provided to the computing device 600 by downloading it from a remote system through the computing device 600 (e.g., via network adapter 640).”)

Response to Arguments

Applicant's arguments filed November 7, 2025 have been fully considered but they are not persuasive.
Regarding claim rejections under 35 USC §101, examiner concedes that “input configuration information generated for configuration of the machine learning model on an updated first machine learning model” cannot be performed in the human mind; however, it merely applies the judicial exception using a generic computer. Adding change history to the component information, recording a version of an updated component group, and generating second configuration information are all steps that can be performed mentally, with the aid of pencil and paper. The claims are all directed to “generat[ing] configuration information for configuring a machine learning model”; thus, when considering the claims as a whole, the claimed invention is something that can be practically performed in the human mind.

Regarding the argument on page 12 of Applicant’s remarks that the improvement of the claimed invention includes “an object of the present invention is to provide a technology capable of supporting generation of a related machine learning model when a certain model is changed,” it is unclear how this represents a technological improvement. For instance, a teacher model can be updated and changed, which then leads to the generation of one or more student models, which would be related models. This is a common technique which relies upon components of the teacher model being used to create the student model. Thus, the claims as written do not reflect any alleged technical improvement.

Regarding Applicant’s argument that a “factual determination is required to support a conclusion that an additional element (or combination of additional elements) is well-understood, routine, conventional activity,” examiner notes that page 4 of the previous office action, immediately following the cited portion, includes case law which supports storing information as being well-understood, routine, conventional activity – Versata Dev. Group, Inc. v.
SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93. Therefore, claim rejections under 35 USC §101 of claims 1-2 and 4-9 are maintained. See section Claim Rejections – 35 USC §101 above.

Regarding applicant’s argument on page 16 that Bowers does not teach a change history for each of a plurality of component groups associated with functional elements of a machine learning model, examiner disagrees. Bowers teaches a model tracker database which includes things such as training configurations, metadata, and evaluation metrics (col. 4, lines 43-48), as well as configuration differences such as adding/removing features or using different datasets (see Fig. 4). These represent the change history, and each of these is a component group which is a functional element of a machine learning model.

Regarding applicant’s argument that Miao fails to teach “the plurality of component groups include a first group including a component of the plurality of components determined in a development stage of the machine learning model and a second group including a component of the plurality of components determined in an individualizing stage of individualizing the machine learning model after the development stage according to an application target,” examiner disagrees. The global model of Miao is analogous to the development stage of the model. Thus, when the local versions are merged they represent a component determined in a development stage, while the training dataset represents another component of a machine learning model which is also determined in a development stage. Similarly, the clients updating the model on the client device is analogous to an individualizing stage, and the model being adapted to new data indicates a component determined in an individualizing stage.
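The two-stage reading above, with Miao's global training mapped to the claimed development stage and per-client adaptation mapped to the individualizing stage, can be illustrated with a toy one-parameter model. Both functions and the update rule are illustrative assumptions, not taken from Miao:

```python
# Toy illustration of the two stages the examiner maps onto Miao: a component
# fixed during a shared development stage, then adjusted per client during an
# individualizing stage. The one-parameter "model" and update rule are
# illustrative assumptions only.

def develop(pooled_data: list[float]) -> float:
    """Development stage: fit one shared parameter from pooled data."""
    return sum(pooled_data) / len(pooled_data)

def individualize(shared: float, client_feedback: list[float], lr: float = 0.5) -> float:
    """Individualizing stage: nudge the shared parameter toward client data."""
    local = sum(client_feedback) / len(client_feedback)
    return shared + lr * (local - shared)

shared = develop([1.0, 2.0, 3.0])          # first group: set during development
per_client = individualize(shared, [4.0])  # second group: set per application target
print(shared, per_client)  # 2.0 3.0
```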
Applicant states “simply mentioning the local versions, training data, and feedback, as in Miao, are not the same as components that are associated with functional elements of a machine learning model as discussed hereinabove.” Examiner notes that “functional elements of a machine learning model” is given its plain meaning, as Applicant does not define this within the disclosure; thus, functional elements of a model would include elements required for the machine learning model to function. Such elements would include, at the very least, training data to train and update the machine learning model. Therefore, claim rejections under 35 USC §103 are maintained. See section Claim Rejections – 35 USC §103 above.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: Fu et al. (US 11132687).

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JACQUELINE MEYER whose telephone number is (703)756-5676. The examiner can normally be reached M-F 8:00 am - 4:30 pm EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Tamara Kyle, can be reached at 571-272-4241. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/J.C.M./
Examiner, Art Unit 2144

/TAMARA T KYLE/
Supervisory Patent Examiner, Art Unit 2144

Prosecution Timeline

Mar 10, 2022
Application Filed
Jan 10, 2025
Non-Final Rejection — §101, §103
Jun 20, 2025
Response Filed
Aug 04, 2025
Final Rejection — §101, §103
Nov 07, 2025
Request for Continued Examination
Nov 16, 2025
Response after Non-Final Action
Dec 18, 2025
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12585981
MANAGING AN INSTALLED BASE OF ARTIFICIAL INTELLIGENCE MODULES
2y 5m to grant Granted Mar 24, 2026
Patent 12468941
SYSTEMS AND METHODS FOR DYNAMICS-AWARE COMPARISON OF REWARD FUNCTIONS
2y 5m to grant Granted Nov 11, 2025


Prosecution Projections

3-4
Expected OA Rounds
62%
Grant Probability
99%
With Interview (+67.5%)
4y 3m
Median Time to Grant
High
PTA Risk
Based on 13 resolved cases by this examiner. Grant probability derived from career allow rate.
