Prosecution Insights
Last updated: April 19, 2026
Application No. 18/331,248

DATA ANALYSIS METHOD, DATA ANALYSIS SYSTEM, AND DATA ANALYSIS SYSTEM SERVER

Non-Final OA: §101, §103
Filed: Jun 08, 2023
Examiner: SPRAUL III, VINCENT ANTON
Art Unit: 2129
Tech Center: 2100 — Computer Architecture & Software
Assignee: Shimadzu Corporation
OA Round: 1 (Non-Final)
Grant Probability: 59% (Moderate)
OA Rounds: 1-2
To Grant: 4y 6m
With Interview: 94%

Examiner Intelligence

Career Allow Rate: 59% (20 granted / 34 resolved; +3.8% vs TC avg)
Interview Lift: strong, +34.7% (allow rate with vs. without an interview, among resolved cases with an interview)
Typical Timeline: 4y 6m avg prosecution; 30 currently pending
Career History: 64 total applications across all art units
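The headline figures above are internally consistent. A quick check, under the assumption (not stated by the dashboard) that "Career Allow Rate" is granted divided by resolved and that "Interview Lift" is the allow rate with an interview minus the baseline:

```python
# Sanity check of the examiner dashboard figures.
# Assumptions: allow rate = granted / resolved, and the reported
# interview lift is additive on top of the baseline allow rate.
granted, resolved = 20, 34

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # 58.8%, displayed as 59%

lift = 0.347  # the reported "+34.7% Interview Lift"
print(f"Implied allow rate with interview: {allow_rate + lift:.1%}")  # 93.5%
```

The implied with-interview rate of about 93.5% matches the "94% With Interview" figure, which supports this reading of the dashboard's methodology.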

Statute-Specific Performance

§101: 22.6% (-17.4% vs TC avg)
§103: 48.4% (+8.4% vs TC avg)
§102: 9.1% (-30.9% vs TC avg)
§112: 14.4% (-25.6% vs TC avg)

Tech Center averages are estimates; figures based on career data from 34 resolved cases.

Office Action

Rejections: §101, §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-8 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Analysis is provided for the claims under the guidelines of MPEP 2106.

Regarding claim 1:

Step 1: The claim recites “A data analysis method implemented between a system management company and a customer who desires to acquire an analysis result of measurement data acquired from a measuring device, the data analysis method comprising” the steps that follow. Thus the claim is to a process, which is a statutory category of invention.

Step 2A prong 1: The limitation “a learning algorithm selection step of selecting the learning algorithm to be used for learning from among the plurality of learning algorithm groups based on the input information,” in its broadest reasonable interpretation, recites a mental process. A person could select a learning algorithm based on input information using observation and judgment. Thus, the claim recites an abstract idea.

Step 2A prong 2: The further element “a storage step of storing in advance a plurality of learning algorithm groups each including a type of the measurement data, a type of analysis of the measurement data, and a learning algorithm associated with each other” recites mere data gathering, which is insignificant extra-solution activity (MPEP 2106.05(g)).
The further element “a training data receiving step of receiving an input of training data; an input information receiving step of receiving an input of input information including information about the type of the measurement data and information about the type of analysis of the measurement data” recites mere data gathering, which is insignificant extra-solution activity (MPEP 2106.05(g)).

The further element “a trained model generation step of generating a trained model based on the training data and the selected learning algorithm” recites model training at a high level of generality. No particular model or method of training is described. The element thus merely recites the use of a computer as a tool to perform the abstract idea, and is equivalent to adding the words “apply it” or the equivalent to the judicial exception (MPEP 2106.05(f)).

The further element “an analysis result acquisition step of analyzing the measurement data based on the trained model and acquiring the analysis result” recites model use at a high level of generality. No particular model or method of use is described. The element thus merely recites the use of a computer as a tool to perform the abstract idea, and is equivalent to adding the words “apply it” or the equivalent to the judicial exception (MPEP 2106.05(f)).

Thus, the additional elements merely recite the use of a computer as a tool to perform the abstract idea or recite insignificant extra-solution activity. Taken alone, the additional elements do not integrate the abstract idea into a practical application. Considering the elements together as an ordered combination adds nothing that is not present from examining the elements individually. The elements, individually or together, do not describe an improvement in the functioning of technology.

Step 2B: The claim as a whole does not amount to significantly more than the recited judicial exception.
These additional claim elements recite mere instructions to apply the abstract idea: “a trained model generation step of generating a trained model based on the training data and the selected learning algorithm” and “an analysis result acquisition step of analyzing the measurement data based on the trained model and acquiring the analysis result”.

The elements “a storage step of storing in advance a plurality of learning algorithm groups each including a type of the measurement data, a type of analysis of the measurement data, and a learning algorithm associated with each other” and “a training data receiving step of receiving an input of training data; an input information receiving step of receiving an input of input information including information about the type of the measurement data and information about the type of analysis of the measurement data” recite mere data gathering, which is recognized as well-understood, routine, and conventional activity in the art (see MPEP § 2106.05(d)(II)(i)).

Even when considered in combination, the additional elements represent mere instructions to apply the abstract idea to a computer or represent insignificant extra-solution activity, which do not provide an inventive concept. The claim is not eligible under 35 U.S.C. 101.

Regarding claim 2:

For step 2A prong 1, claim 2 further limits claim 1 and the same elements in claim 2 still recite an abstract idea. The limitation “wherein in the learning algorithm selection step, the learning algorithm corresponding to information about the type of the measurement data and information about the type of analysis of the measurement data is selected from among the plurality of learning algorithm groups” further limits the mental process identified under claim 1, but it remains a mental process.

For step 2A prong 2 and step 2B, no further elements remain to be considered.
The claim as a whole does not amount to significantly more than the recited judicial exception and is ineligible under 35 U.S.C. 101.

Regarding claim 3:

For step 2A prong 1, claim 3 further limits claim 2 and the same elements in claim 3 still recite an abstract idea. The limitation “a trained model selection step of selecting the trained model to be used to analyze the measurement data based on the information about the type of the measurement data and the information about the type of analysis of the measurement data received in the analysis receiving step”, in its broadest reasonable interpretation, recites a mental process. A person could select a trained model based on the type of the measurement data and the information about the type of analysis of the measurement data, using observation and judgment.

Step 2A prong 2: The further element “a trained model storage step of storing, in association with each other, the trained model generated in the trained model generation step, and the type of the measurement data and the type of analysis of the measurement data, both of which are received in the input information receiving step” recites mere data updating, which is insignificant extra-solution activity (MPEP 2106.05(g)).

The further element “an analysis receiving step of receiving the information about the type of the measurement data, the information about the type of analysis of the measurement data, and the measurement data acquired by the measuring device” recites mere data gathering, which is insignificant extra-solution activity (MPEP 2106.05(g)).

The further element “wherein in the analysis result acquisition step, the measurement data is input to the trained model selected in the trained model selection step, and the analysis result is acquired” recites model use at a high level of generality. No particular model or method of use is described.
The element thus merely recites the use of a computer as a tool to perform the abstract idea, and is equivalent to adding the words “apply it” or the equivalent to the judicial exception (MPEP 2106.05(f)).

Step 2B: The element “a trained model storage step of storing, in association with each other, the trained model generated in the trained model generation step, and the type of the measurement data and the type of analysis of the measurement data, both of which are received in the input information receiving step” recites mere data updating, which is recognized as well-understood, routine, and conventional activity in the art (see MPEP § 2106.05(d)(II)(iv)).

The element “an analysis receiving step of receiving the information about the type of the measurement data, the information about the type of analysis of the measurement data, and the measurement data acquired by the measuring device” recites mere data gathering, which is recognized as well-understood, routine, and conventional activity in the art (see MPEP § 2106.05(d)(II)(i)).

The additional element “wherein in the analysis result acquisition step, the measurement data is input to the trained model selected in the trained model selection step, and the analysis result is acquired” recites mere instructions to apply the abstract idea.

Even when considered in combination, the additional elements represent mere instructions to apply the abstract idea to a computer or represent insignificant extra-solution activity, which do not provide an inventive concept. The claim is not eligible under 35 U.S.C. 101.

Regarding claim 4:

For step 2A prong 1, claim 4 further limits claim 3 and the same elements in claim 4 still recite an abstract idea.
Step 2A prong 2: The element “a data processing program storage step of storing in advance a plurality of data processing program groups each including a data processing program configured to perform a data process different from a data process of the trained model, the data processing program being associated with the type of the measurement data and the type of analysis of the measurement data” recites mere data gathering, which is recognized as well-understood, routine, and conventional activity in the art (see MPEP § 2106.05(d)(II)(i)).

The element “wherein in the analysis receiving step, an operation input is further received to select analysis conditions including the trained model and the data processing program” recites mere data gathering, which is recognized as well-understood, routine, and conventional activity in the art (see MPEP § 2106.05(d)(II)(i)).

The element “in the analysis result acquisition step, the analysis result is acquired based on the selected analysis conditions” further limits the analysis result acquisition step, but it remains a mere recitation of model use at a high level of generality. No particular model or method of use is described. The element thus merely recites the use of a computer as a tool to perform the abstract idea, and is equivalent to adding the words “apply it” or the equivalent to the judicial exception (MPEP 2106.05(f)).

Step 2B: The claim as a whole does not amount to significantly more than the recited judicial exception.
The elements “a data processing program storage step of storing in advance a plurality of data processing program groups each including a data processing program configured to perform a data process different from a data process of the trained model, the data processing program being associated with the type of the measurement data and the type of analysis of the measurement data” and “wherein in the analysis receiving step, an operation input is further received to select analysis conditions including the trained model and the data processing program” recite mere data gathering, which is recognized as well-understood, routine, and conventional activity in the art (see MPEP § 2106.05(d)(II)(i)).

The additional element “in the analysis result acquisition step, the analysis result is acquired based on the selected analysis conditions” recites mere instructions to apply the abstract idea.

Even when considered in combination, the additional elements represent mere instructions to apply the abstract idea to a computer or represent insignificant extra-solution activity, which do not provide an inventive concept. The claim is not eligible under 35 U.S.C. 101.

Regarding claim 5:

For step 2A prong 1, claim 5 further limits claim 4 and the same elements in claim 5 still recite an abstract idea.

For step 2A prong 2, the further element “a data processing program addition receiving step of receiving addition of another data processing program associated with the type of the measurement data and the type of analysis of the measurement data to the plurality of data processing program groups stored in advance” recites mere data updating, which is insignificant extra-solution activity (MPEP 2106.05(g)).
For step 2B, the element “a data processing program addition receiving step of receiving addition of another data processing program associated with the type of the measurement data and the type of analysis of the measurement data to the plurality of data processing program groups stored in advance” recites mere data updating, which is recognized as well-understood, routine, and conventional activity in the art (see MPEP § 2106.05(d)(II)(iv)).

Even when considered in combination, the additional elements represent mere instructions to apply the abstract idea to a computer or represent insignificant extra-solution activity, which do not provide an inventive concept. The claim is not eligible under 35 U.S.C. 101.

Regarding claim 6:

For step 2A prong 1, claim 6 further limits claim 1 and the same elements in claim 6 still recite an abstract idea.

For step 2A prong 2, the further element “a learning algorithm addition receiving step of receiving addition of another learning algorithm associated with the type of the measurement data and the type of analysis of the measurement data to the plurality of learning algorithm groups stored in advance” recites mere data updating, which is insignificant extra-solution activity (MPEP 2106.05(g)).

For step 2B, the element “a learning algorithm addition receiving step of receiving addition of another learning algorithm associated with the type of the measurement data and the type of analysis of the measurement data to the plurality of learning algorithm groups stored in advance” recites mere data updating, which is recognized as well-understood, routine, and conventional activity in the art (see MPEP § 2106.05(d)(II)(iv)).

Even when considered in combination, the additional elements represent mere instructions to apply the abstract idea to a computer or represent insignificant extra-solution activity, which do not provide an inventive concept. The claim is not eligible under 35 U.S.C. 101.
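Stepping back from the legal framework, the method claims above describe a lookup-then-train-then-infer pipeline: learning algorithms stored under (measurement data type, analysis type) keys, one selected per request, trained on the submitted data, then applied to the measurement. An editorial sketch of that reading (all names and the toy threshold "algorithm" are hypothetical; the claims themselves recite no particular model or training method):

```python
# Illustrative sketch of the recited steps; hypothetical names throughout.

def train_mean_threshold(training_data):
    """Toy 'learning algorithm': learns a threshold from (value, label) pairs."""
    threshold = sum(x for x, _ in training_data) / len(training_data)
    return lambda x: x > threshold  # the "trained model"

# Storage step: learning algorithm groups keyed by
# (type of measurement data, type of analysis).
ALGORITHM_GROUPS = {
    ("chromatogram", "peak_detection"): train_mean_threshold,
}

def analyze(data_type, analysis_type, training_data, measurement):
    # Learning algorithm selection step: pick by the two type fields.
    algorithm = ALGORITHM_GROUPS[(data_type, analysis_type)]
    # Trained model generation step: train on the received training data.
    model = algorithm(training_data)
    # Analysis result acquisition step: apply the model to the measurement.
    return model(measurement)

print(analyze("chromatogram", "peak_detection",
              [(1.0, 0), (3.0, 1)], 2.5))  # True: 2.5 exceeds the learned mean 2.0
```

The rejection's point maps directly onto this sketch: the dictionary lookup is the "mental process" of selection, and the train/infer calls are generic "apply it" computer use.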
Regarding claim 7:

For step 1, claim 7 recites “A data analysis system configured to perform data analysis between a system management company and a customer who desires to acquire an analysis result of measurement data acquired from a measuring device, the data analysis system comprising: a server configured to generate a trained model to analyze the measurement data; and a data processor configured to request the server to analyze the measurement data; wherein the server includes” the components that follow. Thus the claim is to a machine, which is a statutory category of invention. The components of the server perform the steps of the method of claim 1, and therefore claim 7 is not eligible under 35 U.S.C. 101 by analogous reasoning.

Regarding claim 8:

For step 1, claim 8 recites “A data analysis system server configured to perform data analysis between a system management company and a customer who desires to acquire an analysis result of measurement data acquired from a measuring device, the data analysis system server comprising” the components that follow. Thus the claim is to a machine, which is a statutory category of invention. The components of the data analysis system server perform the steps of the method of claim 1, and therefore claim 8 is not eligible under 35 U.S.C. 101 by analogous reasoning.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains.
Patentability shall not be negated by the manner in which the invention was made.

Claims 1-8 are rejected under 35 U.S.C. 103 over Limasanches et al., US Pre-Grant Publication No. 2022/0067428 (hereafter Limasanches) in view of Kawaai et al., US Pre-Grant Publication No. 2020/0014761 (hereafter Kawaai).

Regarding claim 1 and analogous claims 7-8:

Limasanches teaches “A data analysis method implemented between a system management company and a customer who desires to acquire an analysis result of measurement data acquired from a measuring device, the data analysis method comprising”:

Limasanches, paragraph 0005, “A data analysis system configured to perform data analysis between a system management company and a customer who desires to acquire an analysis result of measurement data acquired from a measuring device [A data analysis method implemented between a system management company and a customer who desires to acquire an analysis result of measurement data acquired from a measuring device], the data analysis system comprising: a server configured to generate a trained model to analyze the measurement data; and a data processor configured to request the server to analyze the measurement data; wherein the server includes: a storage configured to store in advance a plurality of learning algorithm groups each including a type of the measurement data, a type of analysis of the measurement data, and a learning algorithm associated with each other; an input receiver configured to receive an input of input information including information about the type of the measurement data and information about the type of analysis of the measurement data, and an input of training data; a learning algorithm selector configured to select the learning algorithm to be used for learning from among the plurality of learning algorithm groups based on the input information; a trained model generator configured to generate the trained model based on the training data and the selected learning
algorithm; and an analysis result acquirer configured to analyze the measurement data based on the trained model generated by the trained model generator and acquire the analysis result”;

Limasanches, paragraph 0017, “A system disclosed herein may be a physical computer system (one or more physical computers) or may be a system built on a computation resource group (a plurality of computation resources) such as a cloud platform. The computer system or the computation resource group includes one or more interface devices (including, for example, a communication device and an input/output device), one or more storage devices (including, for example, a memory (main storage device) and an auxiliary storage device), and one or more processors.”

“a storage step of storing in advance a plurality of learning algorithm groups each including a type of the measurement data, a type of analysis of the measurement data, and a learning algorithm associated with each other”:

Limasanches, paragraph 0039, “The model trainer 107 trains the selected existing learning model using the user's training data set. The model database 108 stores the existing model [a learning algorithm], related information the existing model, the newly trained learning model, and related information on the newly trained learning model. As described later, the related information includes a task description of the learning model [a type of analysis of the measurement data] and an essential characteristic amount vector of training data [a type of the measurement data].”

“a training data receiving step of receiving an input of training data”:

Limasanches, paragraph 0020, “In an embodiment, a user inputs, to the system, a simple description of a task (new task) desired by the user to be executed and a training data set for the task [receiving step of receiving an input of training data].
The system extracts an essential characteristic amount from the training data set and extracts related information on the task from the description of the task. The system uses a model, data used for training of the model, the corresponding essential characteristic amount, and the description of the corresponding task to find a related learning model in a database storing the foregoing information. The learning model selected from the database is finely adjusted (retrained) using a user's data set. This enables the model to be adapted to a different user's data set.”

“an input information receiving step of receiving an input of input information including information about the type of the measurement data and information about the type of analysis of the measurement data”:

Limasanches, paragraphs 0025-0026, “The system according to the embodiment of the present specification includes a task analyzer and an essential characteristic amount extractor. Input to the task analyzer is a description input by a user. Details of a task desired by the user to be achieved are briefly described [information about the type of analysis of the measurement data]. Output from the task analyzer is a task expression in a format that enables a next functional section to acquire an optimal learning model. As an example, the task expression can be in the format of a keyword string or a character string. The task description input by the user and the task expression generated from the task description are information on the details of the task. Input to the essential characteristic amount extractor is a user's training data set that includes a plurality of files and is in a folder format. Each of the files is one sample of the training data set. Output from the essential characteristic amount extractor is one-dimensional characteristic amount vectors corresponding to data samples included in the user's training data set [information about the type of the measurement data].
Each of the one-dimensional characteristic amount vectors can include a plurality of elements.”

“a learning algorithm selection step of selecting the learning algorithm to be used for learning from among the plurality of learning algorithm groups based on the input information”:

Limasanches, paragraph 0020, “In an embodiment, a user inputs, to the system, a simple description of a task (new task) desired by the user to be executed and a training data set for the task. The system extracts an essential characteristic amount from the training data set and extracts related information on the task from the description of the task. The system uses a model, data used for training of the model, the corresponding essential characteristic amount, and the description of the corresponding task to find a related learning model in a database storing the foregoing information [selecting the learning algorithm to be used for learning from among the plurality of learning algorithm groups based on the input information]. The learning model selected from the database is finely adjusted (retrained) using a user's data set. This enables the model to be adapted to a different user's data set.”

“a trained model generation step of generating a trained model based on the training data and the selected learning algorithm”:

Limasanches, paragraph 0039, “The model trainer 107 trains the selected existing learning model using the user's training data set [generating a trained model based on the training data and the selected learning algorithm]. The model database 108 stores the existing model, related information the existing model, the newly trained learning model, and related information on the newly trained learning model.
As described later, the related information includes a task description of the learning model and an essential characteristic amount vector of training data.”

Limasanches does not explicitly teach “an analysis result acquisition step of analyzing the measurement data based on the trained model and acquiring the analysis result.”

Kawaai teaches “an analysis result acquisition step of analyzing the measurement data based on the trained model and acquiring the analysis result”:

Kawaai, paragraph 0038, “Finally, in the device 20 (e.g., the plurality of devices 201, 202, . . . , 20n in FIG. 1), in a state where the shared model or the additional learned model is stored in the learner, inference processing is performed in the learner by using the device data and an inference result as output data is obtained (S25) [analyzing the measurement data based on the trained model and acquiring the analysis result].”

Kawaai and Limasanches are analogous arts as they are both related to automated model selection. It would have been obvious to a person having ordinary skill in the art prior to the effective filing date of the claimed invention to have combined the model inference of Kawaai with the teachings of Limasanches to arrive at the present invention, in order to apply the selected model to get analysis results, as stated in Kawaai, paragraph 0038, “Finally, in the device 20 (e.g., the plurality of devices 201, 202, . . . , 20n in FIG.
1), in a state where the shared model or the additional learned model is stored in the learner, inference processing is performed in the learner by using the device data and an inference result as output data is obtained (S25).”

Regarding claim 2:

Limasanches as modified by Kawaai teaches “The data analysis method according to claim 1.”

Limasanches further teaches “wherein in the learning algorithm selection step, the learning algorithm corresponding to information about the type of the measurement data and information about the type of analysis of the measurement data is selected from among the plurality of learning algorithm groups”:

Limasanches, paragraph 0020, “In an embodiment, a user inputs, to the system, a simple description of a task (new task) desired by the user to be executed and a training data set for the task. The system extracts an essential characteristic amount from the training data set and extracts related information on the task from the description of the task. The system uses a model, data used for training of the model, the corresponding essential characteristic amount [information about the type of the measurement data], and the description of the corresponding task [and information about the type of analysis of the measurement data] to find a related learning model in a database storing the foregoing information [the learning algorithm corresponding to information about the type of the measurement data and information about the type of analysis of the measurement data is selected from among the plurality of learning algorithm groups]. The learning model selected from the database is finely adjusted (retrained) using a user's data set.
This enables the model to be adapted to a different user's data set.”

Regarding claim 3:

Limasanches as modified by Kawaai teaches “The data analysis method according to claim 2.”

Limasanches further teaches “a trained model storage step of storing, in association with each other, the trained model generated in the trained model generation step, and the type of the measurement data and the type of analysis of the measurement data, both of which are received in the input information receiving step”:

Limasanches, paragraph 0039, “The model trainer 107 trains the selected existing learning model using the user's training data set. The model database 108 stores the existing model, related information the existing model, the newly trained learning model, and related information on the newly trained learning model. As described later, the related information includes a task description of the learning model and an essential characteristic amount vector of training data [storing, in association with each other, the trained model generated in the trained model generation step, and the type of the measurement data and the type of analysis of the measurement data].”

“an analysis receiving step of receiving the information about the type of the measurement data, the information about the type of analysis of the measurement data, and the measurement data acquired by the measuring device”:

Limasanches, paragraph 0020, “In an embodiment, a user inputs, to the system, a simple description of a task (new task) desired by the user [information about the type of analysis of the measurement data] to be executed and a training data set for the task [the measurement data].
The system extracts an essential characteristic amount from the training data set [information about the type of the measurement data] and extracts related information on the task from the description of the task.”

“a trained model selection step of selecting the trained model to be used to analyze the measurement data based on the information about the type of the measurement data and the information about the type of analysis of the measurement data received in the analysis receiving step”:

Limasanches, paragraph 0020, “In an embodiment, a user inputs, to the system, a simple description of a task (new task) desired by the user to be executed and a training data set for the task. The system extracts an essential characteristic amount from the training data set and extracts related information on the task from the description of the task. The system uses a model, data used for training of the model, the corresponding essential characteristic amount, and the description of the corresponding task to find a related learning model in a database storing the foregoing information [selecting the trained model to be used to analyze the measurement data based on the information about the type of the measurement data and the information about the type of analysis]. The learning model selected from the database is finely adjusted (retrained) using a user's data set. This enables the model to be adapted to a different user's data set.”

Kawaai further teaches “wherein in the analysis result acquisition step, the measurement data is input to the trained model selected in the trained model selection step, and the analysis result is acquired”:

Kawaai, paragraph 0038, “Finally, in the device 20 (e.g., the plurality of devices 201, 202, . . . , 20n in FIG.
1), in a state where the shared model or the additional learned model is stored in the learner, inference processing is performed in the learner by using the device data and an inference result as output data is obtained (S25) [the measurement data is input to the trained model selected in the trained model selection step, and the analysis result is acquired].”

Kawaai and Limasanches are combinable for the rationale given under claim 1.

Regarding claim 4:

Limasanches as modified by Kawaai teaches “The data analysis method according to claim 3.”

Limasanches further teaches “the data processing program being associated with the type of the measurement data and the type of analysis of the measurement data”:

Limasanches, paragraph 0039, “The model trainer 107 trains the selected existing learning model using the user's training data set. The model database 108 stores the existing model, related information the existing model, the newly trained learning model, and related information on the newly trained learning model [the data processing program being associated with the type of the measurement data and the type of analysis of the measurement data]. As described later, the related information includes a task description of the learning model [type of analysis of the measurement data] and an essential characteristic amount vector of training data [type of the measurement data].”

Kawaai further teaches “a data processing program storage step of storing in advance a plurality of data processing program groups each including a data processing program configured to perform a data process different from a data process of the trained model”:

Kawaai, paragraphs 0019-0020, “In addition, in the learned model providing system according to some embodiments of the present disclosure, the device has a function of performing additional learning processing on a shared model.
The server device includes an additional learned model management unit configured to receive an additional learned model transmitted from the device to cause a storage unit to store the additional learned model. A target shared model selection unit of the server device is configured to perform selection by including as option, in addition to a shared model, also an additional learned model. In addition, in the learned model providing system according to some embodiments of the present disclosure, the device has a function of performing additional learning processing on a shared model, and includes a storage unit caused to store an additional learned model, and an additional learned model information transmitter configured to transmit information necessary for selecting an additional learned model to the server device. A target shared model selection unit of the server device is configured to perform selection by including as option, in addition to the shared model, also an additional learned model stored in a storage unit of the device [storing in advance a plurality of data processing program groups each including a data processing program configured to perform a data process different from a data process of the trained model].” Kawaai and Limasanches are analogous arts as they are both related to automated model selection. 
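The Limasanches passage quoted above describes a model database in which each stored model carries a characteristic amount vector of its training data plus a task description, and a new task is matched against that database before the selected model is fine-tuned on the user's data. A minimal sketch of that retrieval scheme, with all identifiers invented for illustration (this is not code from either reference):

```python
import math

# Hypothetical model database in the style the Office Action attributes to
# Limasanches (para. 0020): each entry stores a trained model, the task it
# was trained for, and a characteristic vector of its training data.
MODEL_DB = [
    {"model": "peak-classifier-v1", "task": "peak detection",
     "features": [0.9, 0.1, 0.0]},
    {"model": "spectrum-regressor-v2", "task": "concentration estimation",
     "features": [0.1, 0.8, 0.3]},
]

def cosine(a, b):
    """Cosine similarity between two characteristic vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def select_trained_model(task_description, data_features):
    """Pick the stored model whose task description and training-data
    characteristics best match the new task (the 'trained model selection
    step' as mapped in the rejection)."""
    candidates = [m for m in MODEL_DB if m["task"] == task_description]
    pool = candidates or MODEL_DB  # fall back to similarity alone
    return max(pool, key=lambda m: cosine(m["features"], data_features))

best = select_trained_model("peak detection", [0.85, 0.2, 0.05])
# Per the quoted passage, the selected model would then be finely
# adjusted (retrained) on the user's data set before inference.
```

The point of the sketch is only the two-key lookup the rejection relies on: selection is driven jointly by the type of the measurement data (the characteristic vector) and the type of analysis (the task description).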
It would have been obvious to a person having ordinary skill in the art prior to the effective filing date of the claimed invention to have combined the additional processing step of Kawaai with the teachings of Limasanches to arrive at the present invention, in order to provide further model specialization, as stated in Kawaai, paragraph 0021, “In addition, providing an additional learning processing function allows an additional learned model more specialized in the environment and conditions of the device to be obtained, so that it is possible to additionally perform highly accurate inference processing in the device.”

Regarding claim 5: Limasanches as modified by Kawaai teaches “The data analysis method according to claim 4.”

Limasanches further teaches (bold only) “a data processing program addition receiving step of receiving addition of another data processing program associated with the type of the measurement data and the type of analysis of the measurement data to the plurality of data processing program groups stored in advance”: Limasanches, paragraph 0039, “The model trainer 107 trains the selected existing learning model using the user's training data set. The model database 108 stores the existing model, related information on the existing model, the newly trained learning model, and related information on the newly trained learning model [the data processing program being associated with the type of the measurement data and the type of analysis of the measurement data].
As described later, the related information includes a task description of the learning model [type of analysis of the measurement data] and an essential characteristic amount vector of training data [type of the measurement data].”

Kawaai further teaches (bold only) “a data processing program addition receiving step of receiving addition of another data processing program associated with the type of the measurement data and the type of analysis of the measurement data to the plurality of data processing program groups stored in advance”: Kawaai, paragraph 0035, “A shared model is selected or a learning model is newly generated, and then additional learning is performed by a learner on the shared model or the new learning model (S16). The additional learning is performed by using sample data for performing additional learning, collected from the device 20. After the additional learning is completed, the generated additional learned model is stored in the storage unit 15 (S17) [receiving addition of another data processing program … to the plurality of data processing program groups stored in advance]. The server device 10 may transmit the generated additional learned model to the device 20.” Kawaai and Limasanches are analogous arts as they are both related to automated model selection.
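The Kawaai steps the rejection cites (S16-S17) amount to a "store in advance, then receive additions" pattern: groups keyed by data type and analysis type are pre-populated, a shared model is retrained on device-collected samples, and the resulting additional learned model is stored back so later selection can offer it as an option. A minimal sketch under those assumptions, with all names invented:

```python
from collections import defaultdict

# Groups stored in advance: (measurement-data type, analysis type) -> entries.
groups = defaultdict(list)
groups[("chromatogram", "peak detection")].append("shared-model-v1")

def receive_addition(data_type, analysis_type, entry):
    """Add another model/program to the pre-stored groups, mirroring the
    'addition receiving step' as mapped in the rejection of claims 5-6."""
    groups[(data_type, analysis_type)].append(entry)

def additional_learning(shared_model, samples):
    """Stand-in for Kawaai's additional learning step (S16): retrain the
    shared model on sample data collected from the device."""
    return f"{shared_model}+tuned-on-{len(samples)}-samples"

# S16-S17: retrain the shared model, then store the additional learned
# model so it becomes a selectable option alongside the shared model.
tuned = additional_learning("shared-model-v1", samples=[1, 2, 3])
receive_addition("chromatogram", "peak detection", tuned)

options = groups[("chromatogram", "peak detection")]
# options now holds both the shared model and the additional learned model
```

This is only a schematic of the claim mapping: the substantive dispute is whether storing a retrained model back into the same keyed store teaches "receiving addition of another data processing program," not the mechanics of the store itself.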
It would have been obvious to a person having ordinary skill in the art prior to the effective filing date of the claimed invention to have combined the storage of the additional processing step of Kawaai with the teachings of Limasanches to arrive at the present invention, in order to reuse the additional learned model for further model specialization, as stated in Kawaai, paragraph 0021, “In addition, providing an additional learning processing function allows an additional learned model more specialized in the environment and conditions of the device to be obtained, so that it is possible to additionally perform highly accurate inference processing in the device.”

Regarding claim 6: Limasanches as modified by Kawaai teaches “The data analysis method according to claim 1.”

Limasanches further teaches “a learning algorithm addition receiving step of receiving addition of another learning algorithm associated with the type of the measurement data and the type of analysis of the measurement data to the plurality of learning algorithm groups stored in advance”: Limasanches, paragraph 0051, “When the ratio of the harmful sample is smaller than the threshold (YES in step S106), the model trainer 107 trains the selected learning model using the user data set (S109). Input to the learning model for the training is the essential characteristic amount vector extracted from the user data set. After that, the trained learning model, the essential characteristic amount vector of the training data, and the task description are stored in the model database 108 and can be used for the future (S110) [receiving addition of another learning algorithm associated with the type of the measurement data and the type of analysis of the measurement data to the plurality of learning algorithm groups stored in advance].”

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Crossley et al., US Pre-Grant Publication No.
2018/0063265, discloses a method for selecting from among a plurality of models based on data types for the data to be processed. Krasner et al., US Pre-Grant Publication No. 2022/0179829, discloses a machine-learned method of selecting among a plurality of compression algorithms based on predicted efficiency.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to VINCENT SPRAUL, whose telephone number is (703) 756-1511. The examiner can normally be reached M-F, 9:00 am - 5:00 pm.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, MICHAEL HUNTLEY, can be reached at (303) 297-4307. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/VAS/ Examiner, Art Unit 2129
/MICHAEL J HUNTLEY/ Supervisory Patent Examiner, Art Unit 2129

Prosecution Timeline

Jun 08, 2023: Application Filed
Feb 24, 2026: Non-Final Rejection under §101 and §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591634: COMPOSITE EMBEDDING SYSTEMS AND METHODS FOR MULTI-LEVEL GRANULARITY SIMILARITY RELEVANCE SCORING (granted Mar 31, 2026; 2y 5m to grant)
Patent 12591796: INTELLIGENT DISTANCE PROMPTING (granted Mar 31, 2026; 2y 5m to grant)
Patent 12572620: RELIABLE INFERENCE OF A MACHINE LEARNING MODEL (granted Mar 10, 2026; 2y 5m to grant)
Patent 12566974: METHOD, SYSTEM, AND COMPUTER PROGRAM PRODUCT FOR KNOWLEDGE GRAPH BASED EMBEDDING, EXPLAINABILITY, AND/OR MULTI-TASK LEARNING (granted Mar 03, 2026; 2y 5m to grant)
Patent 12547616: SEMANTIC REASONING FOR TABULAR QUESTION ANSWERING (granted Feb 10, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 59%
With Interview: 94% (+34.7%)
Median Time to Grant: 4y 6m
PTA Risk: Low
Based on 34 resolved cases by this examiner. Grant probability derived from career allow rate.
