Prosecution Insights
Last updated: April 19, 2026
Application No. 19/038,743

COMPUTER-READABLE RECORDING MEDIUM STORING INFORMATION OUTPUT PROGRAM, INFORMATION OUTPUT METHOD, AND INFORMATION PROCESSING DEVICE

Non-Final OA: §101, §102
Filed: Jan 28, 2025
Examiner: NEWTON, CHAD A
Art Unit: 3681
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Fujitsu Limited
OA Round: 1 (Non-Final)
Grant Probability: 38% (At Risk)
Estimated OA Rounds: 1-2
Estimated Time to Grant: 4y 0m
Grant Probability With Interview: 64%

Examiner Intelligence

Career Allow Rate: 38% (82 granted / 218 resolved; -14.4% vs TC avg)
Interview Lift: +26.0% for resolved cases with interview
Avg Prosecution: 4y 0m (55 applications currently pending)
Total Applications: 273 (across all art units)
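The headline figures above can be cross-checked from the raw counts. A quick sketch (the 64% with-interview rate is taken directly from the summary above):

```python
# Cross-check the examiner stats reported above: 82 allowances out of
# 218 resolved cases, and a 64% allowance rate with an interview.
granted, resolved = 82, 218

career_allow_rate = granted / resolved   # 0.3761... -> reported as 38%
with_interview_rate = 0.64               # reported figure
interview_lift = with_interview_rate - round(career_allow_rate, 2)

print(f"Career allow rate: {career_allow_rate:.1%}")  # 37.6%
print(f"Interview lift: {interview_lift:+.1%}")       # +26.0%
```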

Statute-Specific Performance

§101: 35.3% (-4.7% vs TC avg)
§103: 38.7% (-1.3% vs TC avg)
§102: 12.7% (-27.3% vs TC avg)
§112: 10.5% (-29.5% vs TC avg)

Tech Center averages are estimates. Based on career data from 218 resolved cases.

Office Action

Rejections: §101, §102
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

This office action for the 19/038,743 application is in response to the communications filed January 28, 2025. Claims 1-20 were initially submitted January 28, 2025. Claims 1-20 are currently pending and considered below.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.

As per claim 1:

Step 1: The claim recites subject matter within a statutory category as a manufacture.

Step 2A is a two-prong inquiry, in which Prong 1 determines whether a claim recites a judicial exception, and Prong 2 determines whether the additional limitations of the claim integrate the recited judicial exception into a practical application. If the additional elements of the claim fail to integrate the judicial exception into a practical application, the claim is directed to the recited judicial exception; see MPEP 2106.04(II)(A).

Step 2A Prong 1: The claim contains subject matter that recites an abstract idea, with the steps of a process comprising: identifying a flow of persons in a route that includes a plurality of options by using a behavior selection model that indicates which behavior a person selects for a policy; generating information that indicates a prediction result of the policy based on an identified flow of persons; and outputting generated information of a prediction result of policy.
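For concreteness, the process recited in claim 1 can be sketched in code. This is an editorial illustration only, not language from the application; the person attributes, option names, and selection rule below are all hypothetical:

```python
# Hypothetical behavior selection model: for a given policy, indicates
# which behavior (option) a person selects.
def behavior_selection_model(policy: dict, person: dict) -> str:
    return "opt_in" if person["age"] < policy["age_cutoff"] else "opt_out"

# Identify the flow of persons across the route's options.
def identify_flow(policy: dict, persons: list) -> dict:
    flow = {"opt_in": 0, "opt_out": 0}
    for person in persons:
        flow[behavior_selection_model(policy, person)] += 1
    return flow

# Generate and output information indicating the policy's prediction result.
def predict_policy(policy: dict, persons: list) -> str:
    flow = identify_flow(policy, persons)
    return f"policy '{policy['name']}': {flow['opt_in']} opt in, {flow['opt_out']} opt out"

persons = [{"age": a} for a in (25, 34, 61)]
print(predict_policy({"name": "youth-fare", "age_cutoff": 40}, persons))
# → policy 'youth-fare': 2 opt in, 1 opt out
```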
These steps, as drafted, under the broadest reasonable interpretation recite certain methods of organizing human activity (e.g., fundamental economic principles or practices, including hedging, insurance, and mitigating risk; commercial or legal interactions, including agreements in the form of contracts, legal obligations, advertising, marketing or sales activities or behaviors, and business relations; and managing personal behavior or relationships or interactions between people, including social activities, teaching, and following rules or instructions) but for the recitation of generic computer components. That is, other than reciting steps as performed by the generic computer components, nothing in the claim elements precludes the steps from being directed to certain methods of organizing human activity.

The abstract idea identified above, in the context of this claim, encompasses a certain method of organizing human activity, namely managing personal behavior or relationships or interactions between people. This is because each of the limitations of the abstract idea recites a list of rules or instructions that a human person can follow in the course of their personal behavior. If a claim limitation, under its broadest reasonable interpretation, covers at least the recited methods of organizing human activity above, but for the recitation of generic computer components, then it falls within the “Certain Methods of Organizing Human Activity” grouping of abstract ideas. Accordingly, the claim recites an abstract idea. See MPEP 2106.04(a).

Step 2A Prong 2: The claim does not recite additional elements that integrate the judicial exception into a practical application.
In particular, the additional elements do not integrate the abstract idea into a practical application because they amount to no more than limitations which:

amount to mere instructions to apply an exception, see MPEP 2106.05(f), such as: “A non-transitory computer-readable recording medium storing an information output program for causing a computer to execute,” which corresponds to merely using a computer as a tool to perform an abstract idea. Paragraphs [0137]-[0141] of the as-filed specification describe that the hardware implementing the steps of the abstract idea amounts to nothing more than generic computer components. Implementing an abstract idea on a generic computer does not integrate the abstract idea into a practical application in Step 2A Prong Two or add significantly more in Step 2B, similar to how the recitation of the computer in the claim in Alice amounted to mere instructions to apply the abstract idea of intermediated settlement on a generic computer.

add insignificant extra-solution activity to the abstract idea, see MPEP 2106.05(g), such as: “to a display screen,” which corresponds to mere data gathering and/or output.

Accordingly, this claim is directed to an abstract idea.

Step 2B: The claim does not recite additional elements that amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements amount to no more than mere instructions to apply an exception, add insignificant extra-solution activity to the abstract idea, and/or generally link the abstract idea to a particular technological environment or field of use.
Additionally, the limitations identified above as insignificant extra-solution activity amount to no more than elements that have been recognized as well-understood, routine, and conventional activity in particular fields, such as computer functions that have been identified by the courts as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity, see MPEP 2106.05(d)(II), such as: “to a display screen,” which corresponds to receiving or transmitting data over a network.

Looking at the limitations of the claim as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely recite an abstract idea and/or provide conventional computer implementation which does not impose a meaningful limit to integrate the abstract idea into a practical application and/or amount to no more than elements that have been recognized as well-understood, routine, and conventional activity in particular fields.

As per claim 2: Claim 2 depends from claim 1 and inherits all the limitations of the claim from which it depends.
Claim 2 merely further defines the abstract idea and/or introduces additional elements that are insufficient to provide a practical application or something significantly more:

“acquiring data related to the person, the identifying includes a process of, when acquired data related to a person is input to a policy model, identifying a route to a terminal node through nodes and edges by using a behavior selection model that indicates which behavior the person selects for the policy, and the generating includes a process of generating information that indicates a prediction result of policy related to a service to be provided to the person based on an identified route.” further describes the abstract idea. This claim limitation is still directed to “Certain Methods of Organizing Human Activity” and therefore continues to recite an abstract idea.

“wherein the computer is caused to further execute a process of” further defines an additional element that was insufficient to provide a practical application and/or significantly more. The claim with this further defining limitation still corresponds to merely using a computer as a tool to perform an abstract idea.

Looking at the limitations of the claim as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely recite an abstract idea and/or provide conventional computer implementation which does not impose a meaningful limit to integrate the abstract idea into a practical application and/or amount to no more than elements that have been recognized as well-understood, routine, and conventional activity in particular fields.

As per claim 3: Claim 3 depends from claim 1 and inherits all the limitations of the claim from which it depends.
Claim 3 merely further defines the abstract idea and/or introduces additional elements that are insufficient to provide a practical application or something significantly more:

“wherein the identifying includes a process of predicting a number of persons who use a service in each of a plurality of routes to terminal nodes branched with an intermediate node as a starting point, based on a behavior selection model that indicates which behavior a person selects for the policy, and the generating includes a process of generating information that indicates a prediction result of policy in which a predicted number of persons and a service to be provided to the person are associated with each other.” further describes the abstract idea. This claim limitation is still directed to “Certain Methods of Organizing Human Activity” and therefore continues to recite an abstract idea.

Looking at the limitations of the claim as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely recite an abstract idea and/or provide conventional computer implementation which does not impose a meaningful limit to integrate the abstract idea into a practical application and/or amount to no more than elements that have been recognized as well-understood, routine, and conventional activity in particular fields.

As per claim 4: Claim 4 depends from claim 1 and inherits all the limitations of the claim from which it depends.
Claim 4 merely further defines the abstract idea and/or introduces additional elements that are insufficient to provide a practical application or something significantly more:

“acquiring a policy model which has a flow structure constituted by nodes and edges, and in which a behavior selection model that indicates which behavior a person selects for the policy is associated with an intermediate node of the flow structure, and a service to be provided to the person is associated with an intermediate node or a terminal node of the flow structure, the identifying includes a process of identifying a route to a terminal node by tracing nodes and edges of the flow structure based on the behavior selection model when data related to a person to be analyzed is input to an acquired policy model, the generating includes a process of generating information that indicates a prediction result of policy in which a number of persons flowing in an identified route and a service to be provided to the person are associated with each other, and the outputting includes a process of outputting generated information that indicates a prediction result of policy” further describes the abstract idea. This claim limitation is still directed to “Certain Methods of Organizing Human Activity” and therefore continues to recite an abstract idea.

“wherein the computer is caused to further execute a process of” further defines an additional element that was insufficient to provide a practical application and/or significantly more. The claim with this further defining limitation still corresponds to merely using a computer as a tool to perform an abstract idea.

“to the display screen.” further defines an additional element that was insufficient to provide a practical application and/or significantly more. The claim with this further defining limitation still corresponds to mere data gathering and/or output and receiving or transmitting data over a network.
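The policy model recited in claim 4 — a flow structure of nodes and edges with a behavior selection model attached to each intermediate node — can be sketched as a small graph traversal. This is an editorial illustration; all node names and selection rules below are hypothetical:

```python
# Hypothetical policy model: intermediate nodes carry a behavior
# selection model ("select"); terminal nodes have no outgoing edges.
POLICY_MODEL = {
    "start": {"edges": {"interested": "offer", "not_interested": "exit"},
              "select": lambda p: "interested" if p["age"] < 40 else "not_interested"},
    "offer": {"edges": {"accept": "service_a", "decline": "exit"},
              "select": lambda p: "accept" if p["budget"] > 10 else "decline"},
    "exit":      {"edges": {}},   # terminal node
    "service_a": {"edges": {}},   # terminal node; service provided here
}

def trace_route(model: dict, person: dict, node: str = "start") -> list:
    """Trace nodes and edges to a terminal node using the selection models."""
    route = [node]
    while model[node]["edges"]:
        choice = model[node]["select"](person)
        node = model[node]["edges"][choice]
        route.append(node)
    return route

print(trace_route(POLICY_MODEL, {"age": 30, "budget": 20}))
# → ['start', 'offer', 'service_a']
```

Aggregating the routes traced for many persons would then yield the per-route person counts that the claim associates with each service.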
Looking at the limitations of the claim as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely recite an abstract idea and/or provide conventional computer implementation which does not impose a meaningful limit to integrate the abstract idea into a practical application and/or amount to no more than elements that have been recognized as well-understood, routine, and conventional activity in particular fields.

As per claim 5: Claim 5 depends from claim 1 and inherits all the limitations of the claim from which it depends.

Claim 5 merely further defines the abstract idea and/or introduces additional elements that are insufficient to provide a practical application or something significantly more:

“wherein the identifying identifies a number of persons flowing through a node in a flow graph based on the behavior selection model when acquired data related to a person is input to a policy model, the generating generates an image in which edges are made thick based on an identified number of persons,” further describes the abstract idea. This claim limitation is still directed to “Certain Methods of Organizing Human Activity” and therefore continues to recite an abstract idea.

“the outputting displays a generated image on the display screen.” further defines an additional element that was insufficient to provide a practical application and/or significantly more. The claim with this further defining limitation still corresponds to mere data gathering and/or output and receiving or transmitting data over a network.

Looking at the limitations of the claim as an ordered combination adds nothing that is not already present when looking at the elements taken individually.
There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely recite an abstract idea and/or provide conventional computer implementation which does not impose a meaningful limit to integrate the abstract idea into a practical application and/or amount to no more than elements that have been recognized as well-understood, routine, and conventional activity in particular fields.

As per claim 6: Claim 6 depends from claim 1 and inherits all the limitations of the claim from which it depends.

Claim 6 merely further defines the abstract idea and/or introduces additional elements that are insufficient to provide a practical application or something significantly more:

“wherein the generating includes a process of generating an image that indicates a first prediction result in which a number of persons flowing through a node in a flow graph is predicted, by inputting data related to a person to a first policy model before the policy is executed, and a process of generating an image that indicates a second prediction result in which a number of persons flowing through a node in a flow graph after the policy is executed is predicted, by inputting to a second policy model after the policy is executed, and the outputting includes …a first prediction result and the image that indicates a second prediction result.” further describes the abstract idea. This claim limitation is still directed to “Certain Methods of Organizing Human Activity” and therefore continues to recite an abstract idea.

“a process of displaying, on the display screen, the image that indicates” further defines an additional element that was insufficient to provide a practical application and/or significantly more. The claim with this further defining limitation still corresponds to mere data gathering and/or output and receiving or transmitting data over a network.
Looking at the limitations of the claim as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely recite an abstract idea and/or provide conventional computer implementation which does not impose a meaningful limit to integrate the abstract idea into a practical application and/or amount to no more than elements that have been recognized as well-understood, routine, and conventional activity in particular fields.

As per claim 7: Claim 7 depends from claim 6 and inherits all the limitations of the claim from which it depends.

Claim 7 merely further defines the abstract idea and/or introduces additional elements that are insufficient to provide a practical application or something significantly more:

“identifying the behavior selection model that corresponds to a policy under consideration,” further describes the abstract idea. This claim limitation is still directed to “Certain Methods of Organizing Human Activity” and therefore continues to recite an abstract idea.

“wherein the computer is caused to further execute a process of” further defines an additional element that was insufficient to provide a practical application and/or significantly more. The claim with this further defining limitation still corresponds to merely using a computer as a tool to perform an abstract idea.
“storing an identified behavior selection model in association with an intermediate node of a second policy model after the policy is executed.” introduces additional elements that are insufficient to provide a practical application or significantly more:

Step 2A Prong 2: In particular, the additional elements do not integrate the abstract idea into a practical application because they amount to no more than limitations which add insignificant extra-solution activity to the abstract idea, see MPEP 2106.05(g), such as: “storing an identified behavior selection model in association with an intermediate node of a second policy model after the policy is executed.” which corresponds to mere data gathering and/or output.

Step 2B: As discussed above with respect to integration of the abstract idea into a practical application, the additional elements amount to no more than mere instructions to apply an exception, add insignificant extra-solution activity to the abstract idea, and/or generally link the abstract idea to a particular technological environment or field of use. Additionally, the limitations identified as insignificant extra-solution activity amount to no more than elements that have been recognized as well-understood, routine, and conventional activity in particular fields, such as computer functions that have been identified by the courts as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity, see MPEP 2106.05(d)(II), such as: “storing an identified behavior selection model in association with an intermediate node of a second policy model after the policy is executed.” which corresponds to storing and retrieving information in memory.
Looking at the limitations of the claim as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely recite an abstract idea and/or provide conventional computer implementation which does not impose a meaningful limit to integrate the abstract idea into a practical application and/or amount to no more than elements that have been recognized as well-understood, routine, and conventional activity in particular fields.

As per claim 8: Claim 8 depends from claim 1 and inherits all the limitations of the claim from which it depends.

Claim 8 merely further defines the abstract idea and/or introduces additional elements that are insufficient to provide a practical application or something significantly more:

“generating a …twin in which a real space is reproduced in a … space, executing, in the … twin that has been generated, a simulation of a flow of persons in a route that includes the plurality of options by using the behavior selection model, and generating information that indicates a prediction result of the policy based on a result of an executed simulation.” further describes the abstract idea. This claim limitation is still directed to “Certain Methods of Organizing Human Activity” and therefore continues to recite an abstract idea.

“wherein the computer is caused to further execute a process of”, “digital”, and “virtual” further define additional elements that were insufficient to provide a practical application and/or significantly more. The claim with these further defining limitations still corresponds to merely using a computer as a tool to perform an abstract idea.
Looking at the limitations of the claim as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely recite an abstract idea and/or provide conventional computer implementation which does not impose a meaningful limit to integrate the abstract idea into a practical application and/or amount to no more than elements that have been recognized as well-understood, routine, and conventional activity in particular fields.

As per claim 9: Claim 9 depends from claim 8 and inherits all the limitations of the claim from which it depends.

Claim 9 merely further defines the abstract idea and/or introduces additional elements that are insufficient to provide a practical application or something significantly more:

“the generating of a … twin generates, in the … space, a … twin that is time-synchronized with the real space, the executing of a simulation executes, by using attribute information of each of a plurality of agents that corresponds to each of a plurality of persons and the behavior selection model associated with a conditional branch of a policy model, in the … twin, a simulation of whether each of the plurality of persons takes a behavior for the policy, so as to identify a route to which each of the plurality of persons is allocated in an option of the conditional branch, and the generating generates information that indicates a prediction result of the policy that includes an identified route to which each of the plurality of identified persons is allocated.” further describes the abstract idea. This claim limitation is still directed to “Certain Methods of Organizing Human Activity” and therefore continues to recite an abstract idea.
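The agent-based simulation recited in claims 8 and 9 — one agent per person, with a behavior selection model at a conditional branch allocating each agent to a route — can be sketched as follows. This is an editorial illustration; the attribute, probabilities, and route names are all hypothetical:

```python
import random

# Hypothetical twin simulation: each agent carries attribute information,
# and the behavior selection model at a conditional branch allocates the
# agent to one of the branch's routes.
def simulate_twin(agents: list, base_accept_prob: float, rng: random.Random) -> dict:
    allocation = {"accept": 0, "decline": 0}
    for agent in agents:
        # Behavior selection model: engagement shifts the acceptance odds.
        p = min(1.0, base_accept_prob + 0.2 * agent["engagement"])
        route = "accept" if rng.random() < p else "decline"
        allocation[route] += 1
    return allocation

rng = random.Random(7)
agents = [{"engagement": rng.random()} for _ in range(1000)]
print(simulate_twin(agents, base_accept_prob=0.5, rng=rng))
```

The resulting allocation counts would then feed the generated prediction-result information that the claim recites.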
“digital” and “virtual” further define additional elements that were insufficient to provide a practical application and/or significantly more. The claim with these further defining limitations still corresponds to merely using a computer as a tool to perform an abstract idea.

Looking at the limitations of the claim as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely recite an abstract idea and/or provide conventional computer implementation which does not impose a meaningful limit to integrate the abstract idea into a practical application and/or amount to no more than elements that have been recognized as well-understood, routine, and conventional activity in particular fields.

As per claim 10: Claim 10 is substantially similar to claim 1. Accordingly, claim 10 is rejected for the same reasons as claim 1.

As per claim 11: Claim 11 is substantially similar to claim 2. Accordingly, claim 11 is rejected for the same reasons as claim 2.

As per claim 12: Claim 12 is substantially similar to claim 3. Accordingly, claim 12 is rejected for the same reasons as claim 3.

As per claim 13: Claim 13 is substantially similar to claim 4. Accordingly, claim 13 is rejected for the same reasons as claim 4.

As per claim 14: Claim 14 is substantially similar to claim 5. Accordingly, claim 14 is rejected for the same reasons as claim 5.

As per claim 15: Claim 15 is substantially similar to claim 6. Accordingly, claim 15 is rejected for the same reasons as claim 6.

As per claim 16: Claim 16 is substantially similar to claim 7. Accordingly, claim 16 is rejected for the same reasons as claim 7.

As per claim 17: Claim 17 is substantially similar to claim 1. Accordingly, claim 17 is rejected for the same reasons as claim 1.
Claim 17 further recites limitations which:

“An information processing device comprising: a memory; and a processor coupled to the memory and configured to perform a process of:” further defines an additional element that was insufficient to provide a practical application and/or significantly more. The claim with this further defining limitation still corresponds to merely using a computer as a tool to perform an abstract idea.

Looking at the limitations of the claim as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely recite an abstract idea and/or provide conventional computer implementation which does not impose a meaningful limit to integrate the abstract idea into a practical application and/or amount to no more than elements that have been recognized as well-understood, routine, and conventional activity in particular fields.

As per claim 18: Claim 18 is substantially similar to claim 2. Accordingly, claim 18 is rejected for the same reasons as claim 2.

As per claim 19: Claim 19 is substantially similar to claim 3. Accordingly, claim 19 is rejected for the same reasons as claim 3.

As per claim 20: Claim 20 is substantially similar to claim 4. Accordingly, claim 20 is rejected for the same reasons as claim 4.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-4, 8-13 and 17-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Vaughan (US 2021/0035668).

As per claim 1: Vaughan discloses a non-transitory computer-readable recording medium storing an information output program for causing a computer to execute a process: (Paragraph [0164] of Vaughan. The teaching describes that the systems and methods provided herein, such as the computer system 401, can be embodied in programming. Various aspects of the technology may be thought of as “products” or “articles of manufacture” typically in the form of machine (or processor) executable code and/or associated data that is carried on or embodied in a type of machine readable medium. Machine-executable code can be stored on an electronic storage unit, such as memory (e.g., read-only memory, random-access memory, flash memory) or a hard disk. “Storage” type media can include any or all of the tangible memory of the computers, processors or the like, or associated modules thereof, such as various semiconductor memories, tape drives, disk drives and the like, which may provide non-transitory storage at any time for the software programming.)

Vaughan further discloses identifying a flow of persons in a route that includes a plurality of options by using a behavior selection model that indicates which behavior a person selects for a policy: (Paragraphs [0082], [0095]-[0105] and [0175] and Figure 3 of Vaughan.
The teaching describes that the therapeutic module may rely on the diagnostic module in order to classify subjects as having different conditions or different severity levels of a condition. Specifically, the therapeutic module 134 can then indicate to the diagnosis module 132 that the latest round of therapy is finished, and that a new diagnosis is needed. The diagnostic module 132 can then provide new diagnostic tests and questions to the digital device 110, as well as take input from the therapeutic module of any data provided as part of therapy, such as recordings of learning sessions or browsing history of caregivers or subjects related to the therapy or diagnosed condition. The diagnostic module 132 then provides an updated diagnosis to repeat the process and provide a next step of therapy. This means that the flows pertaining to a particular subject refer to a plurality of subjects undergoing the same process individually, forming an analysis of the population of subjects.

FIG. 3 illustrates a flow diagram 300 showing the handling of suspected or confirmed speech and language delay. This model relies upon user selection, which describes the behavior of the subject, to recommend next steps in the subject’s treatment. For example, for a subject that indicates non-verbal behavior, a person selects this option in the system, for which steps 312 and 314 are processed and end nodes 318 and 316 result in the policy that is selected for the subject. Policies will result in recommending a therapist visit in 318 or 328, or reporting progress with further therapies being recommended.)

Vaughan further discloses generating information that indicates a prediction result of the policy based on an identified flow of persons and outputting generated information of a prediction result of policy to a display screen: (Paragraphs [0112], [0131] and [0132] and Figure 8 of Vaughan.
The teaching describes that digital therapeutics treatment methods and apparatus described with reference to FIGS. 1-4 are particularly well suited for combination with the methods and apparatus to evaluate subjects with fewer questions described herein with reference to FIGS. 5A to 14. This means that the process of Figure 3 is followed up by a process such as the one described in Figure 8. Figure 8 is an exemplary operational flow 800 of a method of a prediction module. This prediction module uses new and saved information to determine a classification of a subject with regard to a plurality of predicted conditions of the subject. This means that when a policy is selected in the process of Figure 3, the process of Figure 8 predicts the result of that policy in the form of predicting a condition of the subject. The results of this prediction are output to the user of the device in step 825.)

As per claim 2, Vaughan discloses the limitations of claim 1. Vaughan further discloses wherein the computer is caused to further execute a process of acquiring data related to the person and the identifying includes a process of, when acquired data related to a person is input to a policy model, identifying a route to a terminal node through nodes and edges by using a behavior selection model that indicates which behavior the person selects for the policy: (Paragraphs [0082], [0095]-[0105] and [0175] and Figure 3 of Vaughan. The teaching describes that the therapeutic module may rely on the diagnostic module in order to classify subjects as having different conditions or different severity levels of a condition. Specifically, the therapeutic module 134 can then indicate to the diagnosis module 132 that the latest round of therapy is finished, and that a new diagnosis is needed.
The diagnostic module 132 can then provide new diagnostic tests and questions to the digital device 110, as well as take input from the therapeutic module of any data provided as part of therapy, such as recordings of learning sessions or browsing history of caregivers or subjects related to the therapy or diagnosed condition. The diagnostic module 132 then provides an updated diagnosis to repeat the process and provide a next step of therapy. This means that the flows pertaining to a particular subject refer to a plurality of subjects undergoing the same process individually, forming an analysis of the population of subjects. FIG. 3 illustrates a flow diagram 300 showing the handling of suspected or confirmed speech and language delay. This model relies upon user selection, which describes the behavior of the subject, to recommend next steps in the subject’s treatment. For example, for a subject that exhibits non-verbal behavior, a person selects this option in the system, for which steps 312 and 314 are processed and end nodes 318 and 316 result in the policy that is selected for the subject. Policies will result in recommending a therapist visit in 318 or 328, or in reporting progress with further therapies being recommended. At step 312 or 322, additional information about the subject is collected with the tests that are implemented.)

Vaughan further discloses the generating includes a process of generating information that indicates a prediction result of policy related to a service to be provided to the person based on an identified route: (Paragraphs [0112], [0131] and [0132] and Figure 8 of Vaughan. The teaching describes that digital therapeutics treatment methods and apparatus described with reference to FIGS. 1-4 are particularly well suited for combination with the methods and apparatus to evaluate subjects with fewer questions described herein with reference to FIGS. 5A to 14.
This means that the process of Figure 3 is followed up by a process such as the one described in Figure 8. Figure 8 is an exemplary operational flow 800 of a method of a prediction module. This prediction module uses new and saved information to determine a classification of a subject with regard to a plurality of predicted conditions of the subject. This means that when a policy is selected in the process of Figure 3, the process of Figure 8 predicts the result of that policy in the form of predicting a condition of the subject. The results of this prediction are output to the user of the device in step 825.)

As per claim 3, Vaughan discloses the limitations of claim 1. Vaughan further discloses wherein the identifying includes a process of predicting a number of persons who use a service in each of a plurality of routes to terminal nodes branched with an intermediate node as a starting point, based on a behavior selection model that indicates which behavior a person selects for the policy: (Paragraphs [0082], [0095]-[0105] and [0175] and Figure 3 of Vaughan. The teaching describes that the therapeutic module may rely on the diagnostic module in order to classify subjects as having different conditions or different severity levels of a condition. Specifically, the therapeutic module 134 can then indicate to the diagnosis module 132 that the latest round of therapy is finished, and that a new diagnosis is needed. The diagnostic module 132 can then provide new diagnostic tests and questions to the digital device 110, as well as take input from the therapeutic module of any data provided as part of therapy, such as recordings of learning sessions or browsing history of caregivers or subjects related to the therapy or diagnosed condition. The diagnostic module 132 then provides an updated diagnosis to repeat the process and provide a next step of therapy.
This means that the flows pertaining to a particular subject refer to a plurality of subjects undergoing the same process individually, forming an analysis of the population of subjects. FIG. 3 illustrates a flow diagram 300 showing the handling of suspected or confirmed speech and language delay. This model relies upon user selection, which describes the behavior of the subject, to recommend next steps in the subject’s treatment. For example, for a subject that exhibits non-verbal behavior, a person selects this option in the system, for which steps 312 and 314 are processed and end nodes 318 and 316 result in the policy that is selected for the subject. Policies will result in recommending a therapist visit in 318 or 328, or in reporting progress with further therapies being recommended. At intermediate step 312 or 322, additional information about the subject is collected with the tests that are implemented. As in the non-verbal track, progress in response to verbal therapies is continually monitored in step 324 to determine whether a diagnosis has improved at a predicted rate.)

Vaughan further discloses the generating includes a process of generating information that indicates a prediction result of policy in which a predicted number of persons and a service to be provided to the person are associated with each other: (Paragraphs [0112], [0131] and [0132] and Figure 8 of Vaughan. The teaching describes that digital therapeutics treatment methods and apparatus described with reference to FIGS. 1-4 are particularly well suited for combination with the methods and apparatus to evaluate subjects with fewer questions described herein with reference to FIGS. 5A to 14. This means that the process of Figure 3 is followed up by a process such as the one described in Figure 8. Figure 8 is an exemplary operational flow 800 of a method of a prediction module.
This prediction module uses new and saved information to determine a classification of a subject with regard to a plurality of predicted conditions of the subject. This means that when a policy is selected in the process of Figure 3, the process of Figure 8 predicts the result of that policy in the form of predicting a condition of the subject. The results of this prediction are output to the user of the device in step 825.)

As per claim 4, Vaughan discloses the limitations of claim 1. Vaughan further discloses wherein the computer is caused to further execute a process of acquiring a policy model which has a flow structure constituted by nodes and edges, and in which a behavior selection model that indicates which behavior a person selects for the policy is associated with an intermediate node of the flow structure, and a service to be provided to the person is associated with an intermediate node or a terminal node of the flow structure, and the identifying includes a process of identifying a route to a terminal node by tracing nodes and edges of the flow structure based on the behavior selection model when data related to a person to be analyzed is input to an acquired policy model: (Paragraphs [0082], [0095]-[0105] and [0175] and Figure 3 of Vaughan. The teaching describes that the therapeutic module may rely on the diagnostic module in order to classify subjects as having different conditions or different severity levels of a condition. Specifically, the therapeutic module 134 can then indicate to the diagnosis module 132 that the latest round of therapy is finished, and that a new diagnosis is needed. The diagnostic module 132 can then provide new diagnostic tests and questions to the digital device 110, as well as take input from the therapeutic module of any data provided as part of therapy, such as recordings of learning sessions or browsing history of caregivers or subjects related to the therapy or diagnosed condition.
The diagnostic module 132 then provides an updated diagnosis to repeat the process and provide a next step of therapy. This means that the flows pertaining to a particular subject refer to a plurality of subjects undergoing the same process individually, forming an analysis of the population of subjects. FIG. 3 illustrates a flow diagram 300 showing the handling of suspected or confirmed speech and language delay. This model relies upon user selection, which describes the behavior of the subject, to recommend next steps in the subject’s treatment. For example, for a subject that exhibits non-verbal behavior, a person selects this option in the system, for which steps 312 and 314 are processed and end nodes 318 and 316 result in the policy that is selected for the subject. Policies will result in recommending a therapist visit in 318 or 328, or in reporting progress with further therapies being recommended.)

Vaughan further discloses the generating includes a process of generating information that indicates a prediction result of policy in which a number of persons flowing in an identified route and a service to be provided to the person are associated with each other, and the outputting includes a process of outputting generated information that indicates a prediction result of policy to the display screen: (Paragraphs [0112], [0131] and [0132] and Figure 8 of Vaughan. The teaching describes that digital therapeutics treatment methods and apparatus described with reference to FIGS. 1-4 are particularly well suited for combination with the methods and apparatus to evaluate subjects with fewer questions described herein with reference to FIGS. 5A to 14. This means that the process of Figure 3 is followed up by a process such as the one described in Figure 8. Figure 8 is an exemplary operational flow 800 of a method of a prediction module.
This prediction module uses new and saved information to determine a classification of a subject with regard to a plurality of predicted conditions of the subject. This means that when a policy is selected in the process of Figure 3, the process of Figure 8 predicts the result of that policy in the form of predicting a condition of the subject. The results of this prediction are output to the user of the device in step 825.)

As per claim 8, Vaughan discloses the limitations of claim 1. Vaughan further discloses wherein the computer is caused to further execute a process of generating a digital twin in which a real space is reproduced in a virtual space, executing, in the digital twin that has been generated, a simulation of a flow of persons in a route that includes the plurality of options by using the behavior selection model, and generating information that indicates a prediction result of the policy based on a result of an executed simulation: (Paragraphs [0082], [0095]-[0105] and [0175] and Figure 3 of Vaughan. The teaching describes that the therapeutic module may rely on the diagnostic module in order to classify subjects as having different conditions or different severity levels of a condition. Specifically, the therapeutic module 134 can then indicate to the diagnosis module 132 that the latest round of therapy is finished, and that a new diagnosis is needed. The diagnostic module 132 can then provide new diagnostic tests and questions to the digital device 110, as well as take input from the therapeutic module of any data provided as part of therapy, such as recordings of learning sessions or browsing history of caregivers or subjects related to the therapy or diagnosed condition. The diagnostic module 132 then provides an updated diagnosis to repeat the process and provide a next step of therapy.
This means that the flows pertaining to a particular subject refer to a plurality of subjects undergoing the same process individually, forming an analysis of the population of subjects. FIG. 3 illustrates a flow diagram 300 showing the handling of suspected or confirmed speech and language delay. This model relies upon user selection, which describes the behavior of the subject, to recommend next steps in the subject’s treatment. For example, for a subject that exhibits non-verbal behavior, a person selects this option in the system, for which steps 312 and 314 are processed and end nodes 318 and 316 result in the policy that is selected for the subject. Policies will result in recommending a therapist visit in 318 or 328, or in reporting progress with further therapies being recommended.)

(Paragraphs [0112], [0131] and [0132] and Figure 8 of Vaughan. The teaching describes that digital therapeutics treatment methods and apparatus described with reference to FIGS. 1-4 are particularly well suited for combination with the methods and apparatus to evaluate subjects with fewer questions described herein with reference to FIGS. 5A to 14. This means that the process of Figure 3 is followed up by a process such as the one described in Figure 8. Figure 8 is an exemplary operational flow 800 of a method of a prediction module. This prediction module uses new and saved information to determine a classification of a subject with regard to a plurality of predicted conditions of the subject. This means that when a policy is selected in the process of Figure 3, the process of Figure 8 predicts the result of that policy in the form of predicting a condition of the subject. The results of this prediction are output to the user of the device in step 825.)

Here it is understood that the real space of subjects being evaluated is being virtualized in real-time by the system that the user is implementing for the evaluation itself.
Each step being run by the system simulates the results of the real space of subjects in this virtual environment.

As per claim 9, Vaughan discloses the limitations of claim 8. Vaughan further discloses wherein the generating of a digital twin generates, in the virtual space, a digital twin that is time-synchronized with the real space, the executing of a simulation executes, by using attribute information of each of a plurality of agents that corresponds to each of a plurality of persons and the behavior selection model associated with a conditional branch of a policy model, in the digital twin, a simulation of whether each of the plurality of persons takes a behavior for the policy, so as to identify a route to which each of the plurality of persons is allocated in an option of the conditional branch, and the generating generates information that indicates a prediction result of the policy that includes an identified route to which each of the plurality of identified persons is allocated. (Paragraphs [0082], [0095]-[0105] and [0175] and Figure 3 of Vaughan. The teaching describes that the therapeutic module may rely on the diagnostic module in order to classify subjects as having different conditions or different severity levels of a condition. Specifically, the therapeutic module 134 can then indicate to the diagnosis module 132 that the latest round of therapy is finished, and that a new diagnosis is needed. The diagnostic module 132 can then provide new diagnostic tests and questions to the digital device 110, as well as take input from the therapeutic module of any data provided as part of therapy, such as recordings of learning sessions or browsing history of caregivers or subjects related to the therapy or diagnosed condition. The diagnostic module 132 then provides an updated diagnosis to repeat the process and provide a next step of therapy.
This means that the flows pertaining to a particular subject refer to a plurality of subjects undergoing the same process individually, forming an analysis of the population of subjects. FIG. 3 illustrates a flow diagram 300 showing the handling of suspected or confirmed speech and language delay. This model relies upon user selection, which describes the behavior of the subject, to recommend next steps in the subject’s treatment. For example, for a subject that exhibits non-verbal behavior, a person selects this option in the system, for which steps 312 and 314 are processed and end nodes 318 and 316 result in the policy that is selected for the subject. Policies will result in recommending a therapist visit in 318 or 328, or in reporting progress with further therapies being recommended.)

(Paragraphs [0112], [0131] and [0132] and Figure 8 of Vaughan. The teaching describes that digital therapeutics treatment methods and apparatus described with reference to FIGS. 1-4 are particularly well suited for combination with the methods and apparatus to evaluate subjects with fewer questions described herein with reference to FIGS. 5A to 14. This means that the process of Figure 3 is followed up by a process such as the one described in Figure 8. Figure 8 is an exemplary operational flow 800 of a method of a prediction module. This prediction module uses new and saved information to determine a classification of a subject with regard to a plurality of predicted conditions of the subject. This means that when a policy is selected in the process of Figure 3, the process of Figure 8 predicts the result of that policy in the form of predicting a condition of the subject. The results of this prediction are output to the user of the device in step 825.)

Here it is understood that the real space of subjects being evaluated is being virtualized in real-time by the system that the user is implementing for the evaluation itself.
Each step being run by the system simulates the results of the real space of subjects in this virtual environment.

As per claim 10, Claim 10 is substantially similar to claim 1. Accordingly, claim 10 is rejected for the same reasons as claim 1.

As per claim 11, Claim 11 is substantially similar to claim 2. Accordingly, claim 11 is rejected for the same reasons as claim 2.

As per claim 12, Claim 12 is substantially similar to claim 3. Accordingly, claim 12 is rejected for the same reasons as claim 3.

As per claim 13, Claim 13 is substantially similar to claim 4. Accordingly, claim 13 is rejected for the same reasons as claim 4.

As per claim 17, Claim 17 is substantially similar to claim 1. Accordingly, claim 17 is rejected for the same reasons as claim 1. Vaughan further discloses an information processing device comprising: a memory; and a processor coupled to the memory and configured to perform a process: (Paragraph [0164] of Vaughan. The teaching describes that the systems and methods provided herein, such as the computer system 401, can be embodied in programming. Various aspects of the technology may be thought of as “products” or “articles of manufacture” typically in the form of machine (or processor) executable code and/or associated data that is carried on or embodied in a type of machine readable medium. Machine-executable code can be stored on an electronic storage unit, such as memory (e.g., read-only memory, random-access memory, flash memory) or a hard disk. “Storage” type media can include any or all of the tangible memory of the computers, processors or the like, or associated modules thereof, such as various semiconductor memories, tape drives, disk drives and the like, which may provide non-transitory storage at any time for the software programming.)

As per claim 18, Claim 18 is substantially similar to claim 2. Accordingly, claim 18 is rejected for the same reasons as claim 2.

As per claim 19, Claim 19 is substantially similar to claim 3.
Accordingly, claim 19 is rejected for the same reasons as claim 3.

As per claim 20, Claim 20 is substantially similar to claim 4. Accordingly, claim 20 is rejected for the same reasons as claim 4.

Subject Matter Free of Prior Art

Claims 5 and 14 contain subject matter that is free of prior art. The Examiner has conducted a thorough search of the prior art and could not find a single reference, or combination of references with adequate rationale to combine, to teach the limitation of “the generating generates an image in which edges are made thick based on an identified number of persons, and the outputting displays a generated image on the display screen”. The closest prior art that the Examiner was able to find to teach this limitation was Josephson et al. (US 2023/0017672), which teaches the display of a knowledge graph in Figure 2B that bolds the path of information taken by an inference engine used to determine specific users who performed research for particular medical treatments. See paragraph [0091] of Josephson et al. The feature of augmenting a flow or knowledge graph by thickening edges is the only similarity between the prior art and the claimed invention. This visual augmentation was not in any way due to a number of identified persons, and there is no basis to conclude that one of ordinary skill in the art would have arrived at the claimed invention prior to the filing date simply by reviewing this prior art. Accordingly, claims 5 and 14 contain subject matter free of prior art.

Claims 6, 7, 15 and 16 contain subject matter that is free of prior art.
The Examiner has conducted a thorough search of the prior art and could not find a single reference, or combination of references with adequate rationale to combine, to teach the limitation of “a process of generating an image that indicates a first prediction result in which a number of persons flowing through a node in a flow graph is predicted, by inputting data related to a person to a first policy model before the policy is executed, and a process of generating an image that indicates a second prediction result in which a number of persons flowing through a node in a flow graph after the policy is executed is predicted, by inputting to a second policy model after the policy is executed, and the outputting includes a process of displaying, on the display screen, the image that indicates a first prediction result and the image that indicates a second prediction result”. The closest prior art that the Examiner was able to find to teach this limitation was Dimitrova et al. (US 2023/0274809), which teaches a medical workflow generation interface that displays a flow chart of how medical decisions are made. See Figure 4 and paragraphs [0123]-[0128] of Dimitrova et al. It can be seen that a mere workflow generation UI does not read on this limitation. Not only is there no number of persons indicating where they are moving in the proposed workflow, there is also not a plurality of flow charts being displayed in the same output. Accordingly, claims 6, 7, 15 and 16 contain subject matter free of prior art.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHAD A NEWTON whose telephone number is (313)446-6604. The examiner can normally be reached M-F 8:00AM-4:00PM (EST). Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, PETER H. CHOI, can be reached at (469) 295-9171. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CHAD A NEWTON/
Primary Examiner, Art Unit 3681
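For readers mapping the claim language above to a concrete structure, the claimed policy model (a flow graph whose intermediate nodes carry a behavior selection model and whose terminal nodes carry a service, with edge thickness driven by the number of persons flowing along each edge, per claims 4, 5 and 14) might be sketched roughly as follows. All names here (`Node`, `identify_route`, the toy "screen/visit/therapy" flow) are hypothetical illustrations, not taken from the application or from Vaughan:

```python
# Illustrative sketch only (hypothetical names): a policy model as a flow graph.
# Intermediate nodes carry a behavior-selection function; terminal nodes carry
# a service. A person's data is routed to a terminal node, and the per-edge
# traversal counts could drive an "edges made thick by number of persons" view.

class Node:
    def __init__(self, name, service=None, select=None):
        self.name = name        # node label
        self.service = service  # service associated with the node, if any
        self.select = select    # behavior selection model: person -> edge label
        self.edges = {}         # edge label -> next Node

def identify_route(model, person, edge_counts):
    """Trace nodes and edges until a terminal node; tally flow per edge."""
    route, node = [model], model
    while node.select is not None:          # intermediate node: branch
        choice = node.select(person)
        key = (node.name, choice)
        edge_counts[key] = edge_counts.get(key, 0) + 1
        node = node.edges[choice]
        route.append(node)
    return route

# Toy flow: a root node branches on verbal vs. non-verbal behavior.
therapy = Node("therapy", service="report progress")
visit = Node("visit", service="recommend therapist visit")
root = Node("screen", select=lambda p: "verbal" if p["verbal"] else "non-verbal")
root.edges = {"non-verbal": visit, "verbal": therapy}

counts = {}
for person in [{"verbal": False}, {"verbal": False}, {"verbal": True}]:
    terminal = identify_route(root, person, counts)[-1]

# counts now holds per-edge person flow, usable as an edge-thickness weight.
print(counts)  # {('screen', 'non-verbal'): 2, ('screen', 'verbal'): 1}
```

This is only a sketch of the general data structure the claims describe; the application's actual model, and Vaughan's diagnostic/therapeutic modules, are of course specified in their own documents.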

Prosecution Timeline

Jan 28, 2025
Application Filed
Jan 23, 2026
Non-Final Rejection — §101, §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597497
Health Analysis Based on Ingestible Sensors
2y 5m to grant Granted Apr 07, 2026
Patent 12597498
MEDICATION USE SUPPORT SYSTEM
2y 5m to grant Granted Apr 07, 2026
Patent 12591974
METHODS, DEVICES, AND SYSTEMS FOR DETECTING ANALYTE LEVELS
2y 5m to grant Granted Mar 31, 2026
Patent 12555680
RADIO-FREQUENCY SYSTEMS AND METHODS FOR CO-LOCALIZATION OF MEDICAL DEVICES AND PATIENTS
2y 5m to grant Granted Feb 17, 2026
Patent 12525326
PERSONALIZED TREATMENT TOOL
2y 5m to grant Granted Jan 13, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
38%
Grant Probability
64%
With Interview (+26.0%)
4y 0m
Median Time to Grant
Low
PTA Risk
Based on 218 resolved cases by this examiner. Grant probability derived from career allow rate.
