Prosecution Insights
Last updated: April 19, 2026
Application No. 18/752,743

ASSISTANCE APPARATUS, ASSISTANCE METHOD, AND COMPUTER READABLE MEDIUM

Final Rejection: §101, §102, §103, §112
Filed: Jun 24, 2024
Examiner: SKROBARCZYK III, ROBERT ANTHONY
Art Unit: 3792
Tech Center: 3700 (Mechanical Engineering & Manufacturing)
Assignee: Yokogawa Electric Corporation
OA Round: 2 (Final)

Grant Probability: 20% (At Risk)
Projected OA Rounds: 3-4
Projected Time to Grant: 2y 8m
Grant Probability With Interview: 58%

Examiner Intelligence

Career Allow Rate: 20% (2 granted / 10 resolved; -50.0% vs TC avg)
Interview Lift: +37.5% (resolved cases with interview)
Avg Prosecution: 2y 8m (typical timeline); 23 applications currently pending
Total Applications: 33 (career history, across all art units)
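The card figures above are straightforward ratios over the examiner's resolved cases. A minimal sketch of that arithmetic (the function names and the with/without-interview case counts are hypothetical illustrations, not data from the record):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def interview_lift(granted_with: int, resolved_with: int,
                   granted_without: int, resolved_without: int) -> float:
    """Percentage-point gap between allow rates with and without an interview."""
    return (allow_rate(granted_with, resolved_with)
            - allow_rate(granted_without, resolved_without))

# Career allow rate from the card: 2 granted out of 10 resolved.
print(allow_rate(2, 10))  # 20.0

# Illustrative split that yields a +37.5 percentage-point lift:
# 3 of 8 interviewed cases allowed vs 0 of 2 resolved without an interview.
print(interview_lift(3, 8, 0, 2))  # 37.5
```

The card does not disclose the underlying with/without-interview split, so the lift counts above are invented solely to show the shape of the calculation.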

Statute-Specific Performance

§101: 32.8% (-7.2% vs TC avg)
§103: 30.9% (-9.1% vs TC avg)
§102: 24.4% (-15.6% vs TC avg)
§112: 9.9% (-30.1% vs TC avg)

Tech Center average is an estimate • Based on career data from 10 resolved cases
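Each statute row above pairs a rate with its gap from the Tech Center average, so the implied TC baseline is simply the rate minus the delta. A quick consistency check (assuming the deltas are percentage points) shows all four rows point at the same ~40% baseline:

```python
# (statute, examiner rate in %, delta vs Tech Center average in percentage points)
rows = [
    ("§101", 32.8, -7.2),
    ("§103", 30.9, -9.1),
    ("§102", 24.4, -15.6),
    ("§112", 9.9, -30.1),
]

# Implied Tech Center average per statute: rate - delta.
implied = {statute: round(rate - delta, 1) for statute, rate, delta in rows}
print(implied)  # every statute implies a 40.0% Tech Center average
```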

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

In the response dated December 23, 2025, Applicant amended claims 1-9 and 18-20. Claims 10-17 have been canceled. Claims 21-28 have been newly added. Claims 1-9 and 18-28 are currently pending.

Priority

Acknowledgment is made of Applicant's claim for priority. The certified copy has been filed in parent Application No. JP 2023-104703, filed on June 27, 2023.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on July 16, 2024 has been considered by the examiner.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-9 and 18-28 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception without significantly more.

Step 1

Claim 20 recites a computer readable medium which, under the broadest reasonable interpretation, encompasses signals per se and therefore falls outside the four statutory categories. Examiner recommends that Applicant exclude transitory signals by using claim language such as "non-transitory computer readable medium". The remaining claims recite subject matter within a statutory category as a process, machine, and/or article of manufacture. However, as shown in the following steps, the claims are nonetheless unpatentable under 35 U.S.C. 101.
Step 2A Prong One

Claim 20 states:

A computer readable medium having recorded thereon an assistance program that, when executed by a computer, causes the computer to execute: acquiring brain wave information of a target person when a living subject engages in communication to the target person, wherein the target person is in a medical condition in which communication from the target person to the living subject is substantially limited, and wherein the communication from the target person to the living subject is at least one of oral communication or visual communication; determining an emotional state of the target person based on the brain wave information acquired in the acquiring; presenting the determined emotional state of the target person; acquiring attribute information indicating an attribute of the living subject; and presenting the emotional state of the target person based on the attribute information.

The broadest reasonable interpretation of these steps includes organizing human activity and/or a mental process because each recited step can practically be performed in the human mind or with pen and paper. Other than reciting generic computer terms such as "a computer readable medium" and "a computer", nothing in the claims precludes these steps from practically being performed in the mind. For example, but for the "computer" language, "determining an emotional state of the target person based on the brain wave information acquired in the acquiring" in the context of this claim encompasses managing the personal behavior of a healthcare provider diagnosing a patient based on a measured EEG signal. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components, then it falls within the "Mental Processes" or "Organizing Human Activity" grouping of abstract ideas. Accordingly, the claim recites an abstract idea.
Independent claims 1 and 19 cover similar steps of acquiring brain wave information of a target person who is in a state in which dialog is difficult in a case where a living subject conducts an engagement for the target person, and determining a state of the target person based on the brain wave information acquired in the acquiring. These claims fall under the same category of abstract idea and follow the same rationale as claim 20. Dependent claims recite additional subject matter which further narrows or defines the abstract idea embodied in the claims (for example, claim 21's recitation of "wherein the attribute information of the living subject is a type of relationship between the living subject and the target person" may be performed in the mind but for the recitation of generic computer components). Dependent claims 3, 18, 24, and 25 add additional elements to their parent claims, which are inspected in the following steps for integration of the abstract idea into a practical application.

Step 2A Prong Two

This judicial exception of "Organizing Human Activity" or "Mental Processes" is not integrated into a practical application. Independent claim 20's article of manufacture recites additional elements such as a computer and a processor. The computer and processor will be treated as generic computer components.
In particular, these additional elements do not integrate the abstract idea into a practical application because they:

- amount to mere instructions to apply an exception (the recitation of "A computer readable medium having recorded thereon an assistance program that, when executed by a computer, causes the computer to execute" amounts to invoking computers as a tool to perform the abstract idea; see Applicant's specification [0024], "Part or whole of the assistance apparatus 100 may be achieved by a computer", which describes using a general computer; see MPEP 2106.05(f)); and

- add insignificant extra-solution activity to the abstract idea (the recitation of "acquiring brain wave information of a target person when a living subject engages in communication to the target person, wherein the target person is in a medical condition in which communication from the target person to the living subject is substantially limited, and wherein the communication from the target person to the living subject is at least one of oral communication or visual communication" amounts to selecting a particular data source or type of data to be manipulated, which is insignificant extra-solution activity; see MPEP 2106.05(g)).

Dependent claims recite additional subject matter which amounts to limitations consistent with the additional elements in the independent claims. For instance, dependent claim 9 adds the additional element of a machine learning model to its parent claim.
Additionally, claim 9's recitation of "wherein: the at least one processor generates, by performing machine learning on a relationship between the brain wave information and the engagement and the engagement for putting the emotional state of the target person into a predetermined emotional state, an engagement inference model for making an inference of the engagement for putting the emotional state of the target person into the predetermined state based on the brain wave information and the emotional state of the target person." adds insignificant extra-solution activity to the abstract idea which amounts to mere data gathering. Claim 3's recitation of "wherein: the at least one processor acquires the brain wave information of the target person before the engagement with the living subject, and the at least one processor generates emotional state information indicating the emotional state of the target person based on a change of the brain wave information from before the engagement with the living person to the brain wave information after the engagement with the living person and the biological information, and the at least one processor determines the emotional state of the target person based on the emotional state information generated", claim 24's recitation of "wherein the at least one processor presents the emotional state of the target person to the living subject that is close social relationship even if the emotional state of the target subject is a negative emotion.", and claims 25 and 26's recitation of "wherein the at least one processor prevents the presentation of the emotional state of the target person to the living subject that is the close social relationship if the emotional state of the target subject is a negative emotion." amount to necessary data outputting, see MPEP 2106.05(g). Claim 18's recitation of "wherein the at least one processor changes a presentation mode of the emotional state of the target person based on the attribute information" amounts to insignificant application.
Looking at the limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely provide conventional computer implementation and do not impose a meaningful limit that integrates the abstract idea into a practical application. The remaining dependent claims 2, 4-8, 22-23, and 27-28 do not recite additional elements or activity but further narrow or define the abstract idea embodied in the claims, and hence also do not integrate the aforementioned abstract idea into a practical application.

Step 2B

The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements amount to no more than mere instructions to apply an exception and insignificant extra-solution activity added to the abstract idea. Glanowski et al. (WO 2003/074739) demonstrates that machine learning was conventional long before the priority date of the claimed invention: "Conventional machine learning algorithms such as decision trees and neural networks generate solutions by training on a given collection of data where the outcome is known and then applying the trained system to predict data in unknown outcomes" [page 4, lines 31-34]. As such, this additional element, individually and in combination with the other additional elements, does not amount to significantly more. Additionally, the remaining limitations amount to no more than elements that have been recognized as well-understood, routine, and conventional activity in particular fields.
To elaborate: "acquiring brain wave information of a target person when a living subject engages in communication to the target person, wherein the target person is in a medical condition in which communication from the target person to the living subject is substantially limited, and wherein the communication from the target person to the living subject is at least one of oral communication or visual communication" is, equivalently, receiving or transmitting data over a network, Symantec, MPEP 2106.05(d)(II)(i). Dependent claims recite additional subject matter which, as discussed above with respect to integration of the abstract idea into a practical application, amounts to invoking computers as a tool to perform the abstract idea, consistent with the additional elements in the independent claims. These additional limitations amount to elements that have been recognized as well-understood, routine, and conventional activity in particular fields.
To elaborate:

- claim 3's recitation of "wherein: the at least one processor acquires the brain wave information of the target person before the engagement with the living subject, and the at least one processor generates emotional state information indicating the emotional state of the target person based on a change of the brain wave information from before the engagement with the living person to the brain wave information after the engagement with the living person and the biological information, and the at least one processor determines the emotional state of the target person based on the emotional state information generated." is, equivalently, electronic recordkeeping, Alice Corp., MPEP 2106.05(d)(II)(iii);

- claim 9's recitation of "wherein: the at least one processor generates, by performing machine learning on a relationship between the brain wave information and the engagement and the engagement for putting the emotional state of the target person into a predetermined emotional state, an engagement inference model for making an inference of the engagement for putting the emotional state of the target person into the predetermined state based on the brain wave information and the emotional state of the target person." is, equivalently, performing repetitive calculations, Flook, MPEP 2106.05(d)(II)(ii);

- claim 18's recitation of "wherein the at least one processor changes a presentation mode of the emotional state of the target person based on the attribute information" is, equivalently, presenting offers and gathering statistics, OIP Techs., MPEP 2106.05(d)(II)(iv);

- claim 24's recitation of "wherein the at least one processor presents the emotional state of the target person to the living subject that is close social relationship even if the emotional state of the target subject is a negative emotion." is, equivalently, presenting offers and gathering statistics, OIP Techs., MPEP 2106.05(d)(II)(iv); and

- claims 25 and 26's recitation of "wherein the at least one processor prevents the presentation of the emotional state of the target person to the living subject that is the close social relationship if the emotional state of the target subject is a negative emotion." is, equivalently, presenting offers and gathering statistics, OIP Techs., MPEP 2106.05(d)(II)(iv).

Looking at the limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely provide conventional computer implementation.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-3 and 20-28 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Yeow et al. (WO 2015/122846).

Regarding claim 1, Yeow teaches:
An assistance apparatus comprising: ([page 5, lines 19-20] "The brainwave-sensing device, a lightweight and portable instrument, will for example be worn on a specific region of the head to track the brainwaves of the user") the at least one processor which acquires brain wave information ([Fig 3a] "a portable EEG device" acquires brainwave information via a processor; see also [page 33, lines 23-27] "embodiments can be the potential integration of the brainwave detection device and brainwave identification software with consumer electronics such as computers, tablets, smart phones, cameras (both handheld or computer cameras)." Where the system uses computer processors) of a target person when a living subject engages in communication to the target person, wherein the target person is in a medical condition in which communication from the target person to the living subject is substantially limited and wherein the communication from the target person to the living subject is at least one of oral communication or visual communication in a case where a living subject conducts an engagement for the target person; and ([page 6, lines 1-8] "The utility of brainwave data for various applications according to example embodiments has been broadly categorized into… communication via brainwave-engagement with illness-stricken parties who are in comatose/vegetative state, stroke and/or unable to communicate or express themselves" comprises illness-stricken parties [i.e., a target person who is in a medical condition] who are unable to express themselves [i.e., a living subject conducts an engagement for the target person]; see also [page 11, lines 9-15] "the EEG-device can be integrated for use with external systems such as cameras e.g. 916 with video function, as illustrated in Fig. 9B).
The brainwave information at the particular time event can be recorded during the video recording, and projected 918 on a colored/numerical scale to indicate the user's brainwave levels during a certain experience or activity performed by the user." Where an EEG device integrated with a camera with video function which captures the user performing an activity comprises communication via oral or visual communication from the target person) the at least one processor which determines an emotional state of the target person based on the brain wave information. ([page 5, lines 27-29] "The software processes the brainwave signals, for example to identify and display the brainwave states upon calibration of the user's basal brainwave states." Where the software [i.e., the at least one processor] identifies the user's brainwave states [i.e., determines a state of the target person]; see also [page 6, lines 13-15] "allows for the detection and display of mental states such as emotion" where the states are emotions) the at least one processor presents the emotional state of the target person which is determined by the at least one processor; ([page 11, lines 13-15] "brainwave-sensing device is provided which allows for the detection and display of mental states such as emotions (happiness, anger, sadness, fear, excitement), pain, anxiety, sleep, mental fatigue, comfort and pleasure." Where displaying emotions comprises presenting the emotional state of a target person) the at least one processor acquires attribute information indicating an attribute of the living subject; ([Figure 29, page 20] "embodiments of the invention can be extended to group chats where a customized cluster of people, such as family or close friends are given the option to share and track one another's brainwaves via a platform 2800" where customizing a cluster of people by adding a friend comprises collecting attribute information as a type of relationship between the living subject and a
person) and the at least one processor presents the emotional state of the target person based on the attribute information ([figure 9B and page 11, lines 9-15] “the EEG-device can be integrated for use with external systems such as cameras e.g. 916 with video function, as illustrated in Fig. 9B). The brainwave information at the particular time event can be recorded during the video recording, and projected 918 on a colored/numerical scale to indicate the user's brainwave levels during a certain experience or activity performed by the user. Such embodiments can e.g. provide video recording of a user experience on the camera screen 920, with display of brainwave information with respect to time.” Where the interaction of the user based on surrounding activity of the user comprises presenting the emotional state of the target person based on the attribute information) Regarding claim 2, Yeow teaches all of the limitations of claim 1. Yeow also teaches: wherein the at least one processor further acquires biological information of the target person, and ([page 13, lines 19-23] “The portable EEG device can have designs to allow for flexible placement of electrodes or other sensors. 
This preferably allows the addition of electrodes to the device to provide users with a broader range of brainwaves information at different parts of the brain; or the addition of sensors such as heart rate or temperature sensors for measurement of other physiological signals." Where the addition of physiological sensors to the portable EEG device [i.e., the acquisition unit] acquires physiological signals [i.e., biological information] of the target person) the at least one processor determines the emotional state of the target person ([page 9, lines 16-19] "This serves to detect brainwave status information such as, but not limited to, sleep, attention, happiness, anger, sadness, pain, anxiety, fear and excitement, where the electrode placement can be adjusted to suit detection of different brainwave states." Where correlating data includes determining a state of the target person) based on the brain wave information and the biological information. ([page 12, lines 23-24] "This expands the scope of use of the device in its ability to correlating brainwave data with other vital signs.") Regarding claim 3, Yeow teaches all of the limitations of claim 2.
Yeow also teaches wherein the at least one processor acquires the brain wave information of the target person before the engagement with the living subject, and ([page 10, lines 26-28] "Based on the calibration, brainwave state identification can be obtained and the real-time brainwave state of users 800( 1)-(N) can be detected and transmitted to a centralized receiving system 802" where brainwave information is acquired in real time [i.e., before any engagement]) the at least one processor generates emotional state information indicating the emotional state of the target person based on a change of the brain wave information from before the engagement with the living person to the brain wave information after the engagement with the living person and the biological information, and determines the state of the target person based on the state information generated. ([page 10, line 38 to page 11, line 7] "For example, in the data visualization step according one embodiment illustrated in Fig. 9A), the brainwave information can be plotted against time (plot 900), whereby selecting a time segment e.g. 902 of the plot 900 can reveal the brainwave state 904 at the particular time event. The visualization step can allow various plot options 906 for analysis, such as, but not limited to, plotting different time segments of interest 908, comparing different time segments before and after an event 910, plotting continuous real time brainwave information on various time scales 912 (such as daily, weekly and month). Physiological measurements such as heart rate and body temperature measured by incorporated sensors can also be displayed and correlated with the brainwave data for more meaningful data interpretation." Where interpreting the brainwave and physiological information on various time scales is indicating the state of the target person before and after engagement; see also [page 10, lines 14-16] "With reference to Fig.
7, brainwaves of interest that can be captured in example embodiments include, but are not limited to, brainwave states of happiness, excitement, attention, motivation, anger, sadness/depression, pain, sleep, anxiety and fear." Where the information state is one of an emotion; see also [page 18, lines 25-30] "brainwave information can be potentially combined with electrocardiography (ECG) data to provide a better gauge of the response of the comatose patients, which may be triggered by family members' stimulatory methods.)" where the stimulatory methods of family comprise emotional state changes throughout engagement with a living person) Regarding claim 19, Yeow teaches: An assistance method performed by at least one processor comprising: ([page 7, lines 39-40] "The present specification also discloses apparatus for implementing or performing the operations of the methods." And [Fig 3a] "a portable EEG device" acquires brainwave information via a processor; see also [page 33, lines 23-27] "embodiments can be the potential integration of the brainwave detection device and brainwave identification software with consumer electronics such as computers, tablets, smart phones, cameras (both handheld or computer cameras)." Where the system uses computer processors) acquiring, by the at least one processor, ([Fig 3a] "a portable EEG device" acquires brainwave information with a device containing a processor) brain wave information of a target person when a living subject engages in communication to the target person, wherein the target person is in a medical condition in which communication from the target person to the living subject is substantially limited ([page 6, lines 1-8] "The utility of brainwave data for various applications according to example embodiments has been broadly categorized into… communication via brainwave-engagement with illness-stricken parties who are in comatose/vegetative state, stroke and/or unable to communicate or express themselves"
comprises illness-stricken parties [i.e., a target person who is in a medical condition] who are unable to express themselves [i.e., a living subject conducts an engagement for the target person]; see also [page 11, lines 9-15] "the EEG-device can be integrated for use with external systems such as cameras e.g. 916 with video function, as illustrated in Fig. 9B). The brainwave information at the particular time event can be recorded during the video recording, and projected 918 on a colored/numerical scale to indicate the user's brainwave levels during a certain experience or activity performed by the user." Where an EEG device integrated with a camera with video function which captures the user performing an activity comprises communication via oral or visual communication from the target person) and wherein the communication from the target person to the living subject is at least one of oral communication or visual communication; (see also [page 11, lines 9-15] "the EEG-device can be integrated for use with external systems such as cameras e.g. 916 with video function, as illustrated in Fig. 9B). The brainwave information at the particular time event can be recorded during the video recording, and projected 918 on a colored/numerical scale to indicate the user's brainwave levels during a certain experience or activity performed by the user." Where an EEG device integrated with a camera with video function which captures the user performing an activity comprises communication via oral or visual communication from the target person) determining, by the at least one processor, an emotional state of the target person based on the brain wave information acquired in the acquiring.
([page 11, lines 13-15] "brainwave-sensing device is provided which allows for the detection and display of mental states such as emotions (happiness, anger, sadness, fear, excitement), pain, anxiety, sleep, mental fatigue, comfort and pleasure." Where displaying emotions comprises presenting the emotional state of a target person) presenting, by the at least one processor, the emotional state of the target person which is determined by the at least one processor; ([page 11, lines 13-15] "brainwave-sensing device is provided which allows for the detection and display of mental states such as emotions (happiness, anger, sadness, fear, excitement), pain, anxiety, sleep, mental fatigue, comfort and pleasure." Where displaying emotions comprises presenting the emotional state of a target person) acquiring, by the at least one processor, attribute information indicating an attribute of the living subject; and ([Figure 29, page 20] "embodiments of the invention can be extended to group chats where a customized cluster of people, such as family or close friends are given the option to share and track one another's brainwaves via a platform 2800" where customizing a cluster of people by adding a friend comprises collecting attribute information as a type of relationship between the living subject and a person) presenting, by the at least one processor, the emotional state of the target person based on the attribute information. ([figure 9B and page 11, lines 9-15] "the EEG-device can be integrated for use with external systems such as cameras e.g. 916 with video function, as illustrated in Fig. 9B). The brainwave information at the particular time event can be recorded during the video recording, and projected 918 on a colored/numerical scale to indicate the user's brainwave levels during a certain experience or activity performed by the user. Such embodiments can e.g.
provide video recording of a user experience on the camera screen 920, with display of brainwave information with respect to time." Where the interaction of the user based on surrounding activity of the user comprises presenting the emotional state of the target person based on the attribute information) Regarding claim 20, Yeow teaches: A computer readable medium having recorded thereon an assistance program that, when executed by a computer, causes the computer to execute: ([pages 7-8, lines 39-4] "The present specification also discloses apparatus for implementing or performing the operations of the methods. Such apparatus may be specially constructed for the required purposes, or may comprise a device selectively activated or reconfigured by a computer program stored in the device. Such a computer program may be stored on any computer readable medium.") acquiring brain wave information of a target person when a living subject engages in communication to the target person, wherein the target person is in a medical condition in which communication from the target person to the living subject is substantially limited, and wherein the communication from the target person to the living subject is at least one of oral communication or visual communication; ([page 6, lines 1-8] "The utility of brainwave data for various applications according to example embodiments has been broadly categorized into… communication via brainwave-engagement with illness-stricken parties who are in comatose/vegetative state, stroke and/or unable to communicate or express themselves" comprises illness-stricken parties [i.e., a target person who is in a medical condition] who are unable to express themselves [i.e., a living subject conducts an engagement for the target person]; see also [page 11, lines 9-15] "the EEG-device can be integrated for use with external systems such as cameras e.g. 916 with video function, as illustrated in Fig. 9B).
The brainwave information at the particular time event can be recorded during the video recording, and projected 918 on a colored/numerical scale to indicate the user's brainwave levels during a certain experience or activity performed by the user." Where an EEG device integrated with a camera with video function which captures the user performing an activity comprises communication via oral or visual communication from the target person) determining an emotional state of the target person based on the brain wave information acquired in the acquiring. ([page 11, lines 13-15] "brainwave-sensing device is provided which allows for the detection and display of mental states such as emotions (happiness, anger, sadness, fear, excitement), pain, anxiety, sleep, mental fatigue, comfort and pleasure." Where displaying emotions comprises presenting the emotional state of a target person) presenting the determined emotional state of the target person; ([page 11, lines 13-15] "brainwave-sensing device is provided which allows for the detection and display of mental states such as emotions (happiness, anger, sadness, fear, excitement), pain, anxiety, sleep, mental fatigue, comfort and pleasure." Where displaying emotions comprises presenting the emotional state of a target person) acquiring attribute information indicating an attribute of the living subject; and ([Figure 29, page 20] "embodiments of the invention can be extended to group chats where a customized cluster of people, such as family or close friends are given the option to share and track one another's brainwaves via a platform 2800" where customizing a cluster of people by adding a friend comprises collecting attribute information as a type of relationship between the living subject and a person) presenting the emotional state of the target person based on the attribute information. ([figure 9B and page 11, lines 9-15] "the EEG-device can be integrated for use with external systems such as cameras e.g.
916 with video function, as illustrated in Fig. 9B). The brainwave information at the particular time event can be recorded during the video recording, and projected 918 on a colored/numerical scale to indicate the user's brainwave levels during a certain experience or activity performed by the user. Such embodiments can e.g. provide video recording of a user experience on the camera screen 920, with display of brainwave information with respect to time.” Where the interaction of the user based on surrounding activity of the user comprises presenting the emotional state of the target person based on the attribute information) Regarding claim 21, Yeow teaches all of the limitations of claim 1. Yeow also teaches: wherein the attribute information of the living subject is a type of relationship between the living subject and the target person. ([Figure 29, page 20] “embodiments of the invention can be extended to group chats where a customized cluster of people, such as family or close friends are given the option to share and track one another's brainwaves via a platform 2800” where customizing a cluster of people by adding a friend comprises collecting attribute information as a type of relationship between the living subject and the target person) Regarding claim 22, Yeow teaches all of the limitations of claim 21. Yeow also teaches: wherein the type of relationship between the living subject and the target person is one of: a close social relationship between the living subject and the target person; a distant social relationship between the living subject and the target person; and a service provider relationship between the living subject and the target person. ([page 17, lines 9-10] “psychologists, counselors and psychiatrists could monitor their clients' disorder/condition (such as autism, bipolar disorder, ADHD etc.)
in real time from their workplace, enabling tracking of the progressive condition of their client and to administer appropriate treatments to improve the patient's brainwave states.” where sharing brainwave information with a healthcare provider comprises a service provider relationship, i.e., a medical service provider of the target person) Regarding claim 23, Yeow teaches all of the limitations of claim 22. Yeow also teaches: wherein the close social relationship is that the living subject is a family member of the target person. (see [Figure 29, page 20] above, where the family members acquire brainwave functions based on the user preferences) Regarding claim 24, Yeow teaches all of the limitations of claim 22. Yeow also teaches: wherein the at least one processor presents the emotional state of the target person to the living subject that is close social relationship even if the emotional state of the target subject is a negative emotion. ([pages 15-16, lines 38-40 and 1] “The application ('app') or device in such embodiments displays the user's brainwave state and the user will be notified only when the brainwaves go beyond a certain emotional level that is calibrated as 'healthy'. At this point, for example in state 2200 dominated by fear, the user 2202 will be given an option to activate a therapeutic audio-visual solution” where notifying a user of a fearful emotional state comprises presenting the emotional state of the target person to the living subject if the emotional state is a negative emotion) Regarding claim 25, Yeow teaches all of the limitations of claim 22.
Yeow also teaches wherein the at least one processor prevents the presentation of the emotional state of the target person to the living subject that is the close social relationship ([page 28, lines 37-40] “Example embodiments can provide a mobile app platform whereby the raw and/or processed brainwave (and/or other physiological) information can be transmitted to and displayed on the mobile phone for user to utilize in a convenient and meaningful manner. Users are able to monitor and track their mental states at their convenience through a mobile app, and take the necessary preventive or interventive steps as required.” if the emotional state of the target subject is a negative emotion. ([page 24, lines 1-5] “Embodiments of the invention can be part of a feedback system that receives brainwave information from the patient and determines whether the current brainwave state is below the desired 'healthy' threshold such as negative emotions, anxiety or pain.” Where the feedback system determining the negative emotions of a target subject comprises determining if the emotional state of the target subject is a negative emotion) Regarding claim 26, Yeow teaches all of the limitations of claim 22.
Yeow also teaches: wherein: the distant social relationship is that the living subject is a friend of the target person; ([page 20, lines 15-20] “embodiments of the invention can be extended to group chats where a customized cluster of people, such as family or close friends are given the option to share and track one another's brainwaves via a platform 2800” where the customized cluster of people comprises a friend of the target person) and the at least one processor prevents the presentation of the emotional state of the target person to the living subject that is the distant social relationship ([page 28, lines 37-40] “Example embodiments can provide a mobile app platform whereby the raw and/or processed brainwave (and/or other physiological) information can be transmitted to and displayed on the mobile phone for user to utilize in a convenient and meaningful manner. Users are able to monitor and track their mental states at their convenience through a mobile app, and take the necessary preventive or interventive steps as required.” Where preventative or intervening steps for displaying the processed brain information comprises preventing presenting the emotional state of the user) if the emotional state of the target subject is a negative emotion. ([page 24, lines 1-5] “Embodiments of the invention can be part of a feedback system that receives brainwave information from the patient and determines whether the current brainwave state is below the desired 'healthy' threshold such as negative emotions, anxiety or pain.” Where the feedback system determining the negative emotions of a target subject comprises determining if the emotional state of the target subject is a negative emotion) Regarding claim 27, Yeow teaches all of the limitations of claim 22.
Yeow also teaches: wherein: the service provider relationship is that the living subject is a medical service provider of the target person; ([page 17, lines 9-10] “psychologists, counselors and psychiatrists could monitor their clients' disorder/condition (such as autism, bipolar disorder, ADHD etc.) in real time from their workplace, enabling tracking of the progressive condition of their client and to administer appropriate treatments to improve the patient's brainwave states.” where sharing brainwave information with a healthcare provider comprises a service provider relationship, i.e., a medical service provider of the target person) and the at least one processor prevents the presentation of the emotional state of the target person to the living subject that is the distant social relationship ([page 28, lines 37-40] “Example embodiments can provide a mobile app platform whereby the raw and/or processed brainwave (and/or other physiological) information can be transmitted to and displayed on the mobile phone for user to utilize in a convenient and meaningful manner. Users are able to monitor and track their mental states at their convenience through a mobile app, and take the necessary preventive or interventive steps as required.” Where preventative or intervening steps for displaying the processed brain information comprises preventing presenting the emotional state of the user) if the emotional state of the target subject is a negative emotion. ([page 24, lines 1-5] “Embodiments of the invention can be part of a feedback system that receives brainwave information from the patient and determines whether the current brainwave state is below the desired 'healthy' threshold such as negative emotions, anxiety or pain.” Where the feedback system determining the negative emotions of a target subject comprises determining if the emotional state of the target subject is a negative emotion) Regarding claim 28, Yeow teaches all of the limitations of claim 22.
Yeow also teaches: wherein the medical service provider is a nurse. ([page 17, lines 7-15] “This brainwave information is most often important to, but is not limited to, parents, guardians, counselors, healthcare personnel or medical doctors to track the real-time brainwaves of their child, client and/or a group of 10 people concurrently. For example, psychologists, counselors and psychiatrists could monitor their clients' disorder/condition (such as autism, bipolar disorder, ADHD etc.) in real time from their workplace, enabling tracking of the progressive condition of their client and to administer appropriate treatments to improve the patient's brainwave states.” Where healthcare personnel comprise a nurse) Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness. Claims 4-9 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Yeow et al (WO2015122846) in view of Song (US20230218215).
Regarding claim 4, Yeow teaches all of the limitations of claim 3. Yeow also teaches: wherein the at least one processor generates the emotional state information ([page 11, lines 13-15] “brainwave-sensing device is provided which allows for the detection and display of mental states such as emotions (happiness, anger, sadness, fear, excitement), pain, anxiety, sleep, mental fatigue, comfort and pleasure.” Where displaying emotions comprises presenting the emotional state of a target person) based on a change from a first brainwave proportion to a second brainwave proportion and from a change of a heartbeat proportion ([page 11, lines 1-10] “multiple electrode(s) can be placed on a localized area of the head to detect specific brainwaves of interest of the user. This serves to detect brainwave status information such as, but not limited to, sleep, attention, happiness, anger, sadness, pain, anxiety, fear and excitement,” and “the brainwave information can be plotted against time (plot 900), whereby selecting a time segment e.g. 902 of the plot 900 can reveal the brainwave state 904 at the particular time event. The brainwave information can for example be … plotting continuous real time brainwave information… Physiological measurements such as heart rate and body temperature measured by incorporated sensors can also be displayed and correlated with the brainwave data for more meaningful data interpretation” where continuous physiological measurements which generate measurement of a subject’s emotion comprises a change in a brainwave and heartbeat proportion) the first brainwave proportion is a proportion of an amplitude of a brain wave in a predetermined frequency band to a total amplitude in the brain wave information before the engagement, the second brainwave proportion is a proportion of the amplitude of the brain wave in the frequency band to the total amplitude in the brain wave information after the engagement ([Figure 7 and page 10, lines 14-25] “Fig.
7, brainwaves of interest that can be captured in example embodiments include, but are not limited to, brainwave states of happiness, excitement, attention, motivation, anger, sadness/depression, pain, sleep, anxiety and fear. Different brainwave states tend to show activation in different brain regions and exhibit different waveforms of varying frequency, see plots 702-706. These waveforms include alpha (8-13Hz), delta (0.5-4Hz), beta (14-30Hz) and theta (4-8Hz). In example embodiments, a calibration procedure/system of the basal level of the user's brainwave states can be performed. The user 707 will for example first be exposed to different audio-visual brainwave stimuli for a particular brainwave state 708-712, and the waveforms that arise from the triggered brainwaves will be detected, see plots 702-706.” Where the waveforms of different brainwaves (comprising the first and second brainwave proportion) are measured before and after audio-visual stimuli (i.e., before and after engagement) and encompass measuring the change from one state of mind to another (comprising one proportion of amplitude to another)) the heartbeat proportion is a proportion of a magnitude of a first power spectrum in a heartbeat of the target person to a magnitude of a second power spectrum in the heartbeat of the target person, (see [page 11, lines 1-10] where plotting physiological signals of heart rate against time comprises a power spectrum of a heartbeat of the target person) Yeow does not explicitly teach, but Song teaches: the total amplitude is a sum of amplitudes of an alpha wave, a beta wave, a theta wave, a gamma wave, and a delta wave, and ([0081] “According to the range of oscillating frequencies, EEG signals are artificially classified into and referred to as delta (δ) waves (0.2 to 3.99 Hz), theta (θ) waves (4 to 7.99 Hz), alpha (α) waves (8 to 12.99 Hz), beta (β) waves (13 to 29.99 Hz), and gamma (γ) waves (30-50 Hz).” Where classifying the signals into different
waves comprises measuring the summation of each wave; see also [Fig. 22] where the proportion of frequencies are mapped out to include a changing portion of the brain wave amplitudes before and after engagement) a frequency band of the second power spectrum is a higher frequency band than a frequency band of the first power spectrum. ([0224] “In the present invention, time-series characteristics are generated from data at the time when the data changes from the normal EEG waves of joy and pleasure to the abnormal EEG waves of fear, sadness, anger, disgust, and depression by applying the CRNN to the power spectrum distribution structure separated into multiple waves in the frequency region, and brain states having different lengths are detected for each person and for each occurrence by using these characteristics, thereby generating emotional data.” Where different lengths of brain states correlate to different frequency bands of power spectrums) It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Yeow with the teachings of Song, with a reasonable expectation of success, by explicitly using Song’s artificial intelligence emotional data extraction and power spectrum analysis to map the time-series features of brainwave signals before and after events. This would have provided a more in-depth display of information for healthcare providers and thus better communicated an emotional state of a user. Song is adaptable to Yeow as both inventions utilize supervised learning protocols that communicate via standardized networks. Yeow would have found Song’s teaching while attempting to solve “a problem in that the reliability of diagnosis is lowered due to the subjective determination of interpretation” [0013]. Regarding claim 5, Yeow-Song as a combination teaches all of the limitations of claim 4.
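Editor's note: the "brainwave proportion" recited in claim 4 is a simple ratio, the amplitude of one frequency band divided by the total amplitude across the five bands Song enumerates (delta, theta, alpha, beta, gamma), compared before and after the engagement. The sketch below is illustrative only: it is not part of the Office Action or the cited references, and all names and numbers in it are hypothetical.

```python
# Illustrative sketch of the claim 4 "brainwave proportion": the amplitude
# of one band divided by the total amplitude across the five bands recited
# via Song [0081]. Function names and example amplitudes are hypothetical.

BANDS = ("delta", "theta", "alpha", "beta", "gamma")

def band_proportion(amplitudes: dict, band: str) -> float:
    """Proportion of one band's amplitude to the total amplitude."""
    total = sum(amplitudes[b] for b in BANDS)
    return amplitudes[band] / total

# First proportion (before the engagement) vs. second (after):
before = {"delta": 4.0, "theta": 3.0, "alpha": 8.0, "beta": 3.0, "gamma": 2.0}
after  = {"delta": 2.0, "theta": 2.0, "alpha": 12.0, "beta": 3.0, "gamma": 1.0}

# The claimed determination keys off the change from the first proportion
# to the second proportion in the predetermined band.
change = band_proportion(after, "alpha") - band_proportion(before, "alpha")
print(round(change, 2))  # 0.2
```

Under this reading, the "predetermined frequency band" of claims 7-8 simply selects which band's proportion is tracked.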
Regarding claim 5, Yeow continues to teach: wherein the at least one processor generates the emotional state information based on the change of the heartbeat proportion and a magnitude relationship between the heartbeat proportion of the magnitude of the first power spectrum to the magnitude of the second power spectrum after the engagement and a predetermined threshold of the heartbeat proportion of the magnitude of the first power spectrum to the magnitude of the second power spectrum ([page 11, lines 1-10] “multiple electrode(s) can be placed on a localized area of the head to detect specific brainwaves of interest of the user. This serves to detect brainwave status information such as, but not limited to, sleep, attention, happiness, anger, sadness, pain, anxiety, fear and excitement,” and “the brainwave information can be plotted against time (plot 900), whereby selecting a time segment e.g. 902 of the plot 900 can reveal the brainwave state 904 at the particular time event. The brainwave information can for example be … plotting continuous real time brainwave information… Physiological measurements such as heart rate and body temperature measured by incorporated sensors can also be displayed and correlated with the brainwave data for more meaningful data interpretation” where continuous physiological measurements which generate measurement of a subject’s emotion comprises a change in a brainwave and heartbeat proportion; see also [page 27, lines 39-44] “addition of electrodes to the device can provide users with a broader range of brainwave information at different parts of the brain; or the addition of sensors such as heart rate or temperature sensors for measurement of other physiological signals.
This expands the scope of use of the device in its ability to read brainwave data whilst correlating it with other vital signs.” Where the proportion of heart rates increases to a threshold that indicates a different emotion) Regarding claim 6, Yeow-Song as a combination teaches all of the limitations of claim 4. Yeow also teaches: wherein the emotional state information includes information according to a plurality of emotional states of the target person, and the at least one processor generates the emotional state information according to one emotional state among the plurality of emotional states based on the change of the heartbeat proportion and the heartbeat proportion of the magnitude of the first power spectrum to the magnitude of the second power spectrum. ([page 10, lines 14-25] “With reference to Fig. 7, brainwaves of interest that can be captured in example embodiments include, but are not limited to, brainwave states of happiness, excitement, attention, motivation, anger, sadness/depression, pain, sleep, anxiety and fear. Different brainwave states tend to show activation in different brain regions and exhibit different waveforms of varying frequency, see plots 702-706. These waveforms include alpha (8-13Hz), delta (0.5-4Hz), beta (14-30Hz) and theta (4-8Hz). In example embodiments, a calibration procedure/system of the basal level of the user's brainwave states can be performed. The user 707 will for example first be exposed to different audio-visual brainwave stimuli for a particular brainwave state 708-712, and the waveforms that arise from the triggered brainwaves will be detected, see plots 702-706.
This procedure allows generating user-specific calibrated brainwave scales 714 to facilitate subsequent brainwave state identification.” Where the system captures brainwaves of interest [i.e., generates the emotion information according to one emotion among the plurality of emotions] and tracks the transition from a high to low state based on the basal level of the user’s physiological signals [i.e., based on the change and the proportion of the magnitude of the first power spectrum to the magnitude of the second power spectrum]; see also [page 11, lines 1-10] “multiple electrode(s) can be placed on a localized area of the head to detect specific brainwaves of interest of the user. This serves to detect brainwave status information such as, but not limited to, sleep, attention, happiness, anger, sadness, pain, anxiety, fear and excitement,” and “the brainwave information can be plotted against time (plot 900), whereby selecting a time segment e.g. 902 of the plot 900 can reveal the brainwave state 904 at the particular time event. The brainwave information can for example be … plotting continuous real time brainwave information… Physiological measurements such as heart rate and body temperature measured by incorporated sensors can also be displayed and correlated with the brainwave data for more meaningful data interpretation” where continuous physiological measurements which generate measurement of a subject’s emotion comprises a change in a brainwave and heartbeat proportion; see also [page 27, lines 39-44] “addition of electrodes to the device can provide users with a broader range of brainwave information at different parts of the brain; or the addition of sensors such as heart rate or temperature sensors for measurement of other physiological signals.
This expands the scope of use of the device in its ability to read brainwave data whilst correlating it with other vital signs.” Where the proportion of heart rates increases to a threshold that indicates a different emotion) Regarding claim 7, Yeow-Song as a combination teaches all of the limitations of claim 6. Yeow also teaches: wherein the brain wave in the predetermined frequency band is at least one of a delta wave, a theta wave, a low alpha wave, or a medium alpha wave. ([page 10, lines 18-19] “These waveforms include alpha (8-13Hz), delta (0.5-4Hz), beta (14-30Hz) and theta (4-8Hz).” Where depicting the various waveforms to label emotions comprises categorizing predetermined frequencies) Regarding claim 8, Yeow-Song as a combination teaches all of the limitations of claim 6. Yeow also teaches: wherein the brain wave in the predetermined frequency band is at least one of a high alpha wave, a low beta wave, a high beta wave, or a gamma wave. ([page 10, lines 18-19] “Different brainwave states tend to show activation in different brain regions and exhibit different waveforms of varying frequency, see plots 702-706. These waveforms include alpha (8-13Hz), delta (0.5-4Hz), beta (14-30Hz) and theta (4-8Hz).” Where the range of beta waves (14-30Hz) encompasses both a low and high beta wave, and depicting the various waveforms to label emotions comprises categorizing predetermined frequencies) Regarding claim 9, Yeow teaches all of the limitations of claim 3.
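Editor's note: the "heartbeat proportion" running through claims 4-6 is likewise a ratio, the power in a lower-frequency band of the heartbeat's power spectrum divided by the power in a higher-frequency band, compared against a predetermined threshold. The sketch below is illustrative only and not part of the Office Action or the cited references; the band edges (chosen to resemble conventional heart-rate-variability LF/HF bands) and all names are hypothetical.

```python
# Illustrative sketch of the claims 4-6 "heartbeat proportion": power in a
# lower-frequency band of the heartbeat power spectrum divided by power in
# a higher-frequency band, then compared to a threshold. Band edges and
# example spectrum values are hypothetical.

def band_power(psd: dict, lo: float, hi: float) -> float:
    """Total spectral power between lo (inclusive) and hi (exclusive)."""
    return sum(p for f, p in psd.items() if lo <= f < hi)

def heartbeat_proportion(psd: dict) -> float:
    low = band_power(psd, 0.04, 0.15)   # first (lower-frequency) band
    high = band_power(psd, 0.15, 0.40)  # second (higher-frequency) band
    return low / high

# Toy power spectral density: frequency (Hz) -> power.
psd = {0.05: 2.0, 0.10: 3.0, 0.20: 1.0, 0.30: 1.5}
THRESHOLD = 1.5

# Claim 5's "magnitude relationship": proportion after engagement vs. a
# predetermined threshold.
print(heartbeat_proportion(psd) > THRESHOLD)  # True
```

This is the sense in which the claim requires the second power spectrum's band to be higher in frequency than the first's.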
Yeow does not explicitly teach, but Song teaches: wherein: the at least one processor generates, by performing machine learning on a relationship between the brain wave information and the engagement and the engagement for putting the emotional state of the target person into a predetermined emotional state, an engagement inference model for making an inference of the engagement for putting the emotional state of the target person into the predetermined state based on the brain wave information and the emotional state of the target person. (see [Fig. 22] where the time-series feature map depicts the relationship between the brainwave information based on events that cause the EEG data to change; see also [0130] “The AI control module 400 for emotion-tailored CBT serves to perform control in order to receive EEG signals measured by the multi-channel EEG helmet module through the EEG-BCI module, to analyze the EEG signals with wavelet transformation and power spectrum, to generate emotional data according to the current brain wave of a user by performing inference while performing learning, and to transmit metaverse virtual space content for 1:1 emotion-tailored cognitive behavioral therapy appropriate for the generated emotional data to the metaverse driving HMD module.” Where transmitting 1:1 emotion-tailored cognitive behavior therapy comprises an engagement inference model for making an inference of the engagement for putting the state of the target person into the predetermined state based on the brain wave information and the state of the target person.) It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Yeow with the teachings of Song, with a reasonable expectation of success, by explicitly using Song’s artificial intelligence emotional data extraction and power spectrum analysis to map the time-series features of brainwave signals before and after events.
This would have provided a more in-depth display of information for healthcare providers and thus better communicated an emotional state of a user. Song is adaptable to Yeow as both inventions utilize supervised learning protocols that communicate via standardized networks. Yeow would have found Song’s teaching while attempting to solve “a problem in that the reliability of diagnosis is lowered due to the subjective determination of interpretation” [0013]. Regarding claim 18, Yeow-Song as a combination teaches all of the limitations of claim 1. Yeow also teaches: wherein the at least one processor changes a presentation mode of the emotional state of the target person based on the attribute information. (see [page 9, lines 7-11] above, where the presentation unit is a computer; see also [page 11, lines 16-33] “For the upload/download step according to one embodiment illustrated in Fig. 10, media 1000 such as photos, videos and music can be uploaded to the centralized receiving system to provide additional information on the selected time event 1002. In addition, the user can select their desired time events and download them to their personal electronic storage devices, with the option to convert the brainwave data with or without the personalized details into a certain export format 1004 based on different available templates such as, but not limited to, a fixed and/or customizable choice of diary, memopads, calendar, notebook, coasters, gifts etc.… This can provide a digital online capability for the input of personalized documents (comments/photos/files) based on each brainwave information triggered and recorded, and can further allow for the documentation, recording, tracking of events and storage of personal brainwave data collected over a span of a specified period of time in multimedia and printable format. Such an option can be especially advantageous for both personal reference and for institutional research references (e.g.
patient records and population-based healthcare monitoring of specific diseased patients).” Where the system presents brainwave information [i.e., based on the attribute information] with personalized details in forms like an online diary for a patient [i.e., a presentation mode of the state of the target person]) Response to Arguments Examiner thanks applicant for the response and will address the arguments in the order they were presented. Regarding pages 11 and 12, Applicant’s arguments regarding the interpretation under 112(f) and the rejection under 112(b) have been fully considered and, in light of the amended claim language, the 112(f) interpretation and 112(b) rejection are respectfully withdrawn. Regarding pages 12 and 13, Applicant’s arguments have been fully considered but are not persuasive. Applicant argues that “‘acquiring brain wave information’ cannot reasonably be in the category of mental process or organizing human activity”. Without conceding to the argument that this step cannot be attributed to an abstract idea, Examiner submits that this claim language was categorized as being directed to an extra-solution activity rather than an abstract idea (see Non-Final Office Action dated 08/27/2025 at Pg. 6). As such, this argument is not persuasive. Regarding pages 12-13, Applicant’s arguments have been fully considered but are not persuasive. Applicant argues that the claims are not well-known because the supporting rejection is weak. The Examiner respectfully disagrees.
MPEP 2106.05(d) states: “Another consideration when determining whether a claim recites significantly more than a judicial exception is whether the additional element(s) are well-understood, routine, conventional activities previously known to the industry (emphasis added).” Further, MPEP 2106.05(I) states: “As made clear by the courts, the novelty of any element or steps in a process, or even of the process itself, is of no relevance in determining whether the subject matter of a claim falls within the § 101 categories of possibly patentable subject matter (internal quotations omitted, emphasis original).” As such, it is only the additional elements identified by the Examiner to not be part of the abstract idea that are analyzed to determine whether they represent well-understood, routine, conventional activities in the field of the invention. In that regard, MPEP 2106.05(d)(I) indicates that in determining whether the additional elements represent well-understood, routine, conventional activities, the Examiner should consider whether the additional elements (1) provide an improvement to the technological environment to which the claim is confined, (2) whether the additional elements are mere instructions to apply the judicial exception, or (3) whether the additional elements represent insignificant extra-solution activity. The additional elements of the claims do not provide significantly more based on this inquiry. Taking these in turn, whether the additional elements of the claim provide an improvement was analyzed and addressed in the Step 2A Prong Two analysis, and no improvement was found. The technological environment to which the claims are confined (a general-purpose computer performing generic computer functions [see Spec. Para. 0086]) is recited at a high level of generality and has been found by the courts to be insufficient to provide a practical application (see MPEP 2106.05(d)(II); Alice Corp.).
No additional elements of the claim were found to represent extra-solution activity, and thus no well-understood, routine, conventional analysis is required. Regarding page 13, Applicant’s arguments have been fully considered but are not persuasive. Applicant argues that the claims are significant because the art does not cover the claims. MPEP 2106.04(d) recites the subject matter eligibility guidance to determine whether the claims recite eligible subject matter by integrating a judicial exception into a practical application. It states that “Step 2A Prong Two is similar to Step 2B in that both analyses involve evaluating a set of judicial considerations to determine if the claim is eligible.” Examiner utilizes the Courts’ rulings in previous cases to evaluate a set of judicial considerations which distinguish steps that are significantly more than the abstract idea. These judicial considerations do not merely apply prior art to interpret the claims under the MPEP guidance for Subject Matter Eligibility but rather view the claims as a whole under the Courts’ previous rulings. Therefore, Examiner maintains that the claims do not integrate the abstract idea into a practical application. Regarding page 13, Applicant’s arguments have been fully considered but are not persuasive. Applicant argues that the claims are necessarily more than gathering and outputting data. Examiner finds no rationale supporting an interpretation different from the Step 2A analysis provided under MPEP 2106.05. Therefore, Examiner maintains that the claims are directed toward ineligible subject matter. Regarding pages 13-16, Applicant’s arguments have been fully considered but are not persuasive. Applicant argues that the prior art does not teach the amended claims in addition to the attribute information. While the argument is moot in view of the amended claim language, Examiner will further elaborate on the interpretation of attribute information.
When interpreting the breadth of “attribute information”, Examiner turned to paragraph [0066] of Applicant’s specification, which states: “the attribute information may include information related to at least one of an age, a gender, an occupation, or a preference of the human being.” Under broadest reasonable interpretation, the prior art of record encompasses a preference of a user because it gathers the user’s preferences to automatically export personalized details in multimedia documentation. This preference is elaborated upon in Yeow [page 20], which discloses privacy options to share the user’s brainwaves with people on social media and user preferences for sharing brainwave information among the user’s relationships. Therefore, Examiner maintains that the prior art teaches the claims as written.

Additional Considerations

The prior art made of record and not relied upon that is considered pertinent to applicant’s disclosure can be found on the PTO-892 of the prior office action dated August 27th, 2025. The following prior art is considered relevant but not cited. Fung et al. (US 20230211205) comprises a gamified healthcare diagnostics and treatment module integrated into a platform comprising a HIPAA-compliant security gateway, an AI-assisted healthcare diagnostics module, and an alert and treatments module. Weng et al. (US Pat. 11531393) discloses a human-computer interface that measures various brain waves and other physiological sensors to determine an emotion predictive signal. Shimizu et al. (US Pat. 12426816) contains an emotion detection apparatus which determines the emotional state of a subject based on heart rate variations.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROBERT ANTHONY SKROBARCZYK, whose telephone number is (571) 272-3301. The examiner can normally be reached Monday through Friday, 7:30 AM-5 PM CST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kambiz Abdi, can be reached at 571-272-6702. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/R.A.S/
Examiner, Art Unit 3792

/KAMBIZ ABDI/
Supervisory Patent Examiner, Art Unit 3685

Prosecution Timeline

Jun 24, 2024
Application Filed
Aug 22, 2025
Non-Final Rejection — §101, §102, §103
Nov 19, 2025
Interview Requested
Dec 08, 2025
Applicant Interview (Telephonic)
Dec 09, 2025
Examiner Interview Summary
Dec 23, 2025
Response Filed
Jan 27, 2026
Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12527889
SYSTEM AND METHOD FOR MAINTAINING STERILE FIELDS IN A MONITORED ENCLOSURE
2y 5m to grant Granted Jan 20, 2026
Patent 12502067
Cloud Based Corneal Surface Difference Mapping System and Method
2y 5m to grant Granted Dec 23, 2025
Patent 12469593
COMPUTER-BASED SYSTEMS WITH IMPLEMENTING A SOFTWARE PLATFORM AND METHODS OF USE THEREOF
2y 5m to grant Granted Nov 11, 2025


Prosecution Projections

3-4
Expected OA Rounds
20%
Grant Probability
58%
With Interview (+37.5%)
2y 8m
Median Time to Grant
Moderate
PTA Risk
Based on 10 resolved cases by this examiner. Grant probability derived from career allow rate.
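The projection figures above are consistent with simple arithmetic on the examiner's career data: 2 grants out of 10 resolved cases gives the 20% grant probability, and adding the 37.5-point interview lift yields the 58% with-interview figure. A minimal sketch of that arithmetic, assuming (hypothetically, not as the tool's actual methodology) that grant probability equals the career allow rate and that the interview lift is applied as an additive percentage-point bump:

```python
# Illustrative reconstruction of the dashboard's projection math.
# Function names and formulas are assumptions inferred from the
# displayed numbers, not the product's documented methodology.

def grant_probability(granted: int, resolved: int) -> float:
    """Career allow rate: grants divided by resolved cases."""
    return granted / resolved

def with_interview(base: float, lift: float) -> float:
    """Apply the interview lift as an additive bump, capped at 100%."""
    return min(base + lift, 1.0)

base = grant_probability(2, 10)        # 0.20 -> shown as "20%"
boosted = with_interview(base, 0.375)  # ~0.575 -> shown as "58%"
print(f"base={base:.0%}, with interview={boosted:.1%}")
```

Under this assumed model, the "+37.5% Interview Lift" is a percentage-point delta over the base rate rather than a multiplicative factor; the cap at 1.0 simply keeps the estimate from exceeding 100% for examiners with high base rates.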
