Prosecution Insights
Last updated: April 19, 2026
Application No. 16/620,415

Method of Constructing Database

Non-Final OA (§101, §102, §103)
Filed: Dec 06, 2019
Examiner: WILLIAMS, TERESA S
Art Unit: 3687
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Smart Beat Profits Limited
OA Round: 7 (Non-Final)
Grant Probability: 24% (At Risk)
Projected OA Rounds: 7-8
Projected Time to Grant: 5y 0m
Grant Probability With Interview: 42%

Examiner Intelligence

Career Allow Rate: 24% (107 granted / 438 resolved; -27.6% vs TC avg)
Interview Lift: +18.0% (resolved cases with interview)
Avg Prosecution: 5y 0m (48 currently pending)
Total Applications: 486 (career history, across all art units)
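The headline numbers above are simple ratios. As a minimal sketch of how they could be derived (an assumption about the dashboard's arithmetic; the variable names are illustrative, not the vendor's actual code):

```python
# Illustrative reconstruction of the examiner's headline metrics (assumed formulas).
granted, resolved = 107, 438  # career totals shown above

career_allow_rate = granted / resolved   # ~0.244, displayed as 24%
interview_lift = 0.18                    # +18.0% lift among resolved cases with interview
allow_rate_with_interview = career_allow_rate + interview_lift

print(f"Career Allow Rate: {career_allow_rate:.0%}")       # Career Allow Rate: 24%
print(f"With Interview: {allow_rate_with_interview:.0%}")  # With Interview: 42%
```

This reproduces the dashboard's 24% baseline and 42% with-interview figures, consistent with the lift being stated as a simple additive delta on the career allowance rate.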

Statute-Specific Performance

§101: 31.8% (-8.2% vs TC avg)
§103: 40.4% (+0.4% vs TC avg)
§102: 13.3% (-26.7% vs TC avg)
§112: 11.3% (-28.7% vs TC avg)

Deltas are relative to the Tech Center average estimate • Based on career data from 438 resolved cases
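Each statute's delta is stated against the Tech Center average. As a quick consistency check (a sketch under the assumption that delta = allowance rate − TC average; not the analytics provider's code), subtracting each delta from its rate should recover the same implied TC average:

```python
# Allowance rate (%) and delta vs. TC average (%), copied from the table above.
stats = {
    "§101": (31.8, -8.2),
    "§103": (40.4, +0.4),
    "§102": (13.3, -26.7),
    "§112": (11.3, -28.7),
}

for statute, (rate, delta) in stats.items():
    # rate = tc_avg + delta  =>  tc_avg = rate - delta
    implied_tc_avg = round(rate - delta, 1)
    print(f"{statute}: implied TC average = {implied_tc_avg}%")  # 40.0% for all four rows
```

All four rows imply the same 40.0% Tech Center average, which suggests the deltas were computed against a single pooled baseline rather than per-statute averages.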

Office Action

Rejections under §101, §102, and §103
DETAILED ACTION

Status of Claims

Notice of Pre-AIA or AIA Status: The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This communication is in response to the Request for Continued Examination filed 11/05/2025. Claims 1-5, 7, 11 and 13 have been cancelled. Claims 6, 9-10 and 12 have been amended. Claims 14-16 have been newly added. Claims 6, 8-10, 12 and 14-16 are currently pending and have been examined.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11/05/2025 has been entered.

Claim Rejections – 35 USC § 101

35 U.S.C. 101 reads as follows: “Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.”

Claims 6, 8-10, 12 and 14-16 are rejected under 35 U.S.C. §101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1: Claims 6, 8-9, 12 and 14 are directed to methods (i.e., a process) and claims 10 and 15-16 are directed to a system (i.e., a machine). Accordingly, claims 6, 8-10, 12 and 14-16 are all within at least one of the four statutory categories.

Step 2A – Prong One: Regarding Prong One of Step 2A, the claim limitations are to be analyzed to determine whether they “recite” a judicial exception or, in other words, whether a judicial exception is “set forth” or “described” in the claims. 
An “abstract idea” judicial exception is subject matter that falls within at least one of the following groupings: (a) mathematical concepts, (b) certain methods of organizing human activity, and/or (c) mental processes. Representative independent claim 6 includes limitations that recite an abstract idea. Note that independent claim 10 is a system claim. Specifically, independent claim 10 recites:

A collecting and visualizing system, comprising:
a heart rate variability sensor that is configured to determine heart rate variability data of a user and to be mounted to the body of the user or installed in a vicinity of the user; wherein the heart rate variability sensor is configured to send first vital data comprising the heart rate variability data;
at least one camera installed in a vicinity of the user for generating image data indicating movements of at least one body part of the user; wherein the plurality of sensors are configured to send second vital data comprising the second information data;
a correlation information database storing a first data correlation associating a particular heart rate variability data with a particular stress level of the user;
an analyzer circuit for: during a first time period: receiving (a) the first vital data comprising the heart rate variability data and (b) a first quantity of image data generated by the at least one camera indicating particular movement of at least one particular body part; calculating, based on the received first vital data and the received first quantity of image data indicating the particular movement of the at least one particular body part, a coefficient of correlation between the particular movement of the at least one particular body part and the particular heart rate variability data; determining if the coefficient of correlation is equal to or larger than a prescribed value; and in response to determining the coefficient of correlation is equal to or larger than the prescribed value, storing a second data correlation associating the particular movement of the at least one particular body part with the particular stress level of the user correlated with the particular heart rate variability according to the stored first data correlation; and
during a second time period: receiving a second quantity of image data generated by the at least one camera; determining the second quantity of image data indicates the particular movement of the at least one particular body part, and in response, automatically estimating the particular stress level of the user based on the stored second data correlation which associates the particular movement of the at least one particular body part with the particular stress level of the user;
an avatar data creating circuit for automatically generating an electronic avatar based on the estimated particular stress level of the user and the stored correlation information, the electronic avatar indicating the particular stress level of the user; and
a display device for displaying the electronic avatar reflecting the particular stress level of the user.

The Examiner submits that the foregoing underlined limitations constitute: (a) “certain methods of organizing human activity” because estimating a stress level of the user, maintaining records about a user’s heart rate, correlating physiological data and personal data associated with identification of the user, and estimating the particular stress level of the user based on the stored data correlation which associates the particular movement of the at least one particular body part with the particular stress level of the user are ways of managing a person’s physical behavior/activities, assessing the personal health of a patient, and are interactions between people. 
Furthermore, these limitations constitute (b) a “mathematical concept” because calculating and determining that a coefficient of correlation is equal to or larger than a prescribed value based on the received first vital data and the received first quantity of image data indicating the particular movement of the at least one particular body part relates to mathematical relationships and calculations. Accordingly, the claim describes at least one abstract idea.

Furthermore, dependent claims 8-9, 12 and 14-16 further define the at least one abstract idea (and thus fail to make the abstract idea any less abstract) as set forth below. In relation to claims 8 and 9 (similarly to claim 6), these claims merely recite specific kinds of information that is obtained, such as the second information data comprising a change in a tone of voice of the user and a spinal extension of the user, or non-vital data comprising one or more of a date of birth, time or place of birth, a blood type, a DNA profile of the user, a result of a divination, or an assessment of vital data by the user.

Claims 12 and 14-16 recite determining steps, such as: claim 12 – analyzing a correlation between multiple vital data and non-vital data, estimating one or more of a mental status, a state of health, or a state of activity of the user based on the vital data and non-vital data, and generating the avatar based on the one or more of the mental status, state of health, or state of activity of the user; claim 14 – in response to the received user input, either (a) displaying on the electronic avatar a prior stress level of the user associated with a prior time, (b) displaying an internal view of the electronic avatar, (c) or displaying a cross-section of the electronic avatar; claim 15 – image data indicating particular movement of at least one particular body part comprises image data indicating a particular eye movement; and claim 16 – image data indicating particular movement of at least one particular body part comprises image data indicating a particular movement of the face or scalp.

Step 2A – Prong Two: Regarding Prong Two of Step 2A, it must be determined whether the claim as a whole integrates the abstract idea into a practical application. As noted, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements that merely use a computer to implement an abstract idea, add insignificant extra-solution activity, or generally link use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.” For the following reasons, the Examiner submits that the above-identified additional limitations do not integrate the above-noted at least one abstract idea into a practical application. 
Regarding the additional limitations of a collecting and visualizing computer system that includes an information processing server, a processor, a storage unit, a processing server, a user management database, an electronic avatar, a display device and a correlation information database, the Examiner submits that these limitations amount to merely using a computer to perform the at least one abstract idea (see MPEP § 2106.05(f)) and are mere instructions to apply the above-noted at least one abstract idea (Id.). The judicial exception is not integrated into a practical application. In particular, the collecting and visualizing computer system that includes the information processing server, processor, storage unit, processing server, user management database, electronic avatar, display device and correlation information database is recited at a high level of generality (i.e., as generic computer components performing generic computer functions of receiving data/inputs, determining and providing data) such that it amounts to no more than mere instructions to apply the exception using generic computer components.

Regarding the additional limitations “receiving (a) the first vital data comprising the heart rate variability data and (b) a first quantity of image data” and “receiving a second quantity of image data generated by the at least one camera,” the Examiner submits that these limitations merely add insignificant pre-solution activity (data gathering; selecting data to be manipulated) to the at least one abstract idea (see MPEP § 2106.05(g)).

Regarding the additional limitation “automatically generating an electronic avatar based on the estimated particular stress level of the user and the stored correlation information, the electronic avatar indicating the particular stress level of the user,” the Examiner submits that this limitation amounts to merely using a computer to perform the at least one abstract idea (see MPEP § 2106.05(f)) and is a mere instruction to apply the above-noted at least one abstract idea (Id.).

Thus, taken alone, the additional elements do not integrate the at least one abstract idea into a practical application. Looking at the additional limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. For instance, there is no indication that the additional elements, when considered as a whole, reflect an improvement in the functioning of a computer or an improvement to another technology or technical field, apply or use the above-noted judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition, implement or use the above-noted judicial exception with a particular machine or manufacture that is integral to the claim, effect a transformation or reduction of a particular article to a different state or thing, or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is not more than a drafting effort designed to monopolize the exception (see MPEP § 2106.05). Thus, claims 6, 8-10, 12 and 14-16 as a whole do not integrate the above-noted at least one abstract idea into a practical application. 
Step 2B: Regarding Step 2B, in representative independent claim 10, regarding the additional limitations of the collecting and visualizing computer system that includes the information processing server, processor, storage unit, processing server, user management database, electronic avatar, display device and correlation information database, the Examiner submits that these limitations amount to merely using a computer to perform the at least one abstract idea (see MPEP § 2106.05(f)). Thus, representative independent claim 6 and analogous independent claim 10 do not include additional elements (considered both individually and as an ordered combination) that are sufficient to amount to significantly more than the judicial exception, for the same reasons as those discussed above with respect to determining that the claim does not integrate the abstract idea into a practical application. In the dependent claims of claims 6 and 10 (dependent claims 8-9, 12 and 14-16), there are no additional elements. Therefore, claims 6, 8-10, 12 and 14-16 are ineligible under 35 USC §101.

Claim Rejections – 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 6, 10 and 14-16 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Jain (US 2017/0245759 A1).

Claim 6: Jain discloses A method for collecting and visualizing information related to a living body of a user with an information processing server including a processor and a storage unit (See Fig. 
1, P0046, P0055 processor, memory, servers and prompting patient images of particular parts of the body in P0061.), comprising: storing, in a correlation information database in the storage unit, a first data correlation between a particular heart rate variability of the user and a particular stress level of the user (See trends and magnitude of HRV changes assigned a proportional weight based upon the physiological traits of the given person programmable using databases mentioned in P0255. See P0312-P0313 reference database of facial regions for generating predictive models for “patient under stress”. Also, see data structures P0050-P0051.); during a first time period: generating, by a first sensor mounted to the body of a user or installed in a vicinity of the user, first vital data, comprising detected heart rate variability data indicating the particular heart rate variability (See collected patient data over time in P0067, including motion sensor in P0039, proximity sensor in P0051, wearable computing device and sensors in P0055, P0162. 
See P0117 HR variability (HRV) analysis determined from sensor data.); generating a first quantity of second vital data by at least one sensor mounted to the body of the user or installed in a vicinity of the user, which second vital data comprises second information data indicating one or more of blood pressure, body temperature, a movement of an eye, a movement of muscle, or a voice of the user (See P0043, P0205 voice recognition and analysis, detected movement, P0051 heart rate sensor, P0059 detecting chronotropic incompetence (CI) or degradation in ventilation threshold, P0070-P0071, P0102 rating of perceived exertion (RPE) as scoring, measure of physical activity intensity level, P0095-P0096 estimate of activity level (EAL), P0210 blood pressure during sleep and P0261, P0270-P0271 detecting the approximate position of the patient's eye and dilation in the pupil.); calculating, by an analyzer circuit, based on the detected heart rate variability data indicating the particular heart rate variability and the second vital data, a coefficient of correlation between the particular heart rate variability data and the second information data (See P0086 where fever, blood pressure, or other Autonomic Nervous System (ANS) markers serve as the second vital data and P0117-P0119 where HRV baseline correlate with estimate of activity level (EAL), detecting chronotropic incompetence (CI) or degradation in ventilation threshold, rating of perceived exertion (RPE) and measure of physical activity intensity level. Also, see exemplary estimated RPE serve as a coefficient in Fig. 8 steps 825, 830, 835 mentioned in P0181-P0183.); determining, by the analyzer circuit, if the coefficient of correlation is equal to or larger than a prescribed value (See exemplary estimated RPE serve as a coefficient in Fig. 
8 steps 825, 830, 835 mentioned in [P0181-P0183] the system is capable of determining whether the estimated RPE is greater than or equal to a threshold RPE value.); and in response to determining the coefficient of correlation between the particular heart rate variability data and the second information data is equal to or larger than the prescribed value (See [P0167] the system is capable of determining whether the user is subject to stress and whether the amount of stress exceeds a baseline amount of stress based upon heart rate (e.g., energy) and heart rate variability (e.g., mood) of the user. Also, see P0212-P0213 and exceeded threshold in P0219-P0220.), storing, by the analyzer circuit, in the correlation information database in the storage unit, a second data correlation which associates the second information data with the particular stress level of the user correlated with the particular heart rate variability according to the stored first data correlation (See trends and magnitude of HRV changes assigned a proportional weight based upon the physiological traits of the given person programmable using databases mentioned in P0255. See P0312-P0313 reference database of facial regions for generating predictive models for “patient under stress”. Also, see data structures P0050-P0051.); and during a second time period: receiving a second quantity of second vital data from at least one sensor (See collected patient data over time in P0067, including motion sensor in P0039, proximity sensor in P0051, wearable computing device and sensors in P0055, P0162. See P0117 HR variability (HRV) analysis determined from sensor data. 
Also, see HRV profile and an amount of chronic stress episodes experienced by the patient mentioned in P0215-P0216.); and determining the second quantity of second vital data includes information data matching the second information data of the first quantity of second vital data, and in response, automatically estimating the particular stress level of the user based on the stored second data correlation which associates the second information data with the particular stress level of the user (See baseline stress level comparisons in [P0212-P0213] the system is capable of determining whether the patient is subject to stress and whether the amount of stress exceeds a baseline amount of stress based upon HR (e.g., energy) and HRV (e.g., mood) of the patient both being low (e.g., below a baseline for HR and/or a baseline for HRV) at the same time and/or remaining low for at least a minimum amount of time, and the system is capable of determining whether the patient is subject to stress and whether the amount of stress exceeds a baseline amount of stress based upon HR (e.g., energy) and HRV (e.g., mood) of the patient; with the HR being high (above a certain baseline).); and in response to the automatically estimating the particular stress level of the user, automatically displaying, on a display device, an electronic avatar indicating the particular stress level of the user (See Fig. 14 and interactive 3D model as an electronic avatar in [P0288-P0300] The image data may be taken from multiple, different viewpoints. The system uses the image data to generate a three dimensional (3D) model of the body part of the patient. In one aspect, the 3D model is converted into a morphable model that allows the system to analyze an amount of pitting in the body part. The morphable model is manipulated through patient intervention (e.g., user inputs) or programmatically (e.g., automatically) to estimate the level of edema suffered by the patient. Also, see Fig. 
9, P0215 and survey questioning the patient’s mood in P0218-P0219.).

Claim 10: Jain discloses A collecting and visualizing system (See Fig. 1, P0046, P0055 processor, memory, servers and prompting patient images of particular parts of the body in P0061.), comprising: a heart rate variability sensor that is configured to determine heart rate variability data of a user and to be mounted to the body of the user or installed in a vicinity of the user (See trends and magnitude of HRV changes assigned a proportional weight based upon the physiological traits of the given person programmable using databases mentioned in P0255. See collected patient data over time in P0067, including motion sensor in P0039, proximity sensor in P0051, wearable computing device and sensors in P0055, P0162.); wherein the heart rate variability sensor is configured to send first vital data comprising the heart rate variability data (See P0117 HR variability (HRV) analysis determined from sensor data.); at least one camera installed in a vicinity of the user for generating image data indicating movements of at least one body part of the user (See camera for facial recognition sensor in P0216. 
Also, see motion sensor in P0200, exemplary camera, ECG and/or ultrasound sensors in P0255.); wherein the plurality of sensors are configured to send second vital data comprising the second information data (See detected heart rate, blood pressure in P0086, P0210, exemplary camera, ECG and/or ultrasound sensors in P0255.); a correlation information database storing a first data correlation associating a particular heart rate variability data with a particular stress level of the user (See [P0212-P0213] the system is capable of determining whether the patient is subject to stress and whether the amount of stress exceeds a baseline amount of stress based upon HR (e.g., energy) and HRV (e.g., mood) of the patient both being low (e.g., below a baseline for HR and/or a baseline for HRV) at the same time and/or remaining low for at least a minimum amount of time, and the system is capable of determining whether the patient is subject to stress and whether the amount of stress exceeds a baseline amount of stress based upon HR (e.g., energy) and HRV (e.g., mood) of the patient; with the HR being high (above a certain baseline).); an analyzer circuit for: during a first time period: receiving (a) the first vital data comprising the heart rate variability data and (b) a first quantity of image data generated by the at least one camera indicating particular movement of at least one particular body part (See correlated heart rate sensor 124, camera subsystem 126 and ECG sensor accessed for analysis in P0040, P0051-P0052.); calculating, based on the received first vital data and the received first quantity of image data indicating the particular movement of the at least one particular body part, a coefficient of correlation between the particular movement of the at least one particular body part and the particular heart rate variability data (See P0086 where fever, blood pressure, or other Autonomic Nervous System (ANS) markers serve as the second vital data and P0117-P0119 
where HRV baseline correlate with estimate of activity level (EAL), detecting chronotropic incompetence (CI) or degradation in ventilation threshold, rating of perceived exertion (RPE) and measure of physical activity intensity level. Also, see exemplary estimated RPE serve as a coefficient in Fig. 8 steps 825, 830, 835 mentioned in P0181-P0183.); determining if the coefficient of correlation is equal to or larger than a prescribed value (See exemplary estimated RPE serve as a coefficient in Fig. 8 steps 825, 830, 835 mentioned in [P0181-P0183] the system is capable of determining whether the estimated RPE is greater than or equal to a threshold RPE value.); and in response to determining the coefficient of correlation is equal to or larger than the prescribed value, storing a second data correlation associating the particular movement of the at least one particular body part with the particular stress level of the user correlated with the particular heart rate variability according to the stored first data correlation (See exemplary estimated RPE serve as a coefficient in Fig. 8 steps 825, 830, 835 mentioned in [P0181-P0183] the system is capable of determining whether the estimated RPE is greater than or equal to a threshold RPE value.); and during a second time period: receiving a second quantity of image data generated by the at least one camera (See camera, measured emotional state and mood detected from patient’s facial expression and features in P0216.); determining the second quantity of image data indicates the particular movement of the at least one particular body part, and in response (See P0216 the camera and facial recognition sensors in P0216.) 
, automatically estimating the particular stress level of the user based on the stored second data correlation which associates the particular movement of the at least one particular body part with the particular stress level of the user (See P0039, P0249 where questions are adapted to inquire about the current mood of the patient and correlated movement of the patient serve as indicating the particular movement of the particular body part, in response, automatically estimating the particular stress level.); an avatar data creating circuit for automatically generating an electronic avatar based on the estimated particular stress level of the user and the stored correlation information, the electronic avatar indicating the particular stress level of the user; and a display device for displaying the electronic avatar reflecting the particular stress level of the user (See Fig. 14 and interactive 3D model as an electronic avatar in [P0288-P0300] The image data may be taken from multiple, different viewpoints. The system uses the image data to generate a three dimensional (3D) model of the body part of the patient. In one aspect, the 3D model is converted into a morphable model that allows the system to analyze an amount of pitting in the body part. The morphable model is manipulated through patient intervention (e.g., user inputs) or programmatically (e.g., automatically) to estimate the level of edema suffered by the patient. Also, see Fig. 9, P0215 and survey questioning the patient’s mood in P0218-P0219.). 
Regarding claim 14, Jain discloses the method according to claim 6, comprising: receiving input via a user interface associated with the display device; and in response to the received user input, either (a) displaying on the electronic avatar a prior stress level of the user associated with a prior time, (b) displaying an internal view of the electronic avatar, (c) or displaying a cross-section of the electronic avatar (See [P0288-P0289] The image data may be taken from multiple, different viewpoints. The system uses the image data to generate a three dimensional (3D) model of the body part of the patient. In one aspect, the 3D model is converted into a morphable model that allows the system to analyze an amount of pitting in the body part. The morphable model is manipulated through patient intervention (e.g., user inputs) or programmatically (e.g., automatically) to estimate the level of edema suffered by the patient, and the system determines that there is a likelihood that the patient is under emotional pressure such as being stressed, fearful, or in pain. Accordingly, the system prompts the patient to capture image data.).

Regarding claim 15, Jain discloses the system according to claim 10, wherein the image data generated by the at least one camera indicating particular movement of at least one particular body part comprises image data indicating a particular eye movement (See detecting movement in P0043, P0051, P0249, image analysis of dilated pupils in P0264, P0268-P0269, P0276-P0277.).

Regarding claim 16, Jain discloses the system according to claim 10, wherein the image data generated by the at least one camera indicating particular movement of at least one particular body part comprises image data indicating a particular movement of the face or scalp (See Fig. 15 analysis of facial regions/features mentioned in P0305-P0311.).

Claim Rejections – 35 USC § 103

The following is a quotation of 35 U.S.C. 
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Jain (US 2017/0245759 A1) in view of Moffat (U.S. 10,234,938 B2).

Regarding claim 8, although Jain teaches the method according to claim 6 as mentioned above, Jain does not explicitly teach changes in spinal extension. Moffat teaches wherein the second information additionally comprises one or more of a change in a tone of voice of the user, and a spinal extension of the user (See column 26, line 66 to column 27, line 7, where semispinalis muscles are measured and quantized. See communication tasked using one’s voice in column 2, lines 27-31. Also, see column 1, lines 31-44.). 
Therefore, it would have been obvious to one of ordinary skill in the art of user body distortion before the effective filing date of the claimed invention to modify the method of Jain to include changes in spinal extension as taught by Moffat in order to provide helpful suggestions that a break is needed from overexertion.

Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Jain (US 2017/0245759 A1) in view of Halperin (US 2010/0070455 A1).

Regarding claim 9, although Jain discloses the method according to claim 6 mentioned above, Jain does not explicitly teach that the second information is information such as a date of birth and a DNA profile of the user. Halperin teaches that it was known in the art of environmental and genetic risk studies at the time of filing to include: wherein the second information data additionally comprises non-vital data, comprising one or more of a date of birth, time or place of birth, a blood type, a DNA profile of the user, a result of a divination, or an assessment of vital data by the user (see obtained DNA samples in P0012, genomic profile and individual’s age and birthplace in P0041.). Therefore, it would have been obvious to one of ordinary skill in the art of environmental and genetic risk studies before the effective filing date of the claimed invention to modify the method of Jain to include second information such as a date of birth and a DNA profile of the user as taught by Halperin in order to implement early medical intervention among individuals in a particular cohort.

Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Jain (US 2017/0245759 A1) in view of Cheng (US 2015/0264432 A1). 
Regarding claim 12, although Jain discloses the method according to claim 6 mentioned above, Jain does not explicitly teach analyzing a correlation between multiple vital data and non-vital data, estimating a mental status, a state of health, or a state of activity, and generating an electronic avatar based on the estimated state of the user.

Cheng teaches successively analyzing a correlation between multiple vital data and non-vital data (see P0017, where relating a variety of parameters construes a correlation between multiple vital data and non-vital data); estimating one or more of a mental status, a state of health, or a state of activity of the user based on vital data and non-vital data (in P0017, media program manager 110 may determine that a condition or criteria is satisfied based on user state 161, the condition being associated with a type of media program; the condition may relate to a variety of parameters associated with user state 161, such as a user's current, past, and/or future activity, mood or emotions, sleep quality or quantity, schedule, and the like); and generating the avatar based on the one or more of the mental status, state of health, or state of activity of the user (an avatar is taught in P0030, with exemplary user states such as being tired or happy, and in P0036-P0037 information associated with one or more user states may include avatars 462-463; also see the wearable device with a sensor to monitor heart rate in P0022, and the state of the user in P0026, Fig. 1, Fig. 4).
Therefore, it would have been obvious to one of ordinary skill in the art of user state media correlations before the effective filing date of the claimed invention to modify the system of Jain to include analyzing a correlation between multiple vital data and non-vital data, estimating a mental status, a state of health, or a state of activity, and generating an electronic avatar based on the estimated state of the user, as taught by Cheng, in order to pinpoint the exact activities that affect the user's good or bad mood so that a suggestion can be offered to the user.

Response to Arguments

Applicant's arguments and amendments, see page 7, filed 10/07/2025, with respect to the 112 rejections have been fully considered and are persuasive. The 112 rejections of claims 6, 8-10 and 12 have been withdrawn.

Applicant argues that the present amendments to claims 6 and 10 further integrate the alleged abstract idea into a practical application (see pgs. 7-8 of Remarks). The Examiner disagrees. The claimed features, "the heart rate variability sensor is configured to send first vital data comprising the heart rate variability data" and "at least one camera installed in a vicinity of the user for generating image data indicating movements of at least one body part of the user", for determining an estimated stress level of the user, merely add insignificant pre-solution activity (data gathering; selecting data to be manipulated) to the at least one abstract idea. Viewing readings taken from a heart rate variability sensor and viewing a facial expression captured by a camera to determine whether a person is experiencing stress does not require technology, and does not even need a computer; it can be performed by a human.
Furthermore, superimposing the avatar to pulsate with a heart rate, facial expression, and color, and estimating the emotions or stress levels of the user, rises only to the level of "apply it": implementing the abstract idea merely uses the computer in the manner in which it was designed to be used, i.e., performing generic computer functions.

Applicant's arguments have been fully considered, but are now moot in view of the new grounds of rejection. The Examiner has entered a new rejection under 35 USC § 102 and applied new art and art already of record.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to TERESA S WILLIAMS, whose telephone number is (571) 270-5509. The examiner can normally be reached Mon-Fri, 8:30 am - 6:30 pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Mamon Obeid, can be reached at (571) 270-1813. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/T.S.W./
Examiner, Art Unit 3687
02/28/2026

/ALAAELDIN M. ELSHAER/
Primary Examiner, Art Unit 3687

Prosecution Timeline

Dec 06, 2019
Application Filed
Jan 29, 2022
Non-Final Rejection — §101, §102, §103
May 16, 2022
Response Filed
Aug 26, 2022
Final Rejection — §101, §102, §103
Dec 02, 2022
Interview Requested
Dec 02, 2022
Response after Non-Final Action
Dec 06, 2022
Applicant Interview (Telephonic)
Dec 06, 2022
Response after Non-Final Action
Jan 03, 2023
Request for Continued Examination
Jan 05, 2023
Response after Non-Final Action
Jan 13, 2023
Non-Final Rejection — §101, §102, §103
Apr 19, 2023
Response Filed
Jul 28, 2023
Final Rejection — §101, §102, §103
Feb 20, 2024
Response after Non-Final Action
Aug 14, 2024
Request for Continued Examination
Oct 23, 2024
Response after Non-Final Action
Dec 04, 2024
Non-Final Rejection — §101, §102, §103
Mar 18, 2025
Response Filed
Jun 28, 2025
Final Rejection — §101, §102, §103
Oct 07, 2025
Response after Non-Final Action
Nov 05, 2025
Request for Continued Examination
Nov 10, 2025
Response after Non-Final Action
Mar 02, 2026
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12396675
METHODS OF ASSESSING HEPATIC ENCEPHALOPATHY
2y 5m to grant Granted Aug 26, 2025
Patent 12266431
MACHINE LEARNING ENGINE AND RULE ENGINE FOR DOCUMENT AUTO-POPULATION USING HISTORICAL AND CONTEXTUAL DATA
2y 5m to grant Granted Apr 01, 2025
Patent 12205725
METHODS AND APPARATUS FOR EVALUATING DEVELOPMENTAL CONDITIONS AND PROVIDING CONTROL OVER COVERAGE AND RELIABILITY
2y 5m to grant Granted Jan 21, 2025
Patent 12191035
ADVANCED AUGMENTED ASSISTIVE DEVICE FOR DEMENTIA, ALZHEIMER'S DISEASE, AND VISUAL IMPAIRMENT PATIENTS
2y 5m to grant Granted Jan 07, 2025
Patent 12159698
SYSTEM FOR PROVIDING AGGREGATED PATIENT DATA
2y 5m to grant Granted Dec 03, 2024
Based on this examiner's 5 most recent grants.


Prosecution Projections

7-8
Expected OA Rounds
24%
Grant Probability
42%
With Interview (+18.0%)
5y 0m
Median Time to Grant
High
PTA Risk
Based on 438 resolved cases by this examiner. Grant probability derived from career allow rate.
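The projection figures above follow from simple arithmetic: the base grant probability is the examiner's career allow rate (107 granted of 438 resolved, per the Examiner Intelligence panel), and the with-interview figure adds the observed +18-point interview lift. A minimal sketch of that arithmetic, assuming the tool combines only these two inputs (the real model may weight additional factors):

```python
# Sketch of the grant-probability figures shown in Prosecution Projections.
# Assumption: the tool adds the interview lift (in percentage points)
# directly to the career allow rate.

def allow_rate_pct(granted: int, resolved: int) -> int:
    """Career allow rate as a whole-number percentage."""
    return round(100 * granted / resolved)

def with_interview_pct(base_pct: float, lift_pts: float) -> float:
    """Base grant probability plus the interview lift, in points."""
    return base_pct + lift_pts

base = allow_rate_pct(107, 438)          # 24, matching the 24% shown
boosted = with_interview_pct(base, 18.0)  # 42.0, matching the 42% shown
print(base, boosted)
```

This also makes the panel's internal consistency easy to check: 107/438 rounds to 24%, and 24% + 18 points is exactly the 42% quoted for prosecution with an interview.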
