Prosecution Insights
Last updated: April 19, 2026
Application No. 18/629,437

METHOD AND APPARATUS FOR DETERMINING HEALTH STATUS

Non-Final OA: §101, §102, §103, §DP
Filed: Apr 08, 2024
Examiner: SISON, CHRISTINE ANDREA PAN
Art Unit: 3796
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Aicure Corporation
OA Round: 1 (Non-Final)
Grant Probability: 32% (At Risk)
OA Rounds: 1-2
To Grant: 3y 9m
With Interview: 76%

Examiner Intelligence

Career Allow Rate: 32% (13 granted / 40 resolved; -37.5% vs TC avg) — grants only 32% of cases
Interview Lift: +44.0% (allowance rate with vs. without interview, among resolved cases)
Avg Prosecution: 3y 9m typical timeline; 43 applications currently pending
Total Applications: 83 across all art units (career history)
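The metrics above can be reproduced from raw case outcomes. Below is a minimal sketch, assuming a simple per-case record and defining interview lift as the difference in allowance rate between resolved cases with and without an examiner interview; the field names and toy data are hypothetical, chosen only so the totals match the 13-of-40 allow rate and +44.0% lift reported above.

```python
from dataclasses import dataclass

@dataclass
class ResolvedCase:
    allowed: bool        # case ended in allowance
    had_interview: bool  # at least one examiner interview was held

def allow_rate(cases):
    """Fraction of resolved cases that were allowed."""
    return sum(c.allowed for c in cases) / len(cases)

def interview_lift(cases):
    """Allowance rate with an interview minus allowance rate without one."""
    with_iv = [c for c in cases if c.had_interview]
    without_iv = [c for c in cases if not c.had_interview]
    return allow_rate(with_iv) - allow_rate(without_iv)

# Hypothetical 40-case career: 13 allowed overall, 15 cases with interviews.
cases = (
    [ResolvedCase(True, True)] * 9       # allowed, with interview
    + [ResolvedCase(False, True)] * 6    # not allowed, with interview
    + [ResolvedCase(True, False)] * 4    # allowed, no interview
    + [ResolvedCase(False, False)] * 21  # not allowed, no interview
)
print(f"Career allow rate: {allow_rate(cases):.1%}")    # 32.5%
print(f"Interview lift: {interview_lift(cases):+.1%}")  # +44.0%
```

Note that 13/40 is 32.5%, so the dashboard's "32%" appears to be a truncated display of the same ratio.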

Statute-Specific Performance

§101: 8.2% (-31.8% vs TC avg)
§103: 39.9% (-0.1% vs TC avg)
§102: 15.9% (-24.1% vs TC avg)
§112: 30.4% (-9.6% vs TC avg)
Comparisons are against a Tech Center average estimate. Based on career data from 40 resolved cases.
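The "vs TC avg" deltas reduce to simple subtraction, and back-computing the Tech Center average from each stated delta gives the same figure, 40.0%, for every statute. A short sketch under that assumption (the 40.0% baseline is inferred from the deltas, not stated in the report):

```python
# Per-statute rates from the report; Tech Center averages back-computed
# from the stated "vs TC avg" deltas (each works out to 40.0%).
examiner_rate = {"101": 0.082, "103": 0.399, "102": 0.159, "112": 0.304}
tc_average = {"101": 0.400, "103": 0.400, "102": 0.400, "112": 0.400}

for statute, rate in examiner_rate.items():
    delta = rate - tc_average[statute]
    print(f"§{statute}: {rate:.1%} ({delta:+.1%} vs TC avg)")
```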

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 1-16 are currently pending in this application.

Claim Objections

Claim 12 is objected to because of the following informalities: “plurality of sensor” in line 1 should read “plurality of sensors”. Appropriate correction is required.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-16 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception without significantly more. Determination as to whether a claim satisfies the criteria for subject matter eligibility is a stepwise process (MPEP § 2106).

Step 1: Does the claim fall within a statutory category of invention? Claims 1-8 recite a process (method), and claims 9-16 recite a machine (system), which are within the four statutory categories. Therefore, claims 1-16 are directed to a statutory category of invention.

Step 2A, Prong 1: Does the claim recite an abstract idea, law of nature, or natural phenomenon? Claims 1-16 are directed to an abstract idea.
Claim 1 is directed to a method for monitoring a state of an individual, the method comprising: obtaining a video record of the individual performing a physical activity; de-identifying the video record, wherein de-identifying the video record comprises identifying a subset of points to be extracted from the video record, the subset of points allowing for future analysis of the physical activity, and storing the subset of points without storing the video record to inhibit re-identification of the individual in the video record; measuring the physical activity as recorded by the stored subset of points; comparing the measured physical activity to an expected physical activity to obtain a comparison result; and determining one or more status categories of the individual based on the comparison result. Claim 9 recites a system that carries out the same method recited in claim 1.

The limitations of identifying a subset of points, measuring the physical activity, comparing the physical activity, and determining a status category, as drafted, under their broadest reasonable interpretations, are merely mental processes, because these steps are akin to having a doctor or other human actor perform these operations with pen and paper. For example, “identifying a subset of points” encompasses nothing more than a human actor mentally evaluating the video data and coming to a conclusion about which data points to retain. The limitations of “obtaining a video record” and “storing the subset of points” encompass nothing more than a human actor collecting these pieces of information by hand. The limitation “storing the subset of points” indicates that the data may be stored for analysis and display at a later time. Therefore, a human actor can perform the analysis with pen and paper. Therefore, claims 1 and 9 recite an abstract idea. Claims 2-8 depend on claim 1, and claims 10-16 depend on claim 9.
These dependent claims only recite additional features of the analysis described in claims 1 and 9, which may also be performed by a human actor mentally and using a pen and paper. For example, claims 4 and 12 recite “recording a second physical activity by a plurality of sensors; and analyzing the recorded second physical activity to determine a presence or absence of one or more possible recognized status categories”, which encompasses nothing more than a human actor collecting sensor data by hand and mentally evaluating the data to draw a conclusion about the status category. Therefore, claims 1-16 recite an abstract idea.

Step 2A, Prong 2: Does the claim recite additional elements that integrate the judicial exception into a practical application? This judicial exception is not integrated into a practical application. Claim 9 only recites the additional limitations “one or more processors” and “a memory”. These additional elements are recited at a high level of generality (i.e., most generic computers would be known to have these components). Paragraphs [0015], [0020], [0053], [0082], [0141]-[0142], and [0146]-[0147] of the specification describe the processor and memory at a high level of generality. These generic processor and memory limitations are no more than mere instructions to apply the exception using a generic computer component. Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Therefore, claim 9 does not integrate the judicial exception into a practical application.

Claim 9 recites “a video capture device”, and claims 2 and 10 recite the additional limitation “storing non-visual information associated with the video record and selected from one or more sensor outputs”, which amount to no more than mere pre-solution activity of data gathering.
Therefore, the claimed generic video capture device and sensor elements do not integrate the judicial exception into a practical application, because they do not impose any meaningful limits on practicing the abstract idea. Therefore, the claims are directed to an abstract idea. As described above, dependent claims 2-8 and 10-16 only recite other limitations of the data analysis steps recited in claims 1 and 9, which may be done mentally by a human actor and/or with a pen and paper.

Step 2B: Does the claim include additional elements that are sufficient to amount to significantly more than the judicial exception? The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As explained above with respect to the integration of the judicial exception into a practical application (Step 2A, Prong 2), the additional elements of using computer components to perform the process steps amount to no more than mere instructions to apply the judicial exception using generic computer elements. The structural elements recited in claim 9 are one or more processors and a memory. These additional elements are recited at a high level of generality (i.e., most generic computers would be known to have these components). Paragraphs [0015], [0020], [0053], [0082], [0141]-[0142], and [0146]-[0147] of the specification describe the processor and memory at a high level of generality, and only provide conventional, well-known computing functions that do not add meaningful limits to practicing the abstract idea. Claim 9 recites “a video capture device”, and claims 2 and 10 recite the additional limitation “storing non-visual information associated with the video record and selected from one or more sensor outputs”.
As discussed above with respect to integration of the abstract idea into a practical application (Step 2A, Prong 2), the additional elements of a video capture device and one or more sensors to collect data amount to no more than mere pre-solution activity of data gathering. This pre-solution activity of data gathering using a video capture device and one or more sensors is well-understood, routine, and conventional in the field of vital sign monitoring technology. For example, see Baig et al. (Review of Vital Signs Monitoring Systems – Patient’s Acceptability, Issues and Challenges, 2014), which describes known methods of monitoring vital signs using cameras and other sensors. Therefore, the claimed generic video capture device, one or more sensors, and computer processing elements are all well-understood, routine, and conventional in the field of vital sign monitoring technology. Therefore, claims 1-16 are not patent-eligible under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-2, 4-10, and 12-16 are rejected under 35 U.S.C.
102(a)(1) as being anticipated by Kurtz et al. (US 20080292151 A1), hereinafter Kurtz.

Regarding claim 1, Kurtz discloses a method for monitoring a state of an individual (Figs. 4, 5a-d), the method comprising: obtaining a video record of the individual performing a physical activity (Fig. 5a, well-being image capture process 550; paragraph [0053], "Camera 120 can also support the motion detection function, for example using image area histograms to detect presence and position"; Table 1, eye and body movements; paragraph [0062], "captured video data can be used to acquire temporal measured wellness parameters 410 for physiological attributes such as such as eye movements (blinks/minute, side to side motions/min.), hand tremors (mm movement/sec), gait, or other attributes that can be indicative of neurological or motor control conditions"); de-identifying the video record, wherein de-identifying the video record comprises identifying a subset of points to be extracted from the video record, the subset of points allowing for future analysis of the physical activity (paragraph [0059], "Although the reference images themselves can be stored in memory 345 as system data in step 522f (or as capture parameter data 415), reference image parameters (including physiological metrics) can also be derived and stored as accessible capture parameters 415 or wellness parameters 410 in the appropriate databases to aid the image capture process. The reference image parameters generally quantify physical attributes or reference features 367 of the subjects."; paragraph [0093], "the default condition for physiological monitoring system 300 is for it to only save the captured wide field of view images cropped down to just head and shoulders images ... an alternate default condition can be for the physiological monitoring system 300 to tabulate and retain wellness parameter data 410 calculated only from these facial images"); storing the subset of points without storing the video record to inhibit re-identification of the individual in the video record (paragraph [0059], "reference image parameters (including physiological metrics) can also be derived and stored as accessible capture parameters 415 or wellness parameters 410 in the appropriate databases"; paragraph [0093], "retain wellness parameter data 410 calculated only from these facial images"; paragraph [0102], "If the system does not detect any significant changes, it can retain only wellness parameters 410, or the trend line data, or perhaps no data at all"; paragraph [0064], "key-frame, key video extraction or video summarization can be used to identify and retain one or more images that meet the target criteria"); measuring the physical activity as recorded by the stored subset of points (paragraph [0062], "captured video data can be used to acquire temporal measured wellness parameters 410 for physiological attributes such as such as eye movements (blinks/minute, side to side motions/min.), hand tremors (mm movement/sec), gait, or other attributes"); comparing the measured physical activity to an expected physical activity to obtain a comparison result (paragraph [0056], "The inference engine 400, which can be algorithm based, or utilize artificial intelligence (AI) or learning methods, is tasked to follow and assess previously identified physiological trends for its subjects (users 10)"; Fig. 5c, paragraph [0083], "It can compare (step 402a) any newly derived wellness parameters 410 to both the wellness parameters 410 derived from the reference data, and to the longitudinal record for the wellness parameters 410, including the trend-line type wellness parameters 410"; paragraph [0084], "The inference engine 400 can also compare (step 402a) wellness parameter data 410 of one individual to that of another"); and determining one or more status categories of the individual based on the comparison result (paragraph [0056], "wellness parameters 410 quantify various wellness, health, and medical conditions or attributes. Once these wellness parameters 410 are calculated and stored to a memory 345, an inference engine 400 (which is functionally supported by a computer 340) is utilized in the subsequent step to assess the status of physiological conditions."; paragraph [0078], "temporal data related to physiological attributes such as gait or eye movements or hand movements can be extracted from the video image data, normalized for current capture conditions, and retained to support assessments of mechanical or neurological condition"; paragraphs [0083], [0085], [0103], [0105]).

Regarding claim 2, Kurtz discloses the method of claim 1, as explained above. Kurtz further discloses storing non-visual information associated with the video record and selected from one or more sensor outputs (Fig. 2b, paragraph [0041], "A variety of detectors can be provided, including an ambient light detector 140, a motion detector 142, and various secondary detectors 144 that can be used for measuring ambient light or other physiological or environmental parameters"; paragraphs [0053], [0078], [0108]).

Regarding claim 4, Kurtz discloses the method of claim 1, as explained above. Kurtz further discloses: recording a second physical activity by a plurality of sensors (paragraph [0108], "the system can be equipped with secondary detectors 144 or connections (wireless or physical wires) to other bio-monitoring devices, such as microphones, thermometers, digital scales, sensing textiles, ear bud sensors, pulse oximeters, and saliva testing devices ... Using various secondary detector devices, wellness parameters 410 such as blood pressure and heart rate, blood oxygenation, blood glucose levels, or body temperature"); and analyzing the recorded second physical activity to determine a presence or absence of one or more possible recognized status categories (paragraph [0109], "the physiological monitoring system 300 can observe the posture, ergonomics, emotional response, attention span, time spent, and fatigue of a subject 10").

Regarding claim 5, Kurtz discloses the method of claim 1, as explained above. Kurtz further discloses that the subset of points comprises facial keypoints (paragraphs [0065], [0070], facial geometry).

Regarding claim 6, Kurtz discloses the method of claim 1, as explained above. Kurtz further discloses that the subset of points corresponds to one or more portions of a face of the individual (paragraph [0070], "Distances to various facial features, such as mouth 60, nose, ears, as well as the horizon edges of the cheeks can then be calculated.").

Regarding claim 7, Kurtz discloses the method of claim 1, as explained above. Kurtz further discloses displaying, to a user, an interface by which the user is able to modify a number of points to be included in the subset of points (Table 1, "Defining target images for various subjects", "Defining image data management for privacy sensitive body regions for various subjects"; paragraphs [0093], [0095]).

Regarding claim 8, Kurtz discloses the method of claim 1, as explained above.
Kurtz further discloses: blurring a portion of the video record; and storing the blurred portion in association with the stored subset of points (paragraph [0094], "the physiological monitoring system 300 can collect and maintain torso or full body imagery, along with the associated wellness parameters 410, but with the image data for the generally private body areas respectively blurred or cropped out").

Regarding claim 9, Kurtz discloses a system for monitoring a state of an individual (Fig. 1, paragraph [0040], physiological monitoring system 300), comprising: a video capture device (Fig. 1, paragraph [0040], electronic imaging device 100 and camera 120; paragraph [0064]); one or more processors (Fig. 2b, paragraph [0041], image processor 320, system controller 330, computer 340); and a memory storing one or more non-transitory, computer-readable instructions (Fig. 2a, paragraph [0041], memory 345) that, when executed by the one or more processors, cause the one or more processors to perform operations comprising: obtaining a video record of the individual performing a physical activity (Fig. 5a, well-being image capture process 550; paragraph [0053], "Camera 120 can also support the motion detection function, for example using image area histograms to detect presence and position"; Table 1, eye and body movements; paragraph [0062], "captured video data can be used to acquire temporal measured wellness parameters 410 for physiological attributes such as such as eye movements (blinks/minute, side to side motions/min.), hand tremors (mm movement/sec), gait, or other attributes that can be indicative of neurological or motor control conditions"); de-identifying the video record, wherein de-identifying the video record comprises identifying a subset of points to be extracted from the video record, the subset of points allowing for future analysis of the physical activity (paragraph [0059], "Although the reference images themselves can be stored in memory 345 as system data in step 522f (or as capture parameter data 415), reference image parameters (including physiological metrics) can also be derived and stored as accessible capture parameters 415 or wellness parameters 410 in the appropriate databases to aid the image capture process. The reference image parameters generally quantify physical attributes or reference features 367 of the subjects."; paragraph [0093], "the default condition for physiological monitoring system 300 is for it to only save the captured wide field of view images cropped down to just head and shoulders images ... an alternate default condition can be for the physiological monitoring system 300 to tabulate and retain wellness parameter data 410 calculated only from these facial images"); storing the subset of points without storing the video record to inhibit re-identification of the individual in the video record (paragraph [0059], "reference image parameters (including physiological metrics) can also be derived and stored as accessible capture parameters 415 or wellness parameters 410 in the appropriate databases"; paragraph [0093], "retain wellness parameter data 410 calculated only from these facial images"; paragraph [0102], "If the system does not detect any significant changes, it can retain only wellness parameters 410, or the trend line data, or perhaps no data at all"; paragraph [0064], "key-frame, key video extraction or video summarization can be used to identify and retain one or more images that meet the target criteria"); measuring the physical activity as recorded by the stored subset of points (paragraph [0062], "captured video data can be used to acquire temporal measured wellness parameters 410 for physiological attributes such as such as eye movements (blinks/minute, side to side motions/min.), hand tremors (mm movement/sec), gait, or other attributes"); comparing the measured physical activity to an expected physical activity to obtain a comparison result (paragraph [0056], "The inference engine 400, which can be algorithm based, or utilize artificial intelligence (AI) or learning methods, is tasked to follow and assess previously identified physiological trends for its subjects (users 10)"; Fig. 5c, paragraph [0083], "It can compare (step 402a) any newly derived wellness parameters 410 to both the wellness parameters 410 derived from the reference data, and to the longitudinal record for the wellness parameters 410, including the trend-line type wellness parameters 410"; paragraph [0084], "The inference engine 400 can also compare (step 402a) wellness parameter data 410 of one individual to that of another"); and determining one or more status categories of the individual based on the comparison result (paragraph [0056], "wellness parameters 410 quantify various wellness, health, and medical conditions or attributes. Once these wellness parameters 410 are calculated and stored to a memory 345, an inference engine 400 (which is functionally supported by a computer 340) is utilized in the subsequent step to assess the status of physiological conditions."; paragraph [0078], "temporal data related to physiological attributes such as gait or eye movements or hand movements can be extracted from the video image data, normalized for current capture conditions, and retained to support assessments of mechanical or neurological condition"; paragraphs [0083], [0085], [0103], [0105]).

Regarding claim 10, Kurtz discloses the system of claim 9, as explained above. Kurtz further discloses storing non-visual information associated with the video record and selected from one or more sensor outputs (Fig. 2b, paragraph [0041], "A variety of detectors can be provided, including an ambient light detector 140, a motion detector 142, and various secondary detectors 144 that can be used for measuring ambient light or other physiological or environmental parameters"; paragraphs [0053], [0078], [0108]).

Regarding claim 12, Kurtz discloses the system of claim 9, as explained above. Kurtz further discloses: recording a second physical activity by a plurality of sensors (paragraph [0108], "the system can be equipped with secondary detectors 144 or connections (wireless or physical wires) to other bio-monitoring devices, such as microphones, thermometers, digital scales, sensing textiles, ear bud sensors, pulse oximeters, and saliva testing devices ... Using various secondary detector devices, wellness parameters 410 such as blood pressure and heart rate, blood oxygenation, blood glucose levels, or body temperature"); and analyzing the recorded second physical activity to determine a presence or absence of one or more possible recognized status categories (paragraph [0109], "the physiological monitoring system 300 can observe the posture, ergonomics, emotional response, attention span, time spent, and fatigue of a subject 10").

Regarding claim 13, Kurtz discloses the system of claim 9, as explained above. Kurtz further discloses that the subset of points comprises facial keypoints (paragraphs [0065], [0070], facial geometry).

Regarding claim 14, Kurtz discloses the system of claim 9, as explained above. Kurtz further discloses that the subset of points corresponds to one or more portions of a face of the individual (paragraph [0070], "Distances to various facial features, such as mouth 60, nose, ears, as well as the horizon edges of the cheeks can then be calculated.").

Regarding claim 15, Kurtz discloses the system of claim 9, as explained above. Kurtz further discloses displaying, to a user, an interface by which the user is able to modify a number of points to be included in the subset of points (Table 1, "Defining target images for various subjects", "Defining image data management for privacy sensitive body regions for various subjects"; paragraphs [0093], [0095]).

Regarding claim 16, Kurtz discloses the system of claim 9, as explained above.
Kurtz further discloses: blurring a portion of the video record; and storing the blurred portion in association with the stored subset of points (paragraph [0094], "the physiological monitoring system 300 can collect and maintain torso or full body imagery, along with the associated wellness parameters 410, but with the image data for the generally private body areas respectively blurred or cropped out").

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 3 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Kurtz et al. (US 20080292151 A1), hereinafter Kurtz, in view of Miller et al. (US 10154460 B1), hereinafter Miller.

Regarding claim 3, Kurtz discloses the method of claim 1, as explained above. Kurtz does not explicitly disclose identifying a subset of sensors of a plurality of sensors applicable to recognize a particular status category of the individual; recording a second physical activity by the subset of sensors; and analyzing the recorded second physical activity to confirm a presence or absence of the particular status category. However, Miller teaches methods for power adjustments of a user measurement device (Abstract), comprising: identifying a subset of sensors of a plurality of sensors applicable to recognize a particular status category of the individual (column 31, lines 9-11 and 22-25, "PMM 1120 determines a subset of less than all of the first set of sensors for a second set of physiological measurements ... PMM 1120 determines a subset of less than all of the first set of sensors based on at least one of an activity type of the user or a selected physiological output"); recording a second physical activity by the subset of sensors (column 31, lines 11-12 and 25-26, "measures the second set using the subset"); and analyzing the recorded second physical activity to confirm a presence or absence of the particular status category (column 29, lines 38-41, "power management activities may be performed to reduce power consumption of the wearable UMD 1100 in response to the determination of activity level"). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Kurtz with the teachings of Miller to identify a subset of sensors of a plurality of sensors applicable to recognize a particular status category of the individual; record a second physical activity by the subset of sensors; and analyze the recorded second physical activity to confirm a presence or absence of the particular status category, because doing so enables the device to manage its power consumption based on the user's activity level, which can extend a usable period of time of the portable or mobile electronic devices (Miller, column 1, lines 40-49).

Regarding claim 11, Kurtz discloses the system of claim 9, as explained above. Kurtz does not explicitly disclose identifying a subset of sensors of a plurality of sensors applicable to recognize a particular status category of the individual; recording a second physical activity by the subset of sensors; and analyzing the recorded second physical activity to confirm a presence or absence of the particular status category. However, Miller teaches methods for power adjustments of a user measurement device (Abstract), comprising: identifying a subset of sensors of a plurality of sensors applicable to recognize a particular status category of the individual (column 31, lines 9-11 and 22-25, "PMM 1120 determines a subset of less than all of the first set of sensors for a second set of physiological measurements ... PMM 1120 determines a subset of less than all of the first set of sensors based on at least one of an activity type of the user or a selected physiological output"); recording a second physical activity by the subset of sensors (column 31, lines 11-12 and 25-26, "measures the second set using the subset"); and analyzing the recorded second physical activity to confirm a presence or absence of the particular status category (column 29, lines 38-41, "power management activities may be performed to reduce power consumption of the wearable UMD 1100 in response to the determination of activity level"). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Kurtz with the teachings of Miller to identify a subset of sensors of a plurality of sensors applicable to recognize a particular status category of the individual; record a second physical activity by the subset of sensors; and analyze the recorded second physical activity to confirm a presence or absence of the particular status category, because doing so enables the device to manage its power consumption based on the user's activity level, which can extend a usable period of time of the portable or mobile electronic devices (Miller, column 1, lines 40-49).
Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 1-16 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-16 of U.S. Patent No. 11961620 B2. Although the claims at issue are not identical, they are not patentably distinct from each other because the instant claims represent a broader genus of the patented species. In such situations, the narrower species necessarily anticipates the broader genus represented by the instant claims. Claim 1 of the instant application recites "determining one or more status categories of the individual based on the comparison result" and claim 1 of U.S. Patent No. 11961620 B2 recites "diagnosing one or more aspects of disease based on the comparison result". Under the broadest reasonable interpretation, "determining one or more status categories of the individual" encompasses "diagnosing one or more aspects of disease".
Therefore, instant claims 1-16 are anticipated by claims 1-16 of U.S. Patent No. 11961620 B2. A brief matching of the claims in each document is provided below.

Regarding instant claims 1-8 (method claims), each instant claim and its features are anticipated by the corresponding claim of U.S. Patent No. 11961620 B2. Regarding instant claims 9-16 (system claims), each instant claim and its features are likewise anticipated by the corresponding claim of U.S. Patent No. 11961620 B2.

Claim chart — 18/629,437 (Instant Application) vs. U.S. Patent No. 11961620 B2. Each instant claim element is identical to the corresponding patented element unless the patented counterpart is noted in brackets.

Instant claim 1 / patented claim 1:
A method for monitoring a state of an individual, the method comprising:
obtaining a video record of the individual performing a physical activity;
de-identifying the video record, wherein de-identifying the video record comprises
identifying a subset of points to be extracted from the video record, the subset of points allowing for future analysis of the physical activity, and
storing the subset of points without storing the video record to inhibit re-identification of the individual in the video record;
measuring the physical activity as recorded by the stored subset of points;
comparing the measured physical activity to an expected physical activity to obtain a comparison result; and
determining one or more status categories of the individual based on the comparison result. [Patent: "diagnosing one or more aspects of disease based on the comparison result."]

Instant claim 2 / patented claim 2:
The method of claim 1, comprising storing non-visual information associated with the video record and selected from one or more sensor outputs.

Instant claim 3 / patented claim 3:
The method of claim 1, comprising:
identifying a subset of sensors of a plurality of sensors applicable to recognize a particular status category of the individual; [Patent: "... applicable to recognize a particular disease;"]
recording a second physical activity by the subset of sensors; and
analyzing the recorded second physical activity to confirm a presence or absence of the particular status category. [Patent: "... of the particular disease."]

Instant claim 4 / patented claim 4:
The method of claim 1, comprising:
recording a second physical activity by a plurality of sensors; and
analyzing the recorded second physical activity to determine a presence or absence of one or more possible recognized status categories. [Patent: "... possible recognized diseases."]

Instant claim 5 / patented claim 5:
The method of claim 1, wherein the subset of points comprises facial keypoints.

Instant claim 6 / patented claim 6:
The method of claim 1, wherein the subset of points corresponds to one or more portions of a face of the individual.

Instant claim 7 / patented claim 7:
The method of claim 1, comprising displaying, to a user, an interface by which the user is able to modify a number of points to be included in the subset of points.

Instant claim 8 / patented claim 8:
The method of claim 1, comprising:
blurring a portion of the video record; and
storing the blurred portion in association with the stored subset of points.

Instant claim 9 / patented claim 9:
A system for monitoring a state of an individual, comprising:
a video capture device;
one or more processors; and
a memory storing one or more non-transitory, computer-readable instructions that, when executed by the one or more processors, cause the one or more processors to perform operations comprising:
obtaining, from the video capture device, a video record of the individual performing a physical activity;
de-identifying the video record, wherein de-identifying the video record comprises
identifying a subset of points to be extracted from the video record, the subset of points allowing for future analysis of the physical activity, and
storing the subset of points without storing the video record to inhibit re-identification of the individual in the video record;
measuring the physical activity as recorded by the stored subset of points;
comparing the measured physical activity to an expected physical activity to obtain a comparison result; and
determining one or more status categories of the individual based on the comparison result. [Patent: "diagnosing one or more aspects of disease based on the comparison result."]

Instant claim 10 / patented claim 10:
The system of claim 9, comprising storing non-visual information associated with the video record and selected from one or more sensor outputs.

Instant claim 11 / patented claim 11:
The system of claim 9, comprising a plurality of sensors, wherein the operations comprise:
identifying a subset of sensors of a plurality of sensors applicable to recognize a particular status category of the individual;
recording a second physical activity by the subset of sensors; and
analyzing the recorded second physical activity to confirm a presence or absence of the particular status category.

Instant claim 12 / patented claim 12:
The system of claim 9, comprising a plurality of sensors, wherein the operations comprise:
recording a second physical activity by a plurality of sensors; and
analyzing the recorded second physical activity to determine a presence or absence of one or more possible recognized status categories.

Instant claim 13 / patented claim 13:
The system of claim 9, wherein the subset of points comprises facial keypoints.

Instant claim 14 / patented claim 14:
The system of claim 9, wherein the subset of points corresponds to one or more portions of a face of the individual.

Instant claim 15 / patented claim 15:
The system of claim 9, comprising a display, wherein the operations comprise displaying, to a user, by the display, an interface by which the user is able to modify a number of points to be included in the subset of points.

Instant claim 16 / patented claim 16:
The system of claim 9, wherein the operations comprise:
blurring a portion of the video record; and
storing the blurred portion in association with the stored subset of points.
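For readers outside patent practice, the shared independent claims describe a concrete data flow: extract a sparse set of keypoints from each video frame, discard the frames themselves, then measure and classify activity from the stored points alone. The following is a minimal Python sketch of that flow, not anything from the application itself: every function name is hypothetical, `extract_keypoints` just samples fixed coordinates where a real system would run a facial-landmark or pose model, and the claim's "status categories" are reduced to a binary label purely for illustration.

```python
from typing import List, Tuple

Point = Tuple[float, float]
Frame = List[List[int]]  # grayscale pixel grid stand-in for a video frame

def extract_keypoints(frame: Frame) -> List[Point]:
    # Hypothetical stand-in for a keypoint detector: sample two fixed
    # positions from the frame geometry. A real system would run a
    # landmark model here.
    h, w = len(frame), len(frame[0])
    return [(w * 0.25, h * 0.5), (w * 0.75, h * 0.5)]

def de_identify(video: List[Frame]) -> List[List[Point]]:
    # "storing the subset of points without storing the video record":
    # only the keypoints are kept; the pixel data is never retained.
    return [extract_keypoints(frame) for frame in video]

def measure_activity(points_per_frame: List[List[Point]]) -> float:
    # Crude activity measure: total keypoint displacement across frames.
    total = 0.0
    for prev, cur in zip(points_per_frame, points_per_frame[1:]):
        total += sum(abs(px - cx) + abs(py - cy)
                     for (px, py), (cx, cy) in zip(prev, cur))
    return total

def determine_status(measured: float, expected: float,
                     tolerance: float = 0.2) -> str:
    # "comparing the measured physical activity to an expected physical
    # activity": flag the result when it deviates beyond a tolerance.
    deviation = abs(measured - expected) / expected if expected else abs(measured)
    return "within expected range" if deviation <= tolerance else "atypical"
```

A caller would chain the steps in claim order: `points = de_identify(video)`, then `determine_status(measure_activity(points), expected)`. The privacy property the claims recite falls out of the structure: once `de_identify` returns, nothing downstream ever touches the original frames.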
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHRISTINE SISON whose telephone number is (703) 756-4661. The examiner can normally be reached 8 am - 5 pm PT, Mon - Fri.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jennifer McDonald, can be reached at (571) 270-3061. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CHRISTINE SISON/
Examiner, Art Unit 3796

/Jennifer Pitrak McDonald/
Supervisory Patent Examiner, Art Unit 3796

Prosecution Timeline

Apr 08, 2024
Application Filed
Mar 05, 2026
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594421
Compositions And Methods For Increasing Cancer Cell Sensitivity To Alternating Electric Fields
2y 5m to grant Granted Apr 07, 2026
Patent 12594425
APPARATUS AND METHOD FOR REDUCING THE EFFECT OF LEAD MIGRATION DURING SPINAL CORD STIMULATION
2y 5m to grant Granted Apr 07, 2026
Patent 12544576
Treatment of Inflammatory Disorders
2y 5m to grant Granted Feb 10, 2026
Patent 12502143
METHOD FOR ESTIMATING ARRANGEMENT OF ELECTRODES ON BIOLOGICAL TISSUE
2y 5m to grant Granted Dec 23, 2025
Patent 12502521
INTRAVASCULAR BLOOD PUMP
2y 5m to grant Granted Dec 23, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
32%
Grant Probability
76%
With Interview (+44.0%)
3y 9m
Median Time to Grant
Low
PTA Risk
Based on 40 resolved cases by this examiner. Grant probability derived from career allow rate.
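The headline figures are consistent with simple arithmetic on the career counts shown above: 13 grants out of 40 resolved cases gives the 32% base probability, and the +44.0% interview lift appears to be added in probability points to reach 76%. A sketch of that assumed derivation follows; the tool's actual model is not disclosed, and the truncating display behavior is a guess from the rendered values.

```python
def pct(rate: float) -> int:
    # The page appears to truncate rather than round:
    # 13/40 = 32.5% is displayed as 32%.
    return int(rate * 100)

granted, resolved = 13, 40
base = granted / resolved        # career allow rate: 0.325
interview_lift = 0.44            # additive lift, in probability points

grant_probability = pct(base)                    # displayed "32%"
with_interview = pct(base + interview_lift)      # displayed "76%"
```

If the lift were instead multiplicative (44% relative improvement), the with-interview figure would be about 47%, not 76%, which is why the additive reading is assumed here.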
