Prosecution Insights
Last updated: April 19, 2026
Application No. 18/543,760

SENSING PERIPHERAL HEURISTIC EVIDENCE, REINFORCEMENT, AND ENGAGEMENT SYSTEM

Final Rejection §103
Filed: Dec 18, 2023
Examiner: BOLOURCHI, NADER
Art Unit: 2631
Tech Center: 2600 — Communications
Assignee: State Farm Mutual Automobile Insurance Company
OA Round: 5 (Final)
Grant Probability: 82% (Favorable)
Projected OA Rounds: 6-7
Time To Grant: 2y 8m
Grant Probability With Interview: 94%

Examiner Intelligence

Career Allow Rate: 82% (above average) — 591 granted / 723 resolved; +19.7% vs TC avg
Interview Lift: +12.0% (moderate) — resolved cases with vs. without an interview
Avg Prosecution: 2y 8m (typical timeline); 22 applications currently pending
Total Applications: 745 across all art units (career history)
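As an editorial sanity check (hypothetical helper code, not part of the source report), the headline allow rate follows directly from the granted/resolved counts, and the "+19.7% vs TC avg" delta implies the Tech Center baseline:

```python
# Hypothetical check of the dashboard figures above; the variable names
# are illustrative and do not come from the analytics tool itself.
granted, resolved = 591, 723

allow_rate = granted / resolved              # career allow rate
print(f"Career allow rate: {allow_rate:.1%}")  # 81.7%, displayed as 82%

# "+19.7% vs TC avg" implies a Tech Center average of roughly:
tc_avg = allow_rate - 0.197
print(f"Implied TC average: {tc_avg:.1%}")     # 62.0%
```

The rounded 82% shown in the tiles is consistent with 591/723 ≈ 81.7%.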

Statute-Specific Performance

§101: 4.0% (-36.0% vs TC avg)
§103: 34.1% (-5.9% vs TC avg)
§102: 15.4% (-24.6% vs TC avg)
§112: 29.4% (-10.6% vs TC avg)

Tech Center averages are estimates • Based on career data from 723 resolved cases

Office Action

§103
DETAILED ACTION

Remarks

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This Office Action is responsive to the amendment filed on 10/31/2025. Claims 1-20, of which claims 1, 8 and 15 are independent, are pending in this application and are considered below.

Response to Arguments

Applicant's arguments filed on 10/31/2025 regarding the rejection of claims under 35 U.S.C. 103 as being obvious over Wright et al. in view of Nagale et al. and further in view of Madden have been fully considered, but they are not persuasive. Examiner notes that Applicant's arguments are substantially a reiteration of the arguments in Applicant's previous responses. Examiner has already addressed these same contentions in the previous Office Action, and that discussion is restated below. The Examiner has thoroughly reviewed Applicant's arguments but maintains that all limitations are taught by Wright et al., Nagale et al., Madden, or their combination, even though no single reference alone discloses all the limitations. Applicant has not identified a limitation foreign to the prior art references. Further amendments are necessary for the claims to overcome the rejections.

At the outset, Applicant is reminded that MPEP 2141.02 VI. states: A prior art reference must be considered in its entirety, i.e., as a whole, including portions that would lead away from the claimed invention. W.L. Gore & Associates, Inc. v. Garlock, Inc., 721 F.2d 1540, 220 USPQ 303 (Fed. Cir. 1983), cert. denied, 469 U.S. 851 (1984).

Applicant's Argument: "Applicant respectfully submits that the cited references, even in combination, fail to disclose or render obvious 'analyzing, by one or more processors, data captured by a plurality of sensors associated with the home environment using a trained neural network model to identify one or more anomalies, ... wherein the data captured by the plurality of sensors includes one or more of image data or video data,' as recited by claim 1. Claims 8 and 15 recite similar features." (Remarks, lines 22-27 of page 8)

Examiner Response: Examiner respectfully disagrees. Wright et al. disclose analyzing, by one or more processors ("central processing unit (CPU) 52", ¶[0044]), the captured data to identify one or more anomalies ("The received data is then saved in one or more historical logs at step 152, e.g., in the online database 22. The data is then analyzed to determine patterns at step 154 … Based on the analyzing, the analytics engine 64 generates a signature for the monitored property 14 at step 158. The signature can include normal activities for certain days of the week, periods within each day, and further layers of granularity as desired", ¶[0063]; "The signature that is generated for that monitored property 14 is then output at step 160 to enable normal, abnormal, and lack of activity to be monitored against the signature", ¶[0065]) … wherein the data captured includes video data ("It is also possible to use modern video monitoring ... equipment to remotely view the person at risk", ¶[0007]).

Applicant's Argument: "The Office Action cites to the 'sensor system 12' of Wright in connection with the 'data captured by a plurality of sensors' from claim 1. See Office Action, pp. 5-6. However, Applicant respectfully contends that Wright fails to disclose that the 'sensor system 12' captures image data or video data." (Remarks, lines 1-4 of page 9)

Examiner Response: Examiner respectfully disagrees. At the outset, it should be noted that the Office Action includes a rejection based on obviousness. The test for obviousness is not whether the features of one reference may be bodily incorporated into the other to produce the claimed subject matter but simply what the combination of references makes obvious to one of ordinary skill in the pertinent art.
In re Bozek, 163 USPQ 545 (CCPA 1969); In re Mapelsden, 51 CCPA 1123, 329 F.2d 321, 141 USPQ 30 (1964); In re Henley, 44 CCPA 701, 239 F.2d 3, 112 USPQ 56 (1956).

Wright et al. disclose: "FIG. 2 provides additional detail concerning a sensor system 12 deployed in a particular example of a monitored property 14 to illustrate a network of sensor units 40 that are used to passively gather data and monitor the property 14 and its occupant(s)." (¶[0041]); i.e., the sensor system 12 includes a plurality of sensors 40 used to gather data. Moreover, as stated above, Wright et al. disclose that "It is also possible to use modern video monitoring ... equipment to remotely view the person at risk" (¶[0007]); i.e., Wright et al. implicitly suggest that the sensor system 12 could include modern video monitoring equipment. One of ordinary skill in the art understands that modern video monitoring equipment contains sensors, particularly image sensors (such as CMOS/CCD) that capture light and motion sensors (PIR, microwave) for triggering, and that, with AI, cameras increasingly function as sophisticated sensors themselves, detecting and analyzing events beyond merely recording, making them integral parts of larger smart sensor systems.

Applicant's Argument: "Applicant respectfully contends that Wright fails to disclose analyzing image data or video data by a processor to identify anomalies." (Remarks, lines 2-6 of page 9)

Examiner Response: Examiner respectfully disagrees. Wright et al. disclose analyzing, by one or more processors ("central processing unit (CPU) 52", ¶[0044]), the captured data to identify one or more anomalies ("The received data is then saved in one or more historical logs at step 152, e.g., in the online database 22. The data is then analyzed to determine patterns at step 154 … Based on the analyzing, the analytics engine 64 generates a signature for the monitored property 14 at step 158. The signature can include normal activities for certain days of the week, periods within each day, and further layers of granularity as desired", ¶[0063]; "The signature that is generated for that monitored property 14 is then output at step 160 to enable normal, abnormal, and lack of activity to be monitored against the signature", ¶[0065]) … wherein the data captured includes video data ("It is also possible to use modern video monitoring ... equipment to remotely view the person at risk", ¶[0007]).

Applicant's Argument: "Applicant respectfully contends that, at best, Wright discloses: 'It is also possible to use modern video monitoring and/or audio recording equipment to remotely view the person at risk and ensure that they are well, but few people are known to choose such a system due to the loss of privacy and the requirement for someone to be actively monitoring the person remotely to ensure their health.' See Wright, para. [0007]. That is, Applicant respectfully contends that while Wright mentions it is possible to remotely view a person at risk using video monitoring or audio recording equipment, Wright does not disclose that the 'sensor system 12' includes this equipment." (Remarks, lines 7-14 of page 9) and "Applicant respectfully submits that Wright fails to disclose 'analyzing, by one or more processors, data captured by a plurality of sensors associated with the home environment using a trained neural network model to identify one or more anomalies, ... wherein the data captured by the plurality of sensors includes one or more of image data or video data,' as recited by claim 1. Claims 8 and 15 recite similar features." (line 26 of page 9 to line 1 of page 10)

Examiner Response: Examiner respectfully disagrees. As stated earlier, it should be noted that the Office Action includes a rejection based on obviousness.
The test for obviousness is not whether the features of one reference may be bodily incorporated into the other to produce the claimed subject matter but simply what the combination of references makes obvious to one of ordinary skill in the pertinent art. In re Bozek, 163 USPQ 545 (CCPA 1969); In re Mapelsden, 51 CCPA 1123, 329 F.2d 321, 141 USPQ 30 (1964); In re Henley, 44 CCPA 701, 239 F.2d 3, 112 USPQ 56 (1956).

Wright et al. disclose: "FIG. 2 provides additional detail concerning a sensor system 12 deployed in a particular example of a monitored property 14 to illustrate a network of sensor units 40 that are used to passively gather data and monitor the property 14 and its occupant(s)." (¶[0041]); i.e., the sensor system 12 includes a plurality of sensors 40 used to gather data. Moreover, as stated above, Wright et al. disclose that "It is also possible to use modern video monitoring ... equipment to remotely view the person at risk" (¶[0007]); i.e., Wright et al. implicitly suggest that the sensor system 12 could include modern video monitoring equipment. One of ordinary skill in the art understands that modern video monitoring equipment contains sensors, particularly image sensors (such as CMOS/CCD) that capture light and motion sensors (PIR, microwave) for triggering, and that, with AI, cameras increasingly function as sophisticated sensors themselves, detecting and analyzing events beyond merely recording, making them integral parts of larger smart sensor systems.

Applicant's Argument: "Applicant respectfully contends that Wright fails to disclose that data captured using video monitoring or audio recording equipment is analyzed by a processor (such as using a neural network) to identify one or more anomalies." (Remarks, lines 14-17 of page 9)

Examiner Response: Examiner respectfully disagrees. Wright et al. expressly disclose: "In FIG. 1 a sensor system 12 is deployed at each of a number of monitored properties 14, and is communicable within the system 12 to provide data to a monitoring, analytics and notification system 16 that is accessible from or otherwise on or within a cloud 18 or cloud-based service (hereinafter the 'cloud or cloud-based system 16')" (¶[0038]); i.e., the cloud-based system 16 analyzes the data gathered by the sensor system 12. Specifically, Wright et al. disclose: "FIG. 7 schematically illustrates the collection and monitoring of data from the monitored properties 14 by the cloud-based system 16 ... data that is gathered ... is obtained by the analytics engine 64 ... The analytics engine 64 is used ... to analyze ongoing activities to determine if abnormal activities have or are taking place, in order to enable alerts or notifications or reports to be generated" (¶[0055] – emphasis added).

Applicant's Argument: "Applicant respectfully contends that Wright teaches that there is a requirement for someone to be actively monitoring the person remotely to ensure their health, and thus teaches away from analyzing image data or video data by a processor to identify anomalies (such as using a neural network). That is, Applicant respectfully contends that it would not be obvious to a person having ordinary skill in the art to modify Wright to include analyzing image data or video data by a processor (such as using a neural network) to identify anomalies, because of Wright's stated requirement for someone to be actively monitoring the person remotely to ensure their health." (Remarks, lines 18-25 of page 9)

Examiner Response: Examiner respectfully disagrees. As stated earlier above, Wright et al. disclose: "FIG. 7 schematically illustrates the collection and monitoring of data from the monitored properties 14 by the cloud-based system 16 ... data that is gathered ... is obtained by the analytics engine 64 ... The analytics engine 64 is used ... to analyze ongoing activities to determine if abnormal activities have or are taking place, in order to enable alerts or notifications or reports to be generated" (¶[0055] – emphasis added). It should further be noted that, as stated earlier above, the Office Action includes a rejection based on obviousness, where Madden discloses a trained neural network model.

Applicant's Argument: "Applicant respectfully submits that even Wright in combination with the other cited references would fail to render this feature obvious at least because Wright teaches away from analyzing image data or video data by a processor (such as using a neural network) to identify anomalies." (Remarks, lines 2-5 of page 10)

Examiner Response: Examiner respectfully disagrees. As stated earlier above, Wright et al. disclose: "FIG. 7 schematically illustrates the collection and monitoring of data from the monitored properties 14 by the cloud-based system 16 ... data that is gathered ... is obtained by the analytics engine 64 ... The analytics engine 64 is used ... to analyze ongoing activities to determine if abnormal activities have or are taking place, in order to enable alerts or notifications or reports to be generated" (¶[0055] – emphasis added). It should further be noted that, as stated earlier above, the Office Action includes a rejection based on obviousness, where Madden discloses a trained neural network model.

Information Disclosure Statement

The references cited on the information disclosure statement (IDS) submitted on 09/04/2025 have been considered and made of record by the examiner.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103(a) are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

The foregoing obviousness inquiry requires an expansive and flexible approach, not a rigid approach demanding express teachings, suggestions and motivations to combine prior art teachings. KSR International Co. v. Teleflex Inc., 82 USPQ2d 1385, 1395-97 (U.S. 2007). The rationale supporting a conclusion of obviousness should be made explicit for review, but the rationale does not require precise teachings directed to the specific subject matter of the claim. Id. at 1396.
A rejection can rely on inferences and creative steps that a person of ordinary skill in the art would employ. Id. Obviousness rejections are not limited to showing the obviousness of solutions to the problems Applicant was trying to solve. Id. at 1397. Rather, one can show obviousness of a claim by establishing the obviousness of any solution to any known problem in the field of endeavor and addressed by a patent application's subject matter. Id. Moreover, one of ordinary skill in the art is not an automaton, but is possessed of ordinary creativity. Id. One of ordinary skill could find alternative uses for prior art elements beyond the elements' primary purposes and fit prior art teachings together like a puzzle. Id. A combination of prior art teachings does not require absolute predictability. Eli Lilly and Co. v. Zenith Goldline Pharmaceuticals Inc., 81 USPQ2d 1324, 1329 (Fed. Cir. 2006). All that is required is a reasonable expectation of success. Id.

Claims 1-20 are rejected under 35 U.S.C. 103(a) as being unpatentable over U.S. Patent Application Publication No. US 2018/0365957 to Wright et al. (see IDS) in view of U.S. Patent Application Publication No. US 2018/153477 to Nagale et al. (see IDS) and further in view of U.S. Patent Application Publication No. US 2019/0228397 A1 to Madden.

Regarding claim 1, Wright et al. disclose a computer-implemented method for identifying a condition associated with an individual in a home environment (abstract; figure 1 [system 16 performs functions for individual 26 in home environments 14, connecting with sensor systems 12 through network 18 with databases, services and sources]), comprising:

capturing data detected by a plurality of sensors associated with a home environment ("In FIG. 1 a sensor system 12 is deployed at each of a number of monitored properties 14, and is communicable within the system 12 to provide data to a monitoring, analytics and notification system 16 that is accessible from or otherwise on or within a cloud 18 or cloud-based service (hereinafter the 'cloud or cloud-based system 16')", ¶[0038]; "the sensor system 12 deployed at a particular monitored property 14, which can include data acquired by multiple sensor units 40 each having multiple sensors 48", ¶[0063]);

analyzing, by one or more processors ("central processing unit (CPU) 52", ¶[0044]), the captured data to identify one or more anomalies ("The received data is then saved in one or more historical logs at step 152, e.g., in the online database 22. The data is then analyzed to determine patterns at step 154 … Based on the analyzing, the analytics engine 64 generates a signature for the monitored property 14 at step 158. The signature can include normal activities for certain days of the week, periods within each day, and further layers of granularity as desired", ¶[0063]; "The signature that is generated for that monitored property 14 is then output at step 160 to enable normal, abnormal, and lack of activity to be monitored against the signature", ¶[0065]);

determining, by the one or more processors ("central processing unit (CPU) 52", ¶[0044]), based upon the identified one or more anomalies, the condition associated with the individual in the home environment ("The signature that is generated for that monitored property 14 is then output at step 160 to enable normal, abnormal, and lack of activity to be monitored against the signature", ¶[0065]; "The data that is currently being evaluated is compared to the signature for the property 14 at step 202 to detect any abnormal activities or events at step 204", ¶[0066]); and

generating, by the processor ("central processing unit (CPU) 52", ¶[0044]), a notification indicating the condition associated with the individual ("The data that is currently being evaluated is compared to the signature for the property 14 at step 202 to detect any abnormal activities or events at step 204. When this occurs, the analytics engine 64 can coordinate with the alerts engine 66 to determine a level for the alert or notification, e.g., to determine whether or not the caregiver or service or family member should be contacted immediately, whether or not emergency medical care is required, etc.", ¶[0066]);

wherein the notification comprises a snapshot report indicating the condition associated with the individual ("The system can be configured to notify one or more people in the case of a notable event detected by the system ... through SMS, email, phone or other communication systems, performed automatically by the system", ¶[0037]; "The alert or notification is sent at step 208 using the determined format and channel appropriate for the condition, and the abnormal activity is logged at step 210 for establishing historical patterns, etc.", ¶[0066]) and an indication of one or more of: falls, bathing, sleeping, or physical activity level of the individual over a time period ("Typical behavior that can be identified ... Using an array of sensors around the home, a 'fingerprint' or 'signature', or other modeled output or indication of regular behavior can be determined, which is unique to the home and the resident(s) of that home.", ¶[0033]; "This signature can be ... used ... to build a model of the home ... can include additional metrics … to address other abnormal time periods such as vacations or holidays.", ¶[0034]; "the system can instead report at the metadata level ... informing subscribers of 'normal activity', 'abnormal activity', 'lack of activity', 'unusual events', etc.", ¶[0035]; "typical behavior may vary ... In the event that unusual activity is detected, the activity may be automatically assessed for risk. Some events may have a high likelihood of requiring critical attention (unusual noise, continuous water flow) or less critical attention (lack of motion or irregular consumption of utilities) that may not trigger urgent attention, but may still be flagged for follow up from either a family member or health care provider.", ¶[0036]) and wherein the data captured includes video data ("It is also possible to use modern video monitoring ... equipment to remotely view the person at risk", ¶[0007]).

Wright et al. disclose as stated above, except for expressly teaching the additional features discussed below.

Nagale et al. disclose wherein analyzing the captured data comprises analyzing the captured data using a trained machine learning model to identify the one or more anomalies ("The speech analyzer 336 may analyze the non-content speech features including, for example, volume, pitch, rhythm, speed, strength, steadiness, range, tone, and accuracy of speech, to generate a dysarthria indicator. The dysarthria indicator may have a numerical or categorical value, indicating a frequency or degree of patterns of continuous breathy voice, irregular breakdown of articulation, mono-pitch, distorted vowels, word flow without pauses, or hypernasality, among others", ¶[0068]; "Computationally intensive algorithms, such as machine-learning algorithms, may be implemented in and executed by the external data processor", ¶[0048]) and wherein the data captured by the plurality of sensors includes one or more of image data or video data ("The sensors 320 may include a camera 322 configured to capture a facial image of the patient … The facial image ... may be captured by the respective sensors", ¶[0066] – truncated; "Camera 322" of "sensor 320" in "Mobile device 301", Fig. 3). It is desirable to timely detect an early indicator and diagnosis of stroke.
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to use the teaching of Nagale et al. to modify the system and method of Wright et al. in order to provide the stroke risk indicator, as suggested by Nagale et al. (¶[0007]).

Wright et al. in view of Nagale et al. disclose as stated above, except for expressly teaching that the trained machine learning model is a trained neural network model, wherein the neural network model is trained using a dataset associated with the home environment, by adding one or more layers to the trained neural network model, wherein at least one layer of the one or more layers is associated with at least one selected from a group consisting of an activation function, a loss function, and an optimization function.

Madden discloses a trained neural network model, wherein the neural network model is trained using a dataset associated with a home environment, by adding one or more layers to the trained neural network model (¶[0053]: "Neural network 300 may correspond to trained ML model 204 and/or ML model 118 ... The plurality of input values 302 and plurality of input layer 304 may be interconnected ... Neural network 300 may include hidden layers ... wherein each hidden layer may comprise one or more interconnected neurons ... Neural network 300 may also include an output layer 308. Multiple layers of neural network 300 may correspond to respective models") {one of ordinary skill in the art understands that the neural network could use any kind of dataset}, wherein at least one layer of the one or more layers is associated with at least one selected from a group consisting of an activation function, a loss function, and an optimization function (¶[0057]: "In general, training a ML model may include establishing a network architecture, or topology ... adding layers including activation functions ... loss function, and optimizer").

It is desirable to minimize error and optimize performance, especially when the activity being sensed is unusual or indicates an abnormality of an individual. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to use the neural network of Madden in place of the machine learning algorithm of the system and method of Wright et al. in view of Nagale et al., which allows training with backpropagation, adjusting the weights and biases in the interconnected neurons to minimize errors and optimize performance.

Regarding claim 8, Wright et al. disclose a computer system for identifying a condition associated with an individual in a home environment, comprising: one or more sensors associated with the home environment ("the sensor system 12 deployed at a particular monitored property 14, which can include data acquired by multiple sensor units 40 each having multiple sensors 48", ¶[0063]); one or more processors configured to interface with the one or more sensors ("central processing unit (CPU) 52", ¶[0044]); and one or more memories storing instructions that, when executed by the one or more processors (¶[0069]; claim 17), cause the computer system to:

capture data detected by the one or more sensors ("In FIG. 1 a sensor system 12 is deployed at each of a number of monitored properties 14, and is communicable within the system 12 to provide data to a monitoring, analytics and notification system 16 that is accessible from or otherwise on or within a cloud 18 or cloud-based service (hereinafter the 'cloud or cloud-based system 16')", ¶[0038]; "the sensor system 12 deployed at a particular monitored property 14, which can include data acquired by multiple sensor units 40 each having multiple sensors 48", ¶[0063]);

analyze the captured data to identify one or more anomalies ("The received data is then saved in one or more historical logs at step 152, e.g., in the online database 22.
The data is then analyzed to determine patterns at step 154 … Based on the analyzing, the analytics engine 64 generates a signature for the monitored property 14 at step 158. The signature can include normal activities for certain days of the week, periods within each day, and further layers of granularity as desired", ¶[0063]; "The signature that is generated for that monitored property 14 is then output at step 160 to enable normal, abnormal, and lack of activity to be monitored against the signature", ¶[0065]);

determine, based upon the identified one or more anomalies, a condition associated with an individual in the home environment ("The signature that is generated for that monitored property 14 is then output at step 160 to enable normal, abnormal, and lack of activity to be monitored against the signature", ¶[0065]; "The data that is currently being evaluated is compared to the signature for the property 14 at step 202 to detect any abnormal activities or events at step 204", ¶[0066]); and

generate a notification indicating the condition associated with the individual ("The data that is currently being evaluated is compared to the signature for the property 14 at step 202 to detect any abnormal activities or events at step 204. When this occurs, the analytics engine 64 can coordinate with the alerts engine 66 to determine a level for the alert or notification, e.g., to determine whether or not the caregiver or service or family member should be contacted immediately, whether or not emergency medical care is required, etc.", ¶[0066]);

wherein the notification comprises a snapshot report, the snapshot report indicating the condition associated with the individual ("The system can be configured to notify one or more people in the case of a notable event detected by the system ... through SMS, email, phone or other communication systems, performed automatically by the system", ¶[0037]; "The alert or notification is sent at step 208 using the determined format and channel appropriate for the condition, and the abnormal activity is logged at step 210 for establishing historical patterns, etc.", ¶[0066]); and an indication of one or more of: falls, bathing, sleeping, or physical activity level of the individual over a time period ("Typical behavior that can be identified ... Using an array of sensors around the home, a 'fingerprint' or 'signature', or other modeled output or indication of regular behavior can be determined, which is unique to the home and the resident(s) of that home.", ¶[0033]; "This signature can be ... used ... to build a model of the home ... can include additional metrics … to address other abnormal time periods such as vacations or holidays.", ¶[0034]; "the system can instead report at the metadata level ... informing subscribers of 'normal activity', 'abnormal activity', 'lack of activity', 'unusual events', etc.", ¶[0035]; "typical behavior may vary ... In the event that unusual activity is detected, the activity may be automatically assessed for risk. Some events may have a high likelihood of requiring critical attention (unusual noise, continuous water flow) or less critical attention (lack of motion or irregular consumption of utilities) that may not trigger urgent attention, but may still be flagged for follow up from either a family member or health care provider.", ¶[0036]) and wherein the data captured includes video data ("It is also possible to use modern video monitoring ... equipment to remotely view the person at risk", ¶[0007]).

Wright et al. disclose as stated above, except for expressly teaching the additional features discussed below. Nagale et al.
disclose wherein analyzing the captured data comprises analyzing the captured data using a trained machine learning model to identify the one or more anomalies (“The speech analyzer 336 may analyze the non-content speech features including, for example, volume, pitch, rhythm, speed, strength, steadiness, range, tone, and accuracy of speech, to generate a dysarthria indicator. The dysarthria indicator may have a numerical or categorical value, indicating a frequency or degree of patterns of continuous breathy voice, irregular breakdown of articulation, mono-pitch, distorted vowels, word flow without pauses, or hypernasality, among others”, ¶[0068]; “Computationally intensive algorithms, such as machine-learning algorithms, may be implemented in and executed by the external data processor”, ¶[0048]) and wherein the data captured by the plurality of sensors includes one or more of image data or video data ("The sensors 320 may include a camera 322 configured to capture a facial image of the patient … The facial image ... may be captured by the respective sensors", ¶[0066] – truncated; “Camera 322” of “sensor 320” in “Mobile device 301”, Fig. 3 – extracted below). It is desirable to timely detect early indicator and diagnosis of stroke. Therefore, it would have been obvious to one ordinary skill in the art, before the effective filing date of the claimed invention, to use teaching of Nagale et al. to modify the system and method of Wright et al. in order to provide the stroke risk indicator, as suggested by Nagale et al. (¶[0007]). Wright et al. in view of Nagale et al. 
disclose as stated above, except for expressly teaching that the trained machine learning model is a trained neural network model, wherein the neural network model is trained using a dataset associated with a home environment, by adding one or more layers to the trained neural network model, wherein at least one layer of the one or more layers is associated with at least one selected from a group consisting of an activation function, a loss function, and an optimization function. [Image: extracted Fig. 3 of Nagale et al.] Madden discloses a trained neural network model (¶[0157]: As will be discussed in more detail later, GPU-based processing may be used as a training tool as part of a deep learning neural network as a way to extract meaningful health-related analytics from the large amount of acquired data from the wearable electronic device 100), wherein the neural network model is trained using a dataset associated with a home environment, by adding one or more layers to the trained neural network model (¶[0053]: Neural network 300 may correspond to trained ML model 204 and/or ML model 118 ... The plurality of input values 302 and plurality of input layer 304 may be interconnected ... Neural network 300 may include hidden layers ... wherein each hidden layer may comprise one or more interconnected neurons ... Neural network 300 may also include an output layer 308. Multiple layers of neural network 300 may correspond to respective models) {one of ordinary skill would understand that the neural network could use any kind of data set}, wherein at least one layer of the one or more layers is associated with at least one selected from a group consisting of an activation function, a loss function, and an optimization function (¶[0057] In general, training a ML model may include establishing a network architecture, or topology ... adding layers including activation functions ... loss function, and optimizer). 
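Madden's ¶[0057] recipe quoted above (establish a topology, then add layers with activation functions, a loss function, and an optimizer) can be illustrated with a short, neutral sketch. This is not the applicant's or any cited reference's actual implementation; the layer sizes, activations, and data below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # activation function for the hidden layer
    return np.maximum(0.0, x)

def sigmoid(x):
    # activation function for the output layer (anomaly score in (0, 1))
    return 1.0 / (1.0 + np.exp(-x))

# "Adding layers": each layer is a (weights, biases, activation) triple.
# Hypothetical sizes: 8 sensor features -> 16 hidden units -> 1 output.
layers = [
    (rng.normal(0.0, 0.1, (8, 16)), np.zeros(16), relu),
    (rng.normal(0.0, 0.1, (16, 1)), np.zeros(1), sigmoid),
]

def forward(x):
    # pass the input through each layer in turn
    for w, b, act in layers:
        x = act(x @ w + b)
    return x

def bce_loss(y_pred, y_true):
    # loss function: binary cross-entropy over anomaly / no-anomaly labels
    eps = 1e-9
    return float(-np.mean(y_true * np.log(y_pred + eps)
                          + (1.0 - y_true) * np.log(1.0 - y_pred + eps)))

x = rng.normal(size=(4, 8))                  # hypothetical sensor feature rows
y = np.array([[0.0], [1.0], [0.0], [1.0]])  # hypothetical labels
pred = forward(x)
print(pred.shape)  # (4, 1)
```

An optimizer would then repeatedly update the weight and bias arrays to reduce `bce_loss`, completing the architecture/activation/loss/optimizer recipe the Madden citation describes.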
It is desirable to minimize error and optimize performance, especially when the activity being sensed is unusual or indicates an abnormality of an individual. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to use the neural network of Madden in place of the machine learning algorithm of the system and method of Wright et al. in view of Nagale et al., which allows training with backpropagation, adjusting the weights and biases in the interconnected neurons to minimize errors and optimize performance. Regarding claim 15, Wright et al. disclose a non-transitory computer-readable storage medium having stored thereon a set of non-transitory instructions, executable by a processor (¶[0069]; claim 17), for identifying a condition associated with an individual in a home environment (abstract; figure 1 [system 16 performs functions for individual 26 in home environments 14 connecting with sensor systems 12, through network 18 with databases, services and sources]), the instructions comprising instructions for: obtaining data captured by a plurality of sensors associated with the home environment ("In FIG. 1 a sensor system 12 is deployed at each of a number of monitored properties 14, and is communicable within the system 12 to provide data to a monitoring, analytics and notification system 16 that is accessible from or otherwise on or within a cloud 18 or cloud-based service (hereinafter the "cloud or cloud-based system 16")", ¶[0038]; “the sensor system 12 deployed at a particular monitored property 14, which can include data acquired by multiple sensor units 40 each having multiple sensors 48”, ¶[0063]); analyzing the captured data to identify one or more anomalies ("The received data is then saved in one or more historical logs at step 152, e.g., in the online database 22. 
The data is then analyzed to determine patterns at step 154 … Based on the analyzing, the analytics engine 64 generates a signature for the monitored property 14 at step 158. The signature can include normal activities for certain days of the week, periods within each day, and further layers of granularity as desired", ¶[0063]; "The signature that is generated for that monitored property 14 is then output at step 160 to enable normal, abnormal, and lack of activity to be monitored against the signature", ¶[0065]); determining, based upon the identified one or more anomalies, the condition associated with the individual in the home environment ("The signature that is generated for that monitored property 14 is then output at step 160 to enable normal, abnormal, and lack of activity to be monitored against the signature", ¶[0065]; "The data that is currently being evaluated is compared to the signature for the property 14 at step 202 to detect any abnormal activities or events at step 204", ¶[0066]); and generating a notification indicating the condition associated with the individual ("The data that is currently being evaluated is compared to the signature for the property 14 at step 202 to detect any abnormal activities or events at step 204. When this occurs, the analytics engine 64 can coordinate with the alerts engine 66 to determine a level for the alert or notification, e.g., to determine whether or not the caregiver or service or family member should be contacted immediately, whether or not emergency medical care is required, etc.", ¶[0066]); wherein the notification comprises a snapshot report and the snapshot report includes an indication of the condition associated with the individual ("The system can be configured to notify one or more people in the case of a notable event detected by the system ... 
through SMS, email, phone or other communication systems, performed automatically by the system" ¶[0037]; "The alert or notification is sent at step 208 using the determined format and channel appropriate for the condition, and the abnormal activity is logged at step 210 for establishing historical patterns, etc." ¶[0066]); and an indication of one or more of: falls, bathing, sleeping, or physical activity level of the individual over a time period ("Typical behavior that can be identified ... Using an array of sensors around the home, a "fingerprint" or "signature", or other modeled output or indication of regular behavior can be determined, which is unique to the home and the resident(s) of that home." ¶[0033]; "This signature can be ... used ... to build a model of the home ... can include additional metrics … to address other abnormal time periods such as vacations or holidays." ¶[0034]; "the system can instead report at the metadata level ... informing subscribers of "normal activity", "abnormal activity", "lack of activity", "unusual events", etc." ¶[0035]; "typical behavior may vary ... In the event that unusual activity is detected, the activity may be automatically assessed for risk. Some events may have a high likelihood of requiring critical attention (unusual noise, continuous water flow) or less critical attention (lack of motion or irregular consumption of utilities) that may not trigger urgent attention, but may still be flagged for follow up from either a family member or health care provider." ¶[0036]) and wherein the data captured includes video data ("It is also possible to use modern video monitoring ... equipment to remotely view the person at risk", ¶[0007]). Wright et al. disclose as stated above, except for expressly teaching the additional features, discussed below. Nagale et al. 
disclose wherein analyzing the captured data comprises analyzing the captured data using a trained machine learning model to identify the one or more anomalies (“The speech analyzer 336 may analyze the non-content speech features including, for example, volume, pitch, rhythm, speed, strength, steadiness, range, tone, and accuracy of speech, to generate a dysarthria indicator. The dysarthria indicator may have a numerical or categorical value, indicating a frequency or degree of patterns of continuous breathy voice, irregular breakdown of articulation, mono-pitch, distorted vowels, word flow without pauses, or hypernasality, among others”, ¶[0068]; “Computationally intensive algorithms, such as machine-learning algorithms, may be implemented in and executed by the external data processor”, ¶[0048]) and wherein the data captured by the plurality of sensors includes one or more of image data or video data ("The sensors 320 may include a camera 322 configured to capture a facial image of the patient … The facial image ... may be captured by the respective sensors", ¶[0066] – truncated; “Camera 322” of “sensor 320” in “Mobile device 301”, Fig. 3 – extracted below). [Image: extracted Fig. 3 of Nagale et al.] It is desirable to timely detect early indicators and to diagnose stroke. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to use the teachings of Nagale et al. to modify the system and method of Wright et al. in order to provide the stroke risk indicator, as suggested by Nagale et al. (¶[0007]). Wright et al. in view of Nagale et al. 
disclose as stated above, except for expressly teaching that the trained machine learning model is a trained neural network model, wherein the neural network model is trained using a dataset associated with a home environment, by adding one or more layers to the trained neural network model, wherein at least one layer of the one or more layers is associated with at least one selected from a group consisting of an activation function, a loss function, and an optimization function. Madden discloses a trained neural network model (¶[0157]: As will be discussed in more detail later, GPU-based processing may be used as a training tool as part of a deep learning neural network as a way to extract meaningful health-related analytics from the large amount of acquired data from the wearable electronic device 100), wherein the neural network model is trained using a dataset associated with a home environment, by adding one or more layers to the trained neural network model (¶[0053]: Neural network 300 may correspond to trained ML model 204 and/or ML model 118 ... The plurality of input values 302 and plurality of input layer 304 may be interconnected ... Neural network 300 may include hidden layers ... wherein each hidden layer may comprise one or more interconnected neurons ... Neural network 300 may also include an output layer 308. Multiple layers of neural network 300 may correspond to respective models) {one of ordinary skill would understand that the neural network could use any kind of data set}, wherein at least one layer of the one or more layers is associated with at least one selected from a group consisting of an activation function, a loss function, and an optimization function (¶[0057] In general, training a ML model may include establishing a network architecture, or topology ... adding layers including activation functions ... loss function, and optimizer). 
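The combination rationale turns on training with backpropagation, i.e., iteratively adjusting weights and biases to reduce an error measure. A minimal sketch under stated assumptions (a single sigmoid layer, plain gradient descent, and synthetic data; none of this comes from the cited references):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical single-layer model: linear score -> sigmoid "anomaly" output.
w = rng.normal(0.0, 0.1, size=3)  # weights
b = 0.0                           # bias
lr = 0.5                          # optimizer: fixed-step gradient descent

x = rng.normal(size=(16, 3))       # hypothetical sensor feature vectors
y = (x[:, 0] > 0).astype(float)    # hypothetical anomaly labels

def forward(w, b):
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

def mse(p):
    # error measure to be minimized
    return float(np.mean((p - y) ** 2))

before = mse(forward(w, b))
for _ in range(200):
    p = forward(w, b)
    # backpropagation: chain rule through MSE and the sigmoid
    grad_score = 2.0 * (p - y) * p * (1.0 - p) / len(y)
    w -= lr * (x.T @ grad_score)   # adjust weights to reduce error
    b -= lr * grad_score.sum()     # adjust bias to reduce error
after = mse(forward(w, b))
print(before > after)  # the error decreased
```

The same weight-and-bias updates, applied layer by layer, are what "training with backpropagation" means in the rationale for substituting Madden's neural network.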
It is desirable to minimize error and optimize performance, especially when the activity being sensed is unusual or indicates an abnormality of an individual. Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to use the neural network of Madden in place of the machine learning algorithm of the system and method of Wright et al. in view of Nagale et al., which allows training with backpropagation, adjusting the weights and biases in the interconnected neurons to minimize errors and optimize performance. Regarding claims 2, 9 and 16, Wright et al. in view of Nagale et al. and further in view of Madden disclose as stated above. Wright et al. also disclose wherein the plurality of sensors in the home environment include one or more sensors configured to capture data indicative of electricity use by devices associated with the home environment (Figs. 4 and 5; "Aspects that may be monitored include, but are not limited to: temperature, water consumption, electricity or gas or other utility consumption, appliance or device usage, lighting, ambient noise, motion, ingress/egress to/from a building, air quality, etc." ¶[0031]). Regarding claims 3, 10 and 17, Wright et al. in view of Nagale et al. and further in view of Madden disclose as stated above. Wright et al. also disclose wherein the data indicative of electricity use includes an indication of at least one selected from a group consisting of: which device is using electricity; a time at which electricity is used by a particular device; a date at which electricity is used by a particular device; a duration of electricity use by a particular device; and a power source for the electricity use (Figs. 
4 and 5; "Aspects that may be monitored include, but are not limited to: temperature, water consumption, electricity or gas or other utility consumption, appliance or device usage, lighting, ambient noise, motion, ingress/egress to/from a building, air quality, etc." ¶[0031]). Regarding claims 4, 11 and 18, Wright et al. in view of Nagale et al. and further in view of Madden disclose as stated above. Wright et al. also disclose wherein the analyzing, by the one or more processors, the data captured by the plurality of sensors associated with the home environment using the trained neural network model (abstract; figure 1 [system 16 performs functions for individual 26 in home environments 14 connecting with sensor systems 12, through network 18 with databases, services and sources]) to identify the one or more anomalies comprises: analyzing, by the one or more processors, over a period of time, the data detected by the plurality of sensors to identify one or more patterns in the data; and comparing, by the one or more processors, the data detected by the plurality of sensors to the identified patterns in the data in order to identify instances in which the detected data is inconsistent with the identified patterns. ("The received data is then saved in one or more historical logs at step 152, e.g., in the online database 22. The data is then analyzed to determine patterns at step 154 … Based on the analyzing, the analytics engine 64 generates a signature for the monitored property 14 at step 158. 
The signature can include normal activities for certain days of the week, periods within each day, and further layers of granularity as desired", ¶[0063]; "The signature that is generated for that monitored property 14 is then output at step 160 to enable normal, abnormal, and lack of activity to be monitored against the signature", ¶[0065]) {analysis of data is inherently done over a period of time, i.e., it is not instantaneous but takes some time to perform}. Regarding claims 5, 12 and 19, Wright et al. in view of Nagale et al. and further in view of Madden disclose as stated above. Wright et al. also disclose wherein the determining, by the one or more processors, based upon the identified one or more anomalies, a condition associated with an individual in the home environment comprises: determining, by the one or more processors, that the one or more anomalies indicate one or more atypical behaviors of the individual in the home environment; and identifying, by the one or more processors, a condition associated with the one or more atypical behaviors of the individual. ("The signature that is generated for that monitored property 14 is then output at step 160 to enable normal, abnormal, and lack of activity to be monitored against the signature", ¶[0065]; "The data that is currently being evaluated is compared to the signature for the property 14 at step 202 to detect any abnormal activities or events at step 204. When this occurs, the analytics engine 64 can coordinate with the alerts engine 66 to determine a level for the alert or notification, e.g., to determine whether or not the caregiver or service or family member should be contacted immediately, whether or not emergency medical care is required, etc.", ¶[0066]). Regarding claims 6, 13 and 20, Wright et al. in view of Nagale et al. and further in view of Madden disclose as stated above. Wright et al. 
also disclose wherein the condition associated with the individual is a medical condition ("The data that is currently being evaluated is compared to the signature for the property 14 at step 202 to detect any abnormal activities or events at step 204. When this occurs, the analytics engine 64 can coordinate with the alerts engine 66 to determine a level for the alert or notification, e.g., to determine whether or not the caregiver or service or family member should be contacted immediately, whether or not emergency medical care is required, etc.", ¶[0066]). Regarding claims 7 and 14, Wright et al. in view of Nagale et al. and further in view of Madden disclose as stated above. Wright et al. also disclose wherein the condition associated with the individual is an emergency medical condition, the method further comprising: requesting, by the one or more processors ("central processing unit (CPU) 52", ¶[0044]), based upon the emergency medical condition, an emergency service to be provided to the individual ("The data that is currently being evaluated is compared to the signature for the property 14 at step 202 to detect any abnormal activities or events at step 204. When this occurs, the analytics engine 64 can coordinate with the alerts engine 66 to determine a level for the alert or notification, e.g., to determine whether or not the caregiver or service or family member should be contacted immediately, whether or not emergency medical care is required, etc.", ¶[0066]). Conclusion Examiner's note: As applied to the claims above, the specific columns, line numbers, and figures in the references have been cited for the Applicant’s convenience. Although the specified citations are representative of the teachings of the art and are applied to the particular limitations within the individual claims, other passages and figures may apply as well. 
The Applicant is respectfully requested to fully consider the references, in entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage taught by the prior art or disclosed by the Examiner, in preparing responses. Applicant(s) are reminded that MPEP 2123 I. states: “The use of patents as references is not limited to what the patentees describe as their own inventions or to the problems with which they are concerned. They are part of the literature of the art, relevant for all they contain.” In re Heck, 699 F.2d 1331, 1332-33, 216 USPQ 1038, 1039 (Fed. Cir. 1983) (quoting In re Lemelson, 397 F.2d 1006, 1009, 158 USPQ 275, 277 (CCPA 1968)). A reference may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art, including nonpreferred embodiments. Merck & Co. v. Biocraft Laboratories, 874 F.2d 804, 10 USPQ2d 1843 (Fed. Cir.), cert. denied, 493 U.S. 975 (1989). Reliance on the US Pre-Grant Publication (PG PUB) of this application, which is not part of the image file wrapper of the patent application, in the prosecution is improper. All references in the reply to the office action are to be made to the latest version on record of the patent application as filed, not as published. The latest version on record of the patent application means the patent application as originally filed and modified by previously entered amendment(s). THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. 
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Contact Information Any inquiry concerning this communication or earlier communications from the examiner should be directed to Nader Bolourchi whose telephone number is (571) 272-8064. The examiner can normally be reached on M-F 8:30 to 4:30. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Hannah S. Wang, SPE can be reached on (571) 272-9018. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300. Interviews are available via telephone and video conferencing using a USPTO web-based Video Conferencing and Collaboration Tool. To schedule an interview, Applicants are encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. Communications via Internet e-mail are at the discretion of the applicant. See MPEP § 502.03. Without a written authorization by applicant in place, the USPTO will not respond via Internet e-mail to any Internet correspondence which contains information subject to the confidentiality requirement as set forth in 35 U.S.C. 122 and will not initiate communications with applicants via Internet e-mail. The internet authorization must be submitted on a separate paper to be entitled to acceptance in accordance with 37 CFR 1.4(c). The separate paper will facilitate processing and avoid confusion. The written authorization may be submitted via EFS-Web, mail, or fax. 
It cannot be submitted by email. The following is a sample authorization form, which may be used by applicant: “Recognizing that Internet communications are not secure, I hereby authorize the USPTO to communicate with the undersigned and practitioners in accordance with 37 CFR 1.33 and 37 CFR 1.34 concerning any subject matter of this application by video conferencing, instant messaging, or electronic mail. I understand that a copy of these communications will be made of record in the application file.” A written authorization may be withdrawn by filing a signed paper clearly identifying the original authorization. The following is a sample form which may be used by applicant to withdraw the authorization: “The authorization given on______, to the USPTO to communicate with any practitioner of record or acting in a representative capacity in accordance with 37 CFR 1.33 and 37 CFR 1.34 concerning any subject matter of this application via video conferencing, instant messaging, or electronic mail is hereby withdrawn.” To facilitate processing of the internet communication authorization or withdrawal of authorization, the Office strongly encourages use of Form PTO/SB/439, filed via EFS-Web. The Form is available at: https://www.uspto.gov/sites/default/files/documents/sb0439.pdf. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at (866) 217-9197 (toll-free). 
If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (in USA, or CANADA) or 571-272-1000. /Nader Bolourchi/ Primary Examiner, Art Unit 2631

Prosecution Timeline

Dec 18, 2023
Application Filed
Jun 26, 2024
Non-Final Rejection — §103
Sep 03, 2024
Examiner Interview Summary
Sep 03, 2024
Applicant Interview (Telephonic)
Oct 01, 2024
Response Filed
Dec 30, 2024
Final Rejection — §103
Feb 17, 2025
Interview Requested
Feb 27, 2025
Applicant Interview (Telephonic)
Feb 28, 2025
Examiner Interview Summary
Apr 02, 2025
Request for Continued Examination
Apr 03, 2025
Response after Non-Final Action
Apr 10, 2025
Final Rejection — §103
Jul 08, 2025
Response after Non-Final Action
Jul 17, 2025
Request for Continued Examination
Jul 17, 2025
Response after Non-Final Action
Aug 05, 2025
Non-Final Rejection — §103
Oct 28, 2025
Examiner Interview Summary
Oct 28, 2025
Applicant Interview (Telephonic)
Oct 31, 2025
Response Filed
Jan 17, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596990
LOW-POWER SIGNALING FOR MEDICAL DEVICES AND MEDICAL DEVICE PERSONNEL
2y 5m to grant Granted Apr 07, 2026
Patent 12592723
DUPLEXER, MULTIPLEXER AND MULTIBAND FILTER
2y 5m to grant Granted Mar 31, 2026
Patent 12592729
Hybrid Distortion Suppression System and Method
2y 5m to grant Granted Mar 31, 2026
Patent 12579135
SYSTEM AND METHOD FOR DATA COLLECTION TO VALIDATE LOCATION DATA
2y 5m to grant Granted Mar 17, 2026
Patent 12580528
TRANSCEIVER CIRCUIT AND ASSOCIATED INTERFERENCE MITIGATION METHOD
2y 5m to grant Granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

6-7
Expected OA Rounds
82%
Grant Probability
94%
With Interview (+12.0%)
2y 8m
Median Time to Grant
High
PTA Risk
Based on 723 resolved cases by this examiner. Grant probability derived from career allow rate.
