DETAILED ACTION
Notice of Pre-AIA or AIA Status
● The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
● This action is responsive to the following communication: A response filed on 12/24/2025.
● Claims 1, 3-8, 10-14, 16-20 are currently pending; claims 2, 9, and 15 have been canceled.
Information Disclosure Statement
● The information disclosure statement (IDS) submitted on 1/28/2026 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3-8, 10-14, and 16-20 are rejected under 35 U.S.C. 103 as being unpatentable over Cuddihy et al (US 20050137465) in view of Stevens et al (US 20150339905).
Regarding claim 1, Cuddihy discloses a computer-implemented method of detecting periled individuals within a property, the method comprising:
receiving, by the hardware controller, sensor data (sensor data, figs. 4-7) from at least one sensor located within the property, of a plurality of sensors installed at various locations (figs. 4-7) on the property;
analyzing the sensor data by one or more processors, wherein analyzing the sensor data includes analyzing sensor data (sensor data, figs. 4-7) associated with a room of the property, and wherein analyzing the sensor data is based on accessing baseline sensor data associated with the room of the property (fig. 1);
based upon the analyzing the sensor data, determining that an individual located in the room of the property has remained in bed for an abnormal amount of time (abnormal sleep time, fig. 7);
responsive to determining that the individual in the room of the property has remained in bed for the abnormal amount of time (abnormal sleep time outside of threshold limit, fig. 7), generating a notification indicating (contact caregiver if wake-up time is outside of limit, fig. 7) that the individual in the room of the property remained in bed for the abnormal amount of time; and communicating the notification to an electronic device (fig. 1).
Cuddihy fails to teach and/or suggest wherein the sensor data includes imaging data.
Stevens, in the same field of endeavor of monitoring systems, teaches wherein the sensor data includes imaging data (imaging data via home security camera for monitoring sleeping data, fig. 1, pars. 16-20, 22, 48).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the monitoring system of Cuddihy such that the sensor data includes imaging data, as taught by Stevens, in order to improve monitoring of the individual by including imaging data (e.g., from a security camera).
Therefore, it would have been obvious to combine Cuddihy with Stevens to obtain the invention as specified in claim 1.
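The claim 1 limitations addressed above (analyzing sensor data against a room baseline, detecting an abnormal time in bed, and generating a notification) can be illustrated with a minimal sketch. All names, the baseline values, and the two-hour tolerance rule below are hypothetical, chosen only to make the claimed workflow concrete; they are not taken from either reference.

```python
# Hypothetical baseline: typical time-in-bed durations (hours) for the room.
BASELINE_HOURS = [7.5, 8.0, 7.0, 8.5, 7.8]

def is_abnormal_time_in_bed(observed_hours, baseline_hours, tolerance=2.0):
    """Flag a stay in bed whose duration deviates from the baseline
    mean by more than `tolerance` hours (illustrative rule only)."""
    mean = sum(baseline_hours) / len(baseline_hours)
    return abs(observed_hours - mean) > tolerance

def build_notification(observed_hours):
    # Mirrors the claimed step of generating a notification indicating that
    # the individual remained in bed for an abnormal amount of time.
    return (f"Alert: individual has remained in bed for "
            f"{observed_hours:.1f} hours, outside the expected range.")

observed = 11.0  # hours in bed, as inferred from the sensor data
if is_abnormal_time_in_bed(observed, BASELINE_HOURS):
    print(build_notification(observed))  # would then be sent to a device
```

In this sketch the "communicating the notification to an electronic device" step is reduced to a print call; any delivery mechanism (push message, pager, PDA per claim 4) would substitute there.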
Regarding claim 3, Cuddihy further discloses the computer-implemented method of claim 1, wherein the sensor data includes motion data (motion data, fig. 3).
Regarding claim 4, Cuddihy further discloses the computer-implemented method of claim 1, wherein the electronic device is a personal digital assistant device (PDA, par. 26).
Regarding claim 5, Cuddihy further discloses the computer-implemented method of claim 1, wherein analyzing the sensor data is further based on accessing baseline sensor data (figs. 4-7) associated with the individual.
Regarding claim 6, Cuddihy further discloses the computer-implemented method of claim 5, wherein analyzing the sensor data based on accessing baseline sensor data associated with the individual includes learning a routine (e.g. sleeping pattern, figs. 4-7) associated with the individual.
Regarding claim 7, Cuddihy further discloses the computer-implemented method of claim 1, wherein analyzing the sensor data comprises: analyzing the sensor data (fig. 4) to determine a current condition of the individual; receiving updated sensor data (fig. 4) from the at least one sensor; and determining, from the updated sensor data, that a current condition is maintained for a threshold period of time (fig. 7).
Claims 8, 10-14, and 16-20 recite limitations that are similar to, and within the same scope of invention as, those in claims 1 and 3-7 above; therefore, claims 8, 10-14, and 16-20 are rejected under the same rationale as described for claims 1 and 3-7.
Response to Arguments
● Applicant's arguments filed 12/24/2025 have been fully considered but they are not persuasive.
● Regarding claims 1, 8, and 14, Applicant argues that the cited prior art of record [Cuddihy et al (US 20050137465) in view of Stevens et al (US 20150339905)] fails to teach and/or suggest "based upon analyzing the sensor data, determining that an individual located in the room of the property has remained in bed for an abnormal amount of time" and "wherein the sensor data includes imaging data."
In response, Cuddihy discloses "based upon the analyzing the sensor data, determining that an individual located in the room of the property has remained in bed for an abnormal amount of time" (abnormal sleep time, fig. 7). See the passages below for details.
[0021] Another aspect of the invention is a method for allowing a caregiver to monitor the sleep patterns of a resident residing independently in a home. The method includes the steps of providing activity sensors for distribution throughout the home, collecting data from the activity sensors, communicating the data collected from the activity sensors to a database via a near real-time communication platform, and analyzing the data collected from the activity sensors using a hidden Markov modeling technique to determine if the data indicates a deviation from the comparison data, signaling an abnormal sleep pattern.
[0022] Another aspect of the invention is a method for allowing a caregiver to monitor the wake up times of a resident residing independently in a home. The method includes the steps of providing activity sensors for distribution throughout the home, collecting data from the activity sensors indicating a wake up time of the resident, communicating the collected data to a monitoring center remote from the home via a near real-time communication platform, and analyzing the data collected from the activity sensors to determine if the data indicates that the resident is not awake by the predetermined normal wake up time, signaling an abnormal wake up time.
[0041] Changes in sleep patterns, like changes in mobility patterns, can signal a medical problem. For example, a change in sleep patterns may be an indicator of depression, or it may be an indicator that a medication needs to be changed or that a recent change in medication is affecting the resident's sleep patterns. FIG. 5 illustrates an exemplary method for chronicling the sleep patterns of a resident of the home 12 to ascertain whether any particular sleep pattern is normal or abnormal. FIG. 6 illustrates one exemplary embodiment for analyzing sleep patterns utilizing a hidden Markov modeling technique.
[0043] Once the baseline sleep pattern data has been obtained, data is collected on the resident in the home 12 to facilitate monitoring of the resident's sleep patterns at Step 405. Finally, at Step 410, the sleep pattern data is analyzed to ascertain whether the data supports a conclusion that the resident's most recent sleep pattern is normal or abnormal with reference to the baseline sleep pattern data. If the data supports a conclusion that the most recent sleep pattern is abnormal, a report may be generated and communicated to the caregiver 38 via communication media described above. Alternatively, a further analysis step may be performed to ascertain whether the abnormality of the most recent sleep pattern is sufficiently abnormal to warrant a report to the caregiver 38. Whether a sleep pattern is considered sufficiently abnormal may be determined by a predetermined set of rules, feedback from the caregiver (which may assist in retraining the home), or a combination of the two.
[0046] With reference to FIGS. 5, 6, data is taken from the activity sensors 14 for a training period of time to set up a database of baseline sleep pattern data at Step 400. The sleep pattern data may automatically separate into similar sleep patterns. After sufficient baseline sleep pattern data has been obtained, various patterns of sleep will have been identified as being normal for a resident. Then, at Step 405, the resident's sleep patterns are monitored. A hidden Markov modeling technique is utilized to analyze the sleep patterns at Step 410. Transitions between the states are done in ten minute slices 435. This methodology accounts for a number of transition slices. Thus, if a resident's in sleep state 420 lasts for four hours, then the count of transition from in sleep state to in sleep state will be a count of twenty-four. The baseline sleep pattern data is used to predict whether the resident's most recent sleep pattern belongs to a previously identified sleep pattern, and therefore is normal, or whether it does not belong to any previously identified sleep pattern, and therefore is abnormal for that resident. If the resident's sleep pattern is determined to be abnormal, a report is generated identifying a deviation in the resident's sleep pattern.
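Paragraph [0046] above describes counting state transitions over ten-minute slices of sleep data. The sketch below is a hypothetical illustration of that counting step only (not of the full hidden Markov model); the state labels are invented, and note that it counts consecutive-slice pairs, so a four-hour sleep yields 23 self-transition pairs rather than the 24 per-slice count used in the quoted passage.

```python
from collections import Counter

def transition_counts(states):
    """Count transitions between consecutive ten-minute slices.

    `states` holds one label per ten-minute slice, e.g. "asleep"/"awake"
    (labels are illustrative, not taken from the reference).
    """
    return Counter(zip(states, states[1:]))

# Four hours asleep = 24 ten-minute slices, followed by one hour awake.
slices = ["asleep"] * 24 + ["awake"] * 6
counts = transition_counts(slices)
print(counts[("asleep", "asleep")])  # 23 consecutive-slice pairs within sleep
print(counts[("asleep", "awake")])   # 1 wake-up transition
```

In a full hidden Markov treatment, such counts would be normalized into transition probabilities and compared against the baseline sleep-pattern models to classify the most recent pattern as normal or abnormal.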
Stevens teaches wherein the sensor data includes imaging data (imaging data via home security camera for monitoring sleeping data, fig. 1, pars. 16-20, 22, 48). Stevens discloses a plurality of sensors, including a sensor for imaging data. See the passages below for more details.
[0014] Aspects of the present disclosure relate to various embodiments of a system and method for detecting a change in a home environment for a visually impaired user. More particular aspects relate to using image processing, sonar mapping, and motion sensing systems to detect the change in the home environment for the visually impaired user. The method and system may work on a number of devices and operating systems. Aspects of the present disclosure include monitoring a home environment using a set of sensors configured to collect environment data for a set of areas of the home environment. The set of sensors can include motion sensors, audio sensors, cameras, or other types of sensors. The method may also include detecting an environmental change in an area of the home environment. The environmental change may be a movement of an object (e.g., an object that may be an obstacle for a visually impaired user), a broken object, a substantial change in temperature, or another change that may affect the visually impaired user. The method may also include providing, in response to a triggering event, a notification of the environmental change.
[0015] In certain embodiments, collecting the environment data may include capturing, by an image capturing system, a first set of images for the set of areas of the home environment. In certain embodiments, detecting the environmental change in the area of the home environment includes capturing, in response to receiving a motion alert from a motion detection system coupled to the image capturing system, a second set of images for the set of areas. The method may then include identifying an object of the second set of images meeting an obstruction criteria. The obstruction criteria may include an object displacement threshold based on an origin position and a subsequent position, and an object size threshold based on a first dimension, a second dimension, and a third dimension. In certain embodiments, providing the notification of the environmental change may include notifying a user, using a mobile device communicatively connected to the image capturing system, of the identified object meeting the obstruction criteria in response to detecting a home arrival. In certain embodiments, the home arrival may be detected by a security system in response to a door unlock of a home. In certain embodiments, detecting the home arrival may include recording, by a sound capture device, a set of dog barks (including like noises, such as panting). In response to recording the set of dog barks, the method may include analyzing a bark frequency and a bark pitch of the set of dog barks. The method may include determining, based on the bark frequency and the bark pitch, occurrence of the home arrival.
[0016] In certain embodiments, collecting the environment data may include mapping, using a sonar mapping system, a first sonar map of the set of areas of the home environment. In certain embodiments, detecting the environmental change in the area of the home environment may include mapping, in response to receiving a noise alert from a sound detection system coupled to the sonar mapping system, a second sonar map of the set of areas. The method may also include identifying an object of the second sonar map meeting an obstruction criteria. In certain embodiments, providing the notification of the environmental change may include notifying, in response to an alert from a sleep sensor, using a mobile device communicatively connected to the image capturing system, a user of the identified object meeting the obstruction criteria. The alert from the sleep sensor may include a change in a pulse beyond a pulse threshold, and a change in a respiration frequency beyond a respiration threshold. In certain embodiments, the method may include collecting, by the sonar mapping system, a sound profile data including a background noise level of the set of areas. The method may also include creating, using sound profile data other than the background noise level, the sonar map.
[0017] In certain embodiments, collecting the environment data may include capturing, by an image capturing system, a first set of images for the set of areas of the home environment. In certain embodiments, detecting the environmental change in the area of the home environment includes capturing, in response to receiving a motion alert from a motion detection system coupled to the image capturing system, a second set of images for the set of areas. The method may then include identifying an object of the second set of images meeting an obstruction criteria. The obstruction criteria may include an object displacement threshold based on an origin position and a subsequent position, and an object size threshold based on a first dimension, a second dimension, and a third dimension. In certain embodiments, providing the notification of the environmental change may include notifying a user, using a mobile device communicatively connected to the image capturing system, of the identified object meeting the obstruction criteria in response to detecting an area transition. In certain embodiments, detecting the area transition may include monitoring the location of a cane of the user. The cane may be equipped with a RFID (radio-frequency identification) bit communicatively connected to a RFID tracking system. The method may also include detecting that the location of the cane has been displaced by a distance value greater than a distance threshold. In certain embodiments, detecting the area transition may include using a thermal imaging system to monitor a body heat signature. The method may also include detecting that the body heat signature has been displaced by a distance value greater than a distance threshold.
[0020] In certain embodiments, the set of sensors may be equipped to collect environment data for a set of areas of the home environment. The environment data may be video, audio, thermal, motion, infrared, or other data containing information related to the home environment. For example, the environment data may be video footage or still images, captured sounds, heat signatures, or other data collected by the set of sensors. As shown in FIG. 1, in certain embodiments, monitoring the home environment can include image capturing 104, sound capturing 106, and motion sensing 108. At image capturing block 104, method 100 can include using an image capturing system (e.g., cameras, etc.) to capture a set of images of one or more areas of the home environment. At sound capturing block 106, the method 100 can include using a sound capturing system to collect sound and noise data for one or more areas of the home environment. At motion sensing block 108, the method 100 can include using a motion-sensing system to collect motion data for one or more areas of the home environment. The areas of the home environment may be one or more regions or subdivisions of the home environment. The subdivisions of the home environment may be based on physical constructions (e.g., walls, doors, etc.) or based on function (kitchen, dining room, living room, etc.). As an example, the areas of the home environment may include a bedroom, a kitchen, a living room, a bathroom, and the like.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to THIERRY L PHAM whose telephone number is (571)272-7439. The examiner can normally be reached M-F, 11-6.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Hai Phan can be reached at 571-272-6338. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/THIERRY L PHAM/Primary Examiner, Art Unit 2654