Prosecution Insights
Last updated: April 19, 2026
Application No. 18/131,218

DETECTING AND PREVENTING SLEEPWALKING EVENTS

Final Rejection (§103, §112)
Filed
Apr 05, 2023
Examiner
WESTFALL, SARAH ANN
Art Unit
3791
Tech Center
3700 — Mechanical Engineering & Manufacturing
Assignee
Sleep Number Corporation
OA Round
2 (Final)
Grant Probability: 0% (At Risk)
OA Rounds: 3-4
To Grant: 3y 2m
With Interview: 0%

Examiner Intelligence

Career Allow Rate: 0% (0 granted / 5 resolved; -70.0% vs TC avg)
Interview Lift: +0.0% (minimal lift with vs. without interview, across resolved cases with interview)
Avg Prosecution: 3y 2m (typical timeline)
Career History: 52 total applications across all art units, 47 currently pending

Statute-Specific Performance

§101: 16.8% (-23.2% vs TC avg)
§102: 18.4% (-21.6% vs TC avg)
§103: 35.1% (-4.9% vs TC avg)
§112: 25.3% (-14.7% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 5 resolved cases
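How such dashboard figures are derived is not stated here; the following is a minimal sketch of one plausible derivation, assuming "allow rate" means granted over resolved and "vs TC avg" is a simple signed difference against the Tech Center average. The function names and that assumption are mine, not the analytics provider's.

```python
# Hypothetical reconstruction of the dashboard arithmetic shown above.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage; 0.0 when nothing has resolved."""
    return 0.0 if resolved == 0 else 100.0 * granted / resolved

def delta_vs_tc(examiner_rate: float, tc_avg: float) -> float:
    """Signed difference between the examiner's rate and the Tech Center average."""
    return examiner_rate - tc_avg

# Card figures: 0 granted out of 5 resolved, shown as -70.0% vs TC avg,
# which would imply a Tech Center average of roughly 70%.
career = allow_rate(granted=0, resolved=5)
implied_tc_avg = career - (-70.0)
print(f"career allow rate: {career:.1f}%  (implied TC avg ~ {implied_tc_avg:.1f}%)")
```

Note that five resolved cases is a very small sample, which is presumably why the per-statute deltas above are labeled estimates.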

Office Action

§103 §112
Detailed Action

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1, 4-12, and 17-22 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the enablement requirement.
The claim(s) contains subject matter which was not described in the specification in such a way as to enable one skilled in the art to which it pertains, or with which it is most nearly connected, to make and/or use the invention. Claims 1 and 19 recite the limitations "wherein the bed exit detection classifier uses a machine-learning model to determine bed exit probabilities during the sleep session; receive, as output from the bed exit detection classifier, a bed exit probability that indicates a likelihood that the user will begin to sleepwalk in the sleep session; and (ii) the bed exit probability satisfies a second threshold condition". These limitations are not disclosed within the specification. More specifically, the specification does not disclose a bed exit probability being output by a bed exit detection classifier, nor does it disclose that an output of a bed exit classifier indicates a likelihood that a user will begin to sleepwalk in a sleep session. There is no discussion of any “bed exit probability” in the originally filed specification. The only probability discussed in the specification is a probability of sleepwalking onset that is determined by a computer system. According to the specification, the sleep state classifier outputs a user’s sleep state (Paragraph [0008]) such as N3 (Paragraph [0246]). The bed exit detection classifier outputs whether a user has been detected as exiting the bed (Paragraph [0253]) and when said user may have exited a bed (Paragraph [0008]). 
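The enablement dispute turns on where in the pipeline the probability is produced. A minimal sketch of the data flow as the specification is characterized in this Office Action (per the cited Paragraphs [0008], [0011], [0246], and [0253]); all names, types, and the placeholder estimator are illustrative assumptions of mine, not text from the application:

```python
# Illustrative pipeline per the specification as read by the examiner: the
# classifiers emit a sleep state and a bed-exit detection, and it is the
# downstream computer system that generates the sleepwalking-onset probability.
from dataclasses import dataclass

@dataclass
class ClassifierOutputs:
    sleep_state: str         # e.g. "N3" (Paragraph [0246])
    bed_exit_detected: bool  # whether a bed exit was detected (Paragraph [0253])

def estimate_onset_probability(out: ClassifierOutputs) -> float:
    # Placeholder: no concrete formula is disclosed in the cited paragraphs.
    return 0.9

def sleepwalking_probability(out: ClassifierOutputs):
    """Computer-system step (Paragraph [0011]): only after both threshold
    conditions are satisfied is a sleepwalking-onset probability generated."""
    first_ok = out.sleep_state == "N3"  # first threshold condition
    second_ok = out.bed_exit_detected   # second threshold condition
    if not (first_ok and second_ok):
        return None
    return estimate_onset_probability(out)

# The rejected claims instead recite the bed exit classifier *itself* emitting
# a "bed exit probability" indicating sleepwalking likelihood - an output that
# this sketch, like the specification as the examiner reads it, never produces.
```

The contrast the sketch draws is exactly the examiner's point: the only probability in the pipeline comes from the computer system downstream of both classifiers, not from the bed exit classifier.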
The specification discloses that a computer system is used to generate a probability of a sleepwalking event for the user that is indicative of a likelihood that the user will begin to sleepwalk based on the satisfied threshold conditions from the sleep state classification and bed exit detection classification (Paragraph [0011] - the computer system being configured to… determine whether (i) the sleep state classification for the user satisfies a first threshold condition and (ii) the bed exit detection classification for the user satisfies a second threshold condition, generate, based on a determination that the first and the second threshold conditions are satisfied, a probability of a sleepwalking event for the user, and generate output based on the probability of the sleepwalking event for the user; Paragraph [0277] - The probability of the onset of the sleepwalking can indicate a likelihood that the user will experience the sleepwalking event within a threshold amount of time from a current time during the sleep session). While the probability of sleepwalking onset is an output of the computer system based on outputs of a sleep state classifier and bed exit classifier, it is not an output of the bed exit classifier. The specification also does not disclose a bed exit probability satisfying a second threshold. Claims not explicitly rejected above are rejected due to their dependence on the above claims.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. Claims 1, 4, 7-8, 10-12, and 18-22 are rejected under 35 U.S.C. 103 as being unpatentable over Shinar et. al.’635 (U.S. Publication Number 20140371635 – previously cited) in view of Shouldice et. al.’810 (U.S. Publication Number 20200367810), and further in view of Rodgers et. al.’843 (U.S. Publication Number 20090119843 – previously cited). Regarding Claim 1, Shinar et. al.’635 discloses a system for detecting sleepwalking events of a user in a bed (Paragraph [0229] - In response to the likelihood(s), and in response to the control unit further identifying that the subject is sleeping, the control unit generates a "sleepwalking" alert), the system comprising: at least one sensor (Paragraph [0096] - System 10 typically comprises a motion sensor 30, a control unit 14, and a user interface (U/I) 24); and a computer system in communication with the at least one sensor (Paragraph [0096] - motion sensor 30 is integrated into control unit 14, in which case user interface 24 is either also integrated into control unit 14 or remote from control unit 14. 
For some applications, control unit 14 and/or user interface 24 of system 10 are implemented in a mobile device (such as a cellular phone, a pager, and/or a tablet computer)), the computer system configured to: receive sensor data from the at least one sensor during a sleep session of a user of a bed (Paragraph [0099] - The motion of subject 12 sensed by sensor 30, during sleep, for example, may include regular breathing movement, heartbeat-related movement, and other, unrelated body movements, as discussed below, or combinations thereof. For some applications, sensor 30 comprises a standard communication interface (e.g. USB), which enables connection to standard monitoring equipment); provide, as input to a sleep state classifier, a first portion of the sensor data, wherein the sleep state classifier uses a machine-learning model to determine the user's sleep states during the sleep session (Paragraph [0106] - Breathing pattern analysis module 22 may be used (e.g., to facilitate ascertaining a sleep stage of a subject); Paragraph [0112] - For some applications, in response to an input to system 10, the pattern identification module operates in a learning mode, in which the module learns characteristic patterns of the subject); receive, as output from the sleep state classifier, a sleep state classification for the user (Paragraph [0226] - control unit 14 identifies a sleep stage of subject 12; Paragraph [0228] - analyzing the signal from sensor 316 (e.g., motion sensor 30), control unit 14 identifies that the subject is sleeping); provide, as input to a bed exit detection classifier, a second portion of the sensor data, wherein the bed exit detection classifier uses a machine-learning model to determine when the user exits the bed during the sleep session (Paragraph [0116] - the pattern identification module operates in a learning mode, in which the module learns characteristic patterns of the subject, as described hereinabove. 
For some applications, respective first and second motion sensors are placed underneath the subject and the second person who uses the bed; Paragraph [0198] - Control unit 14 is configured to analyze the signal from the sensor, and, in response thereto, calculate a bed-exit likelihood, which is a likelihood that the person has left the resting surface); receive, as output from the bed exit detection classifier, a bed exit probability that indicates a likelihood that the user has left the resting surface (Paragraph [0198] - Control unit 14 is configured to analyze the signal from the sensor, and, in response thereto, calculate a bed-exit likelihood, which is a likelihood that the person has left the resting surface; Paragraph [0229] - the control unit identifies a likelihood that the subject has left the resting surface); determine whether the sleep state classification for the user satisfies a first threshold condition (Paragraph [0194] - the control unit is configured to track the history of awakenings of the subject and/or of the other person, thus allowing the control unit to learn the proper threshold) of N3 sleep state (Paragraph [0224] - identified sleep stage to the subject, only if the identified sleep stage is a slow-wave (i.e., deep) sleep stage. 
(The output may include a technical description of the sleep stage, e.g., "NREM stage 3", and/or a non-technical description, e.g., "deep sleep".)), and determine whether the bed exit detection classification for the user satisfies a second threshold condition (Paragraph [0150] - control unit 14 is configured to "learn" an amplitude threshold Th by analyzing a portion of the motion signal (e.g., portion 501a) that was generated in response to motion (e.g., cardiac and/or respiratory motion) of the subject, and calculating threshold Th based on the analysis; Paragraph [0151] - If the amplitude has changed by a threshold amount, it is likely the subject moved relative to the sensor, and thus, the large body-movement is likely that of the subject. (FIG. 32 shows A4 being different from A3, but not by more than the threshold amount, such that portion 506 is determined to have been generated by motion of the second person; Paragraph [0253] - the detected bed exit(s) of the user satisfies a second threshold condition in block 2010. The second threshold condition can be whether the user has been detected as exiting the bed (e.g., detection of at least one bed exit event))); identify, based on a determination that the first and the second threshold conditions are satisfied, a sleepwalking event for the user is imminent (Paragraph [0229] - In response to the likelihood(s), and in response to the control unit further identifying that the subject is sleeping, the control unit generates a "sleepwalking" alert. For example, if the signal gave no indication that the subject woke prior to the indication that the subject left the bed, an alert will be generated. In response to the alert, a caregiver may come to the subject's aid); and generate output based on identification that the sleepwalking event is imminent (Paragraph [0229] - the control unit generates a "sleepwalking" alert. 
For example, if the signal gave no indication that the subject woke prior to the indication that the subject left the bed, an alert will be generated). Shinar et. al.’635 fails to explicitly disclose determining a sleepwalking event based on a sleep state classification of N3 and a bed exit classification. Shouldice et. al.’810 teaches identifying a sleepwalking event based on motion detected during a slow wave – N3 – sleep stage (Paragraph [0308] - Sleepwalking is more likely to occur during deep slow wave sleep (SWS). The sleep staging algorithm of the processing device 100 of the system that detects the user's sleep stages, can detect the wake-type movement as the person gets out of bed (or back in to bed) with paradoxical respiratory patterns that are consistent with REM or deep sleep (e.g., based on the standard deviation of normalized running median filtered breathing rate estimate and other features). Where the absence periods are greater than a sleep walking threshold (e.g., >5 mins), the processing device 100 of the system can flag an area of suspected sleep walking). It would have been obvious to one of ordinary skill in the art at the time the invention was effectively filed to have modified the system of Shinar et. al.’635 that involves identifying a sleepwalking event based on an output from a sleep stage identifier and motion detector (Paragraph [0229]) to include selecting deep sleep – N3 stage – as a preferred sleep state in order to increase the probability of detecting a sleepwalking event based on the concept that sleepwalking is most likely to occur by a person during slow wave/deep sleep/N3 stage of a sleeping cycle as seen in Shouldice et. al.’810. Shinar et. al.’635 in view of Shouldice et al.’810 fails to disclose an output comprising lowering the bed before the sleepwalking event occurs. Rodgers et. 
al.’843 teaches a bed that receives instructions from a controller in order to lower a bed height whenever a user is exiting the bed (Paragraph [0103] - IRCC 312 – computer controller - can lower bed 304 from its specified height to some lower height in response to determining that movement data from one or more of cameras 308(a,b), 310(a,b), and 316 is sufficiently similarly to one or more movement pattern data sets generally and/or specifically indicative of an attempt by patient 302 to exit bed 304. Lowering of support platform reduces the potential fall distance of patient 302). It would have been obvious to one of ordinary skill in the art at the time the invention was effectively filed to have modified the system of Shinar et al. ’635 in view of Shouldice et al. ’810 to include lowering a bed to a specified height whenever a pattern of a user exiting the bed is identified in order to reduce potential fall distance of the user as seen in Rodgers et al. ’843.

Regarding Claim 4, Shinar et al. ’635 in view of Shouldice et al. ’810 and further in view of Rodgers et al. ’843 discloses the system outlined in Claim 1 above. Shinar et al. ’635 further discloses the second threshold condition is a minimum likelihood that the sleepwalking event will occur within a threshold time range from a current time (Paragraph [0198] - Control unit 14 is configured to analyze the signal from the sensor, and, in response thereto, calculate a bed-exit likelihood, which is a likelihood that the person has left the resting surface and/or a likelihood that the person is preparing to leave the resting surface. ("Preparing to leave the resting surface" may be defined, for example, as "intending to leave the resting surface within a given period of time, e.g., 2 minutes")).

Regarding Claim 7, Shinar et al. ’635 in view of Shouldice et al. ’810 and further in view of Rodgers et al. ’843 discloses the system outlined in Claim 4 above. Shinar et.
al.’635 further discloses generating the output comprises storing the sleep state classification (Paragraph [0185] - identifying the sleep-related parameter(s), as well as generally, in identifying that a person is sleeping and/or in identifying a sleep stage of the person), a bed exit probability (Paragraph [0198] - Control unit 14 is configured to analyze the signal from the sensor, and, in response thereto, calculate a bed-exit likelihood, which is a likelihood that the person has left the resting surface; Paragraph [0229] - the control unit identifies a likelihood that the subject has left the resting surface), and a total time in a data store (Paragraph [0181] - For example, for each of the subjects, the control unit may be configured to identify one or more of the following parameters; Paragraph [0182] - A length of time for which the subject has been sleeping, e.g., a length of time for which the subject has been in a deep sleep).

Regarding Claim 8, Shinar et al. ’635 in view of Shouldice et al. ’810 and further in view of Rodgers et al. ’843 discloses the system outlined in Claim 1 above. Shinar et al. ’635 further discloses storing an identification that the sleepwalking event is imminent in a data store (Paragraph [0190] - generate a report that shows a history – data store - of the at least one sleep-related parameter – sleepwalking event - for each of the subjects. The report may be generated at regular intervals, such as after every night. The generation of such a report may be helpful in avoiding conflicts).

Regarding Claim 10, Shinar et al. ’635 in view of Shouldice et al. ’810 and further in view of Rodgers et al. ’843 discloses the system outlined in Claim 1 above. Shinar et.
al.’635 further discloses generating the output comprises generating a notification indicating that the sleepwalking event was identified during the user's sleep session (Paragraph [0229] - In response to the likelihood(s), and in response to the control unit further identifying that the subject is sleeping, the control unit generates a "sleepwalking" alert. For example, if the signal gave no indication that the subject woke prior to the indication that the subject left the bed, an alert will be generated).

Regarding Claim 11, Shinar et al. ’635 in view of Shouldice et al. ’810 and further in view of Rodgers et al. ’843 discloses the system outlined in Claim 1 above. Shinar et al. ’635 further discloses the computer system is further configured to transmit the notification to a user device of the user for presentation in a graphical user interface (GUI) display when the user wakes up from the sleep session (Paragraph [0096] - Typically, user interface 24 includes a display. For some applications, motion sensor 30 is integrated into control unit 14, in which case user interface 24 is either also integrated into control unit 14 or remote from control unit 14).

Regarding Claim 12, Shinar et al. ’635 in view of Shouldice et al. ’810 and further in view of Rodgers et al. ’843 discloses the system outlined in Claim 1 above. Shinar et al. ’635 further discloses the computer system is further configured to transmit the notification to a user device of a healthcare provider associated with the user, wherein the notification is a machine-instruction to engage an automated device (Paragraph [0104] - Alternatively or additionally, the user interface 24 comprises a wireless or wired communication port for relaying the acquired raw data and/or processed data to a remote site for further analysis, interpretation, expert review, and/or clinical follow-up; Paragraph [0107] - User interface 24 is configured to notify subject 12 and/or a clinician of the predicted or occurring episode).
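Taken together, the examiner's Claim 1 mapping of the three-reference combination reduces to a simple two-condition gate. The following sketch illustrates that combination as articulated in the rejection (N3 stage per Shouldice ’810 Paragraph [0308], bed-exit likelihood threshold per Shinar ’635 Paragraph [0198], alert per Shinar ’635 Paragraph [0229], bed lowering per Rodgers ’843 Paragraph [0103]); the class name, the 0.8 threshold, and the alert/bed interfaces are placeholders of mine, as no numeric values appear in the claims:

```python
# Sketch of the Shinar/Shouldice/Rodgers combination as mapped against Claim 1.

class SleepwalkingMonitor:
    def __init__(self, bed_exit_threshold: float = 0.8):  # placeholder value
        self.bed_exit_threshold = bed_exit_threshold
        self.alerts: list = []
        self.bed_lowered = False

    def handle_epoch(self, sleep_state: str, bed_exit_probability: float) -> bool:
        """Return True when a sleepwalking event is identified as imminent."""
        first_ok = sleep_state == "N3"                            # Shouldice [0308]
        second_ok = bed_exit_probability >= self.bed_exit_threshold  # Shinar [0198]
        if first_ok and second_ok:
            self.alerts.append("sleepwalking alert")  # Shinar [0229]: generate alert
            self.bed_lowered = True                   # Rodgers [0103]: reduce fall distance
            return True
        return False
```

Whether this gate is in fact disclosed across the combination is, of course, the substance of the §103 dispute; the sketch only restates the logic the rejection attributes to the combined references.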
Regarding Claim 18, Shinar et. al.’635 in view of Shouldice et. al.’810 and further in view of Rodgers et. al.’843 discloses the system outlined in Claim 1 above as well as an output from the bed exit detection classifier further comprises a time value at which the bed presence of the user is detected (Paragraph [0231] - Control unit 14 receives a plurality of inputs indicative of postures of the person at the respective times), and the bed exit detection probability is based at least in part on historic data of detected sleepwalking events for the user (Paragraph [0202] - the person usually sleeps between 11 pm and 6 am, the control unit may select a "high" intensity in response to a likelihood of 90% at 7 am, but a "low" intensity in response to a likelihood of 90% at 3 am). Regarding Claim 19, Shinar et. al.’635 discloses a method for detecting sleepwalking events of a user in a bed (Paragraph [0229] - In response to the likelihood(s), and in response to the control unit further identifying that the subject is sleeping, the control unit generates a "sleepwalking" alert), the method comprising: receiving, by a computer system, sensor data from at least one sensor during a sleep session of a user of a bed (Paragraph [0099] - The motion of subject 12 sensed by sensor 30, during sleep, for example, may include regular breathing movement, heartbeat-related movement, and other, unrelated body movements, as discussed below, or combinations thereof. For some applications, sensor 30 comprises a standard communication interface (e.g. 
USB), which enables connection to standard monitoring equipment); providing, by the computer system and as input to a sleep state classifier, a first portion of the sensor data, wherein the sleep state classifier uses a machine-learning model to determine the user's sleep states during the sleep session (Paragraph [0112] - For some applications, in response to an input to system 10, the pattern identification module operates in a learning mode, in which the module learns characteristic patterns of the subject); receiving, by the computer system and as output from the sleep state classifier, a sleep state classification for the user (Paragraph [0106] - Breathing pattern analysis module 22 may be used (e.g., to facilitate ascertaining a sleep stage of a subject); Paragraph [0226] - control unit 14 identifies a sleep stage of subject 12; Paragraph [0228] - analyzing the signal from sensor 316 (e.g., motion sensor 30), control unit 14 identifies that the subject is sleeping); providing, by the computer system and as input to a bed exit detection classifier, a second portion of the sensor data, wherein the bed exit detection classifier uses a machine-learning model to determine when the user exits the bed during the sleep session (Paragraph [0116] - For some applications, in response to an input to system 10, the pattern identification module operates in a learning mode, in which the module learns characteristic patterns of the subject; Paragraph [0198] - Control unit 14 is configured to analyze the signal from the sensor, and, in response thereto, calculate a bed-exit likelihood, which is a likelihood that the person has left the resting surface); receiving, by the computer system and as output from the bed exit detection classifier, a bed exit probability that indicates a likelihood that the user has left the resting surface (Paragraph [0198] - Control unit 14 is configured to analyze the signal from the sensor, and, in response thereto, calculate a bed-exit 
likelihood, which is a likelihood that the person has left the resting surface; Paragraph [0229] - the control unit identifies a likelihood that the subject has left the resting surface); determining, by the computer system, whether the sleep state classification for the user satisfies a first threshold condition (Paragraph [0194] - the control unit is configured to track the history of awakenings of the subject and/or of the other person, thus allowing the control unit to learn the proper threshold) of N3 sleep state (Paragraph [0224] - identified sleep stage to the subject, only if the identified sleep stage is a slow-wave (i.e., deep) sleep stage. (The output may include a technical description of the sleep stage, e.g., "NREM stage 3", and/or a non-technical description, e.g., "deep sleep".)), and determining, by the computer system, whether the bed exit detection classification for the user satisfies a second threshold condition (Paragraph [0150] - control unit 14 is configured to "learn" an amplitude threshold Th by analyzing a portion of the motion signal (e.g., portion 501a) that was generated in response to motion (e.g., cardiac and/or respiratory motion) of the subject, and calculating threshold Th based on the analysis.; Paragraph [0151] - If the amplitude has changed by a threshold amount, it is likely the subject moved relative to the sensor, and thus, the large body-movement is likely that of the subject. (FIG. 32 shows A4 being different from A3, but not by more than the threshold amount, such that portion 506 is determined to have been generated by motion of the second person.); Paragraph [0253] - the detected bed exit(s) of the user satisfies a second threshold condition in block 2010. 
The second threshold condition can be whether the user has been detected as exiting the bed (e.g., detection of at least one bed exit event)); identifying, by the computer system and based on a determination that the first and the second threshold conditions are satisfied, a sleepwalking event for the user is imminent (Paragraph [0229] - In response to the likelihood(s), and in response to the control unit further identifying that the subject is sleeping, the control unit generates a "sleepwalking" alert. For example, if the signal gave no indication that the subject woke prior to the indication that the subject left the bed, an alert will be generated. In response to the alert, a caregiver may come to the subject's aid); and generating, by the computer system, output based on identification that the sleepwalking event is imminent (Paragraph [0229] - the control unit generates a "sleepwalking" alert. For example, if the signal gave no indication that the subject woke prior to the indication that the subject left the bed, an alert will be generated). Shinar et al. ’635 fails to explicitly disclose determining a sleepwalking event based on a sleep state classification of N3 and a bed exit classification. Shouldice et al. ’810 teaches identifying a sleepwalking event based on motion detected during a slow wave – N3 – sleep stage (Paragraph [0308] - Sleepwalking is more likely to occur during deep slow wave sleep (SWS). The sleep staging algorithm of the processing device 100 of the system that detects the user's sleep stages, can detect the wake-type movement as the person gets out of bed (or back in to bed) with paradoxical respiratory patterns that are consistent with REM or deep sleep (e.g., based on the standard deviation of normalized running median filtered breathing rate estimate and other features).
Where the absence periods are greater than a sleep walking threshold (e.g., >5 mins), the processing device 100 of the system can flag an area of suspected sleep walking). It would have been obvious to one of ordinary skill in the art at the time the invention was effectively filed to have modified the method of Shinar et. al.’635 that involves identifying a sleepwalking event based on an output from a sleep stage identifier and motion detector (Paragraph [0229]) to include selecting deep sleep – N3 stage – as a preferred sleep state in order to increase the probability of detecting a sleepwalking event based on the concept that sleepwalking is most likely to occur by a person during slow wave/deep sleep/N3 stage of a sleeping cycle as seen in Shouldice et. al.’810. Shinar et. al.’635 also fails to disclose an output comprising physically changing the bed before the sleepwalking event occurs. Rodgers et. al.’843 teaches a bed that receives instructions from a controller in order to lower a bed height whenever a user is exiting the bed (Paragraph [0103] - IRCC 312 – computer controller - can lower bed 304 from its specified height to some lower height in response to determining that movement data from one or more of cameras 308(a,b), 310(a,b), and 316 is sufficiently similarly to one or more movement pattern data sets generally and/or specifically indicative of an attempt by patient 302 to exit bed 304. Lowering of support platform reduces the potential fall distance of patient 302). It would have been obvious to one of ordinary skill in the art at the time the invention was effectively filed to have modified the method of Shinar et. al.’635 in view of Shouldice et. al.’810 to include lowering a bed to a specified height whenever a pattern of a user exiting the bed is identified in order to reduce potential fall distance of the user as seen in Rodgers et. al.’843. Regarding Claim 20, Shinar et. al.’635 in view of Shouldice et. 
al.’810 and further in view of Rodgers et al. ’843 discloses the method outlined in Claim 19 above. Shinar et al. ’635 further discloses generating the output comprises generating a notification indicating that the sleepwalking event was identified during the user's sleep session (Paragraph [0229] - In response to the likelihood(s), and in response to the control unit further identifying that the subject is sleeping, the control unit generates a "sleepwalking" alert. For example, if the signal gave no indication that the subject woke prior to the indication that the subject left the bed, an alert will be generated).

Regarding Claim 21, Shinar et al. ’635 in view of Shouldice et al. ’810 and further in view of Rodgers et al. ’843 discloses the system outlined in Claim 1 above. Shinar et al. ’635 further discloses the machine-learning model comprises data generated from training with cardiac parameters (Paragraph [0150] - In apparatus 500b, control unit 14 is configured to "learn" an amplitude threshold Th by analyzing a portion of the motion signal (e.g., portion 501a) that was generated in response to motion (e.g., cardiac and/or respiratory motion) of the subject, and calculating threshold Th based on the analysis).

Regarding Claim 22, Shinar et al. ’635 in view of Shouldice et al. ’810 and further in view of Rodgers et al. ’843 discloses the system outlined in Claim 21 above. Shinar et.
al.’635 further discloses wherein the machine-learning model further comprises data generated from training with user-motion parameters, respiration parameters, and cardiorespiratory coupling parameters (Paragraph [0103] - Pattern analysis module 16 typically comprises one or more of the following modules: a breathing pattern analysis module 22, a heartbeat pattern analysis module 23, a cough analysis module 26, a restlessness analysis module 28, a blood pressure analysis module 29, and an arousal analysis module 31; Paragraph [0150] - In apparatus 500b, control unit 14 is configured to "learn" an amplitude threshold Th by analyzing a portion of the motion signal (e.g., portion 501a) that was generated in response to motion (e.g., cardiac and/or respiratory motion) of the subject, and calculating threshold Th based on the analysis). It is noted by the examiner that Shinar et. al.’635 recited “Techniques described herein may be practiced in combination with techniques described in one or more of the following patents and patent applications, which are incorporated herein by reference” (Paragraph [0249]) as well as “the scope of the present invention includes both combinations and sub-combinations of the various features described hereinabove, as well as variations and modifications thereof that are not in the prior art” (Paragraph [0279]). Claims 5-6 and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Shinar et. al.’635 (U.S. Publication Number 20140371635 – previously cited) in view of Shouldice et. al.’810 (U.S. Publication Number 20200367810), further in view of Rodgers et. al.’843 (U.S. Publication Number 20090119843 – previously cited), as applied to Claim 1 above, and further in view of Contant et. al.’052 (WO Publication 2012006052 – previously cited). Regarding Claim 5, Shinar et. al.’635 in view of Shouldice et. al.’810 and further in view of Rodgers et. 
al.’843 discloses the system outlined in Claim 4, but fails to disclose wherein the threshold time range is 1 to 3 hours. Contant et. al.’052 teaches a threshold time range of around 90 minutes for one cycle (Paragraph [0037] - Stages three and four are the deepest levels of sleep and occur only in the first third of the sleep period. NREM stage four usually takes up 12 to 15 percent of total sleep time. Sleep terrors, sleep walking, and bedwetting episodes generally occur within stage four or during partial arousals from this sleep stage. It typically takes about 90 minutes to cycle through the four deepening stages of NREM sleep). It would have been obvious to one of ordinary skill in the art at the time the invention was effectively filed to have modified the system of Shinar et. al.’635 in view of Shouldice et. al.’810 and further in view of Rodgers et. al.’843 to include a level or weight of likelihood that a sleepwalking event takes place around 90 minutes after a person falls asleep in order to connect the sleepwalking patterns of the user to stages of sleep that the user was in as seen in Contant et. al.’052. Regarding Claim 6, Shinar et. al.’635 in view of Shouldice et. al.’810 and further in view of Rodgers et. al.’843 discloses the system outlined in Claim 4, but fails to disclose wherein a threshold time range is 1 to 2 sleep cycles. Contant et. al.’052 teaches a threshold time range includes the amount of time to go through one sleep cycle (Paragraph [0036] - the health management system 10 includes a device configured to detect sleep cycles of a user using sensors 35 (such as a motion sensor 38); Paragraph [0037] - Sleep terrors, sleep walking, and bedwetting episodes generally occur within stage four or during partial arousals from this sleep stage. It typically takes about 90 minutes to cycle through the four deepening stages of NREM sleep). 
It would have been obvious to one of ordinary skill in the art at the time the invention was effectively filed to have modified the system of Shinar et al.’635 in view of Shouldice et al.’810 and further in view of Rodgers et al.’843 to include a level or weight of likelihood that a sleepwalking event takes place toward the end of one sleep cycle after a person falls asleep, in order to connect the sleepwalking patterns of the user to the stages of sleep the user was in, as seen in Contant et al.’052.

Regarding Claim 9, Shinar et al.’635 in view of Shouldice et al.’810 and further in view of Rodgers et al.’843 discloses the system outlined in Claim 8, but fails to disclose that the identification that the sleepwalking event is imminent includes information about a time during the sleep session when the sleepwalking event was identified. Contant et al.’052 teaches that identification of a sleepwalking event includes information about a time during a user's sleep session when the sleepwalking event was identified (Paragraph [0041] - the device includes a learning function to learn when a specific event reoccurs frequently while sleeping (e.g., a nightmare or night terrors, sleepwalking or somnambulism, enuresis or a urinating accident or bedwetting, and so on) which, e.g., is useful to more precisely optimize the alert or signal functionality. For example, the device can (directly or indirectly) detect in the sleep cycle some violent movements (e.g., using an accelerometer) that correspond to nightmares or night terrors – which could include sleepwalking, as identified above – and record every night at what time the events occur). It would have been obvious to one of ordinary skill in the art at the time the invention was effectively filed to have modified the system of Shinar et al.’635 in view of Shouldice et al.’810 and further in view of Rodgers et al.’843 to include in the user's history report a time when a sleepwalking event occurs, in order to optimize the learning function and alert system of the device as seen in Contant et al.’052.

Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over Shinar et al.’635 (U.S. Publication Number 20140371635 – previously cited) in view of Shouldice et al.’810 (U.S. Publication Number 20200367810), further in view of Rodgers et al.’843 (U.S. Publication Number 20090119843 – previously cited), as applied to Claim 1 above, and further in view of Mushtaq et al.’225 (U.S. Publication Number 20230218225).

Regarding Claim 17, Shinar et al.’635 in view of Shouldice et al.’810 and further in view of Rodgers et al.’843 discloses the system outlined in Claim 1 above. Shinar et al.’635 further discloses generating instructions to the controller of the bed that cause the bed to be adjusted (Paragraph [0221] - In some applications, if control unit 14 identifies, in response to analyzing the signal following the beginning of execution of the waking routine, that the subject has woken, the control unit changes the angle of the resting surface. For example, the control unit may move the upper portion of the bed to a more upright position, in order to facilitate the subject's exit from bed), but fails to disclose the controller adjusting a temperature of the bed. Mushtaq et al.’225 teaches a climate control system for a bed of a user to promote higher-quality sleep (Paragraph [0109] - by adjusting the temperature of the bed 302 and/or the room in which the bed 302 is located, the user 308 can experience more improved sleep quality and comfort; Paragraph [0134] - the control circuitry 334 can generate control signals to cause one or more heating or cooling elements on the surface of the bed 302 to change temperature at various times, either in response to user interaction with the bed 302, at various pre-programmed times, based on user preference, and/or in response to detecting microclimate temperatures of the user 308 on the bed 302; Paragraph [0256] - Preparing the environment 1900 for the user 1904 can be advantageous to improve the user 1904's ability to fall asleep, remain asleep, and experience improved sleep quality). It would have been obvious to one of ordinary skill in the art at the time the invention was effectively filed to have modified the system of Shinar et al.’635 in view of Shouldice et al.’810 and further in view of Rodgers et al.’843 to include a climate control system connected to a bed of a user in order to encourage a more favorable sleep environment for the user and assist the user in remaining asleep.

Response to Arguments

Applicant's arguments filed 03 December 2025 have been fully considered, but they are not entirely persuasive. Applicant's amendments have overcome the prior 35 U.S.C. 112(b) rejections; however, 35 U.S.C. 112(a) rejections have been addressed in Paragraph 3 above. Applicant's amendments and arguments regarding the prior 35 U.S.C. 101 rejections were found persuasive and have overcome those rejections. Claims 1, 4-12, and 17-22 are rejected under 35 U.S.C. 103 as necessitated by amendment, as discussed in Paragraphs 4-6 above. The examiner has considered the applicant's argument that their invention constitutes new art because the sleepwalking prediction of Shinar et al.’635 does not depend on identifying a user in the N3 stage of sleep, but this argument was found not persuasive. The examiner has cited an additional reference, as necessitated by amendment, that explains how this particular N3-stage element is taught by Shouldice et al.’810 and how these teachings would have been an obvious modification to Shinar et al.’635. These modifications would provide further application to the control unit of Shinar et al.’635, which identifies a sleep stage of a user and further provides a description of the identified sleep stage (Paragraph [0204] - control unit 14 is configured to identify a sleep stage of person; Paragraph [0224] - The output may include a technical description of the sleep stage, e.g., "NREM stage 3").

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SARAH ANN WESTFALL, whose telephone number is (571) 272-3845. The examiner can normally be reached Monday-Friday, 7:30am-4:30pm EST.
Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jennifer Robertson, can be reached at (571) 272-5001. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SARAH ANN WESTFALL/
Examiner, Art Unit 3791

/ETSUB D BERHANU/
Primary Examiner, Art Unit 3791
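The reply periods quoted in the conclusion of the final action reduce to simple calendar-month arithmetic. Below is a minimal sketch of those dates for this action's Feb 19, 2026 mailing date (illustrative only, not legal advice; the advisory-action adjustment is omitted because it depends on a mailing date that is not yet known, and the function names are ours, not any USPTO API):

```python
from calendar import monthrange
from datetime import date

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping the day to the last day of the
    target month (e.g., Jan 31 + 1 month -> Feb 28/29)."""
    y, m = divmod(d.month - 1 + months, 12)
    y += d.year
    m += 1
    return date(y, m, min(d.day, monthrange(y, m)[1]))

def reply_window(mailing: date) -> dict:
    """Key dates for replying to a final action, per the periods quoted above."""
    return {
        # file by here to benefit from the advisory-action rule
        "two_month_rule": add_months(mailing, 2),
        # shortened statutory period: reply due absent extensions of time
        "shortened_period": add_months(mailing, 3),
        # absolute statutory maximum with extensions under 37 CFR 1.136(a)
        "statutory_max": add_months(mailing, 6),
    }

w = reply_window(date(2026, 2, 19))  # mailing date of this final action
print(w["shortened_period"], w["statutory_max"])  # 2026-05-19 2026-08-19
```

Any extension fee under 37 CFR 1.17(a) would then be keyed to whichever of the shortened-period or advisory-action dates controls, which this sketch does not model.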

Prosecution Timeline

Apr 05, 2023: Application Filed
Aug 08, 2025: Non-Final Rejection (§103, §112)
Sep 11, 2025: Applicant Interview (Telephonic)
Sep 11, 2025: Examiner Interview Summary
Dec 03, 2025: Response Filed
Feb 19, 2026: Final Rejection (§103, §112) [current]


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 0%
With Interview: 0% (+0.0%)
Median Time to Grant: 3y 2m
PTA Risk: Moderate

Based on 5 resolved cases by this examiner. Grant probability derived from career allow rate.
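The projection figures above are simple ratios over this examiner's resolved cases: the career allow rate doubles as the grant probability, and the interview lift is the difference between allow rates with and without an interview. A minimal sketch of that arithmetic (function names are illustrative, not part of any real analytics API):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate: fraction of resolved cases that were granted."""
    return granted / resolved if resolved else 0.0

def interview_lift(rate_with: float, rate_without: float) -> float:
    """Change in allow rate (as a fraction) when an interview was held."""
    return rate_with - rate_without

# This examiner: 0 granted out of 5 resolved -> 0% grant probability,
# and no lift observed in resolved cases with an interview.
print(allow_rate(0, 5))          # 0.0
print(interview_lift(0.0, 0.0))  # 0.0
```

With only five resolved cases, both ratios carry wide uncertainty; the 0% figure reflects a small sample rather than a precise estimate.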
