DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more; see the analysis below.
Step 1
The invention claimed in claims 1-20 is directed to statutory categories of subject matter, as the claims recite an apparatus or a process.
Step 2A, Prong 1
Regarding Claims 1 and 17, the recited steps of “inputting” physiological data and “classifying” it into sleep stages based on the physiological data are directed to a mental process, i.e., concepts performed in the human mind (including by a human using the aid of pen and paper), and/or a mathematical concept. For example, these limitations simply amount to the mental process of a clinician reviewing data (such as physiological data) and performing a mental analysis of reviewing the information and classifying it into different stages. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind, then it falls within the “Mental Processes” grouping of abstract ideas. Accordingly, the claim recites an abstract idea.
Step 2A, Prong 2
Regarding Claims 1 and 17, the judicial exception is not integrated into a practical application. The claims include the additional elements of “receiving” data and causing a display to output the result. The step of “receiving . . .” the data amounts to insignificant, extra-solution activity in that it is mere data gathering. The step of “causing . . .” the display to output the result is insignificant, extra-solution activity in that it merely outputs a result. The processor (e.g., “processor”, “computer processor”, “cloud-computing device”, “mobile device”, “user device”) performing the computing steps is recited at a high level of generality (i.e., as a generic processor performing the generic computer function of determining outputs from inputs) such that it amounts to no more than mere instructions to apply the exception using a generic computer component. Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B
Regarding Claims 1 and 17, the claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed under Step 2A, Prong 2 above, the additional elements are “receiving” data and causing a display to output the result. The step of “receiving . . .” the data amounts to insignificant, extra-solution activity in that it is mere data gathering. The step of “causing . . .” the display to output the result is insignificant, extra-solution activity in that it merely outputs a result. The processor (e.g., “processor”, “computer processor”, “cloud-computing device”, “mobile device”, “user device”) performing the computing steps is recited at a high level of generality (i.e., as a generic processor performing the generic computer function of determining outputs from inputs) such that it amounts to no more than mere instructions to apply the exception using a generic computer component. Accordingly, these additional elements, considered individually and in combination, do not amount to significantly more than the judicial exception.
Additionally, per the Berkheimer requirement, the additional elements of a wearable sensor, a user device display, processing, and data storage are shown to be well-understood, routine, and conventional by: (1) Heneghan (see citations below); (2) Kinnunen (see citations below); (3) Haakma (see citations below, including the abstract, [0037]-[0038], [0049]-[0050], Fig. 7); and (4) US 20150190086 (see [0021]-[0024], Figs. 2-4). Furthermore, the wearable sensor more specifically being a ring is shown by: (1) Kinnunen (see Figs. 2-3 and citations below); and (2) US 20220031233 to Han et al. (see [0032], [0072], [0079]-[0080], [0088], Figs. 1-3B, 5, 11-13). As such, the elements are shown to be well-understood, routine, and conventional.
The claim limitations when viewed individually and in combination therefore do not amount to significantly more than the abstract idea itself. The claims are therefore ineligible.
Claims 2-16 and 18-20 merely further define the data gathering or the displaying of the result (insignificant, extra-solution activity) or further define elements of the inputting/classifying steps (i.e., they merely further define the mental process). Therefore, the claims do not include any additional elements that show integration into a practical application and do not include any additional elements that amount to significantly more than the abstract idea. The claims are ineligible.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-6, 8-10, and 13-20 are rejected under 35 U.S.C. 103 as being unpatentable over US 20180064388 to Heneghan et al. (hereinafter Heneghan; cited in IDS dated 1/6/26) in view of US 20180042540 to Kinnunen et al. (hereinafter Kinnunen).
Regarding Claim 1, an interpretation of Heneghan discloses a method for automatically detecting sleep stages, comprising:
receiving physiological data associated with a user from a wearable device, the physiological data collected via the wearable device throughout a time interval ([0125]-[0126] including “the sleep monitoring platform 200 comprises a wearable device 202, a secondary device 204, a network 206, and a backend system 208.”, [0127] including “a wearable device configured to be attached to the wearer's body, such as a wrist-worn band, watch, finger clip, ear-clip, chest-strap, ankle strap, or any other suitable device.”, [0131], [0134] including “collect pulse-related data and motion data during a sleep session for use in the techniques and systems discussed herein. . . . to obtain the motion data from the one or more motion sensors 410, obtain the pulse-related data from the one or more optical sensors 412, and to then transmit such data to other devices in the sleep monitoring platform.”, Figs. 2-5 see also [0005], [0116], [0129]-[0130], [0132]-[0133], [0135], [0230]; recites gathering physiological data during a sleep session on a wearable device and transmitting the data to/receiving the data at external devices);
inputting the physiological data into a machine learning classifier ([0118] including “various data-driven features for a given time interval of a sleep session may be derived from optical heart rate sensor data and/or accelerometer sensor data that is obtained in association with that time interval. These features may then be provided to a classifier that classifies that time interval into one of several classes or stages of sleep based on the values of one or more of the data-driven features.”, [0155]-[0156] see also [0230]);
classifying the physiological data, using the machine learning classifier, into at least one sleep stage of a plurality of sleep stages for at least a portion of the time interval ([0116] including “a sleep session for a person's sleep may be divided into a number of intervals, which are often referred to as “epochs,” and each such interval may be evaluated to determine what sleep state or stage the person was in during that interval. For example, the American Academy of Sleep Medicine Guidelines define four classes of sleep: W, N1-N2, N3, and random eye movement (REM), which may be viewed as corresponding to Wake (or Awake), Light Sleep (which may include both N1 and N2), Deep Sleep, and REM Sleep, respectively. Other approaches may include fewer or greater numbers of sleep stages.”, [0118] including ”These features may then be provided to a classifier that classifies that time interval into one of several classes or stages of sleep based on the values of one or more of the data-driven features.”, [0155]-[0156] see also [0230]; data is input into a ML classifier which determines the sleep state/stage); and
causing a graphical user interface of a user device ([0122], [0132], [0180], [0215], Figs. 13, 15, 22, 24-25 see also [0181]-[0183], [0230]) to display an indication of the at least one sleep stage of the plurality of sleep stages based at least in part on classifying the physiological data ([0122], [0180]-[0181] including “Once a person's sleep session has been characterized using the sleep monitoring platform and its classifier, the resulting data set for the sleep session may be broken down into various formats or analyzed and the presented to the person”, [0182]-[0183] including “A more detailed presentation of the sleep data may be provided by way of a hypnogram 1326, which is a type of timeline in which time intervals during the timeline corresponding with particular sleep stages”, Figs. 13-14 see also [0230]).
An interpretation of Heneghan may not explicitly disclose wherein the wearable sensor device is a ring.
However, in the same field of endeavor (medical diagnostic systems), Kinnunen teaches wherein the wearable sensor device is a ring ([0067]-[0068], Figs. 1-2 see also [0027], [0073], [0085] including “the hypnogram is determined based on the measured motion data, photoplethysmogram waveform dynamics and heart rate dynamics using a classification algorithm, a decision tree, a neural network or equivalent.”).
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the wearable device recited by Heneghan to more specifically be a ring as recited by Kinnunen because measurement on the finger is optimal for gathering HR/HRV optically and the ring wearable is lighter in weight than other wearable sensor devices ([0173]).
Regarding Claim 2, an interpretation of Heneghan further discloses wherein classifying the physiological data ([0116], [0118], [0155]-[0156], Figs. 5, 10, 13-14 see also [0230]) comprises:
classifying the physiological data collected throughout the time interval into a plurality of sleep intervals within the time interval ([0116] including “a sleep session for a person's sleep may be divided into a number of intervals, which are often referred to as “epochs,” and each such interval may be evaluated to determine what sleep state or stage the person was in during that interval.”, [0118] including “various data-driven features for a given time interval of a sleep session may be derived from optical heart rate sensor data and/or accelerometer sensor data that is obtained in association with that time interval. These features may then be provided to a classifier that classifies that time interval into one of several classes or stages of sleep based on the values of one or more of the data-driven features.”, [0155]-[0156], [0175], Figs. 10, 12-14, 25-26 see also [0119], [0178], [0230]; the reference discloses that the sleep session is made up of a series of time intervals and classifying a sleep stage (aka state, etc.) for each time interval); and
classifying each sleep interval of the plurality of sleep intervals into at least one of an awake sleep stage, a light sleep stage, a rapid eye movement sleep stage, or a deep sleep stage ([0116] including “For example, the American Academy of Sleep Medicine Guidelines define four classes of sleep: W, N1-N2, N3, and random eye movement (REM), which may be viewed as corresponding to Wake (or Awake), Light Sleep (which may include both N1 and N2), Deep Sleep, and REM Sleep, respectively. Other approaches may include fewer or greater numbers of sleep stages.”, [0118], [0155]-[0156], Figs. 10, 13-14 see also [0178], [0230]).
Regarding Claim 3, an interpretation of Heneghan further discloses causing the graphical user interface of the user device to display one or more sleep intervals of the plurality of sleep intervals ([0122], [0180]-[0181] including “Once a person's sleep session has been characterized using the sleep monitoring platform and its classifier, the resulting data set for the sleep session may be broken down into various formats or analyzed and the presented to the person”, [0182]-[0183] including “A more detailed presentation of the sleep data may be provided by way of a hypnogram 1326, which is a type of timeline in which time intervals during the timeline corresponding with particular sleep stages”, [0215], Figs. 13-14, 25-26 see also [0230]); and
causing the graphical user interface of the user device to display a classified sleep stage corresponding to each sleep interval of the one or more sleep intervals ([0122], [0180]-[0181] including “Once a person's sleep session has been characterized using the sleep monitoring platform and its classifier, the resulting data set for the sleep session may be broken down into various formats or analyzed and the presented to the person”, [0182]-[0183] including “A more detailed presentation of the sleep data may be provided by way of a hypnogram 1326, which is a type of timeline in which time intervals during the timeline corresponding with particular sleep stages”, Figs. 13-14, 24-26 see also [0230]).
Regarding Claim 4, an interpretation of Heneghan further discloses performing one or more normalization procedures on the physiological data ([0082] including “may be normalized before those one or more pulse data features are extracted”, [0140], [0160]-[0161] see also [0230]), wherein inputting the physiological data into the machine learning classifier comprises inputting the normalized physiological data into the machine learning classifier ([0082], [0140], [0160]-[0161] see also [0230]; extracted features may be normalized and the normalized features input into the classifier).
Regarding Claim 5, an interpretation of Heneghan further discloses identifying, using the machine learning classifier, a plurality of features associated with the physiological data ([0019], [0120]-[0121] including “Various tools for training, validating, and testing classifiers of various types may be found, for example, in the documentation for scikit-learn, which is a module developed for the Python programming language (http://scikit-learn.sourceforge.net).”, [0140], [0178] see also [0142]-[0156] discusses various features and the input of the features into a classifier, [0230]), wherein classifying the physiological data is based at least in part on identifying the plurality of features ([0019], [0140], [0155]-[0156], [0160]-[0161], [0178] see also [0120]-[0121], [0142]-[0154] discusses various features, [0230]).
Regarding Claim 6, an interpretation of Heneghan discloses wherein the plurality of features comprise a pattern between two or more parameters of the physiological data ([0081]-[0082], [0160] see also [0219]-[0220]), an average data value of the physiological data ([0081]-[0082] including “a mean heart rate. . .”, [0140], [0160]), and/or a comparison of a data value of the physiological data to a baseline data value for the user ([0081]-[0082], [0160] see also [0219]-[0220]), or any combination thereof.
Regarding Claim 8, an interpretation of Heneghan further discloses identifying a bed time associated with the user, a wake time associated with the user, or both, based at least in part on classifying the physiological data ([0130], [0198], Figs. 20, 22, 24); and causing the graphical user interface of the user device to display the bed time, the wake time, or both ([0198], Figs. 20, 22, 24 see also [0180], [0230]).
Regarding Claim 9, an interpretation of Heneghan further discloses transmitting, via the user device, the physiological data to one or more servers for classification ([0125]-[0126] including “the sleep monitoring platform 200 comprises a wearable device 202, a secondary device 204, a network 206, and a backend system 208.”, [0129] including “the wearable device 204 may access the backend system 208 via the secondary device 204.”, [0130]-[0131], [0134] including “collect pulse-related data and motion data during a sleep session for use in the techniques and systems discussed herein. . . . to obtain the motion data from the one or more motion sensors 410, obtain the pulse-related data from the one or more optical sensors 412, and to then transmit such data to other devices in the sleep monitoring platform.”, [0164], Figs. 2-5 see also [0005], [0116], [0127], [0129]-[0130], [0132]-[0133], [0135], [0230]; recites gathering physiological data during a sleep session (i.e., the time interval) on a wearable device and transmitting the data to/receiving the data at external devices).
Regarding Claim 10, an interpretation of Heneghan further discloses generating, using the user device, a value/parameter based at least in part on the physiological data ([0130], [0132], [0135], [0177], Figs. 2-3, 5 see also [0126], [0129], [0230]).
An interpretation of Heneghan may not explicitly disclose generating one or more scores associated with the user based at least in part on the physiological data, the one or more scores comprising a Sleep Score, a Readiness Score, or both.
However, in the same field of endeavor (medical diagnostic systems), Kinnunen teaches generating one or more scores associated with the user based at least in part on the physiological data, the one or more scores comprising a Sleep Score, a Readiness Score, or both ([0087], [0092]-[0093], [0117], [0120]-[0121], Figs. 9-12 see also [0027], [0122]-[0131], [0177]; data may be transferred from the wearable gathering the data to an external device that analyzes the data and presents a sleep score).
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the analysis recited by Heneghan to include scores as recited by Kinnunen because the determination and presentation of the scores and related information provides the user with more information and suggestions to improve those scores and optimize their use of free time ([0003], [0095], [0172]).
Regarding Claim 13, an interpretation of Heneghan further discloses causing the graphical user interface of the user device to display at least a subset of the physiological data ([0199], Figs. 20, 22, 24).
Regarding Claim 14, an interpretation of Heneghan further discloses wherein the physiological data comprises accelerometer data ([0131], [0134], [0136]-[0137], Figs. 2-4 see also [0230]) and heart rate data ([0131], [0134], [0137], Figs. 2-4 see also [0230]).
Regarding Claim 15, an interpretation of Heneghan may not explicitly disclose wherein the wearable device collects the physiological data from the user based on arterial blood flow within a finger of the user.
However, in the same field of endeavor (medical diagnostic systems), Kinnunen teaches wherein the wearable ring device collects the physiological data from the user based on arterial blood flow within a finger of the user ([0067], [0173], Fig. 2).
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the wearable device recited by Heneghan to more specifically be a ring as recited by Kinnunen because measurement on the finger is optimal for gathering HR/HRV optically and the ring wearable is lighter in weight than other wearable sensor devices ([0173]).
Regarding Claim 16, an interpretation of Heneghan further discloses wherein the wearable device collects the physiological data from the user using one or more red light emitting diodes and one or more green light emitting diodes ([0134] including “For example, green light is particularly well-suited for measuring heart rate, whereas a combination of red light and infrared light is particularly well-suited for measuring blood oxygenation levels.”).
An interpretation of Heneghan may not explicitly disclose wherein the wearable sensor device is a ring.
However, in the same field of endeavor (medical diagnostic systems), Kinnunen teaches wherein the wearable sensor device is a ring with light sources ([0067]-[0068], [0077]-[0078], Figs. 1-2 see also [0027]).
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the wearable device recited by Heneghan to more specifically be a ring as recited by Kinnunen because measurement on the finger is optimal for gathering HR/HRV optically and the ring wearable is lighter in weight than other wearable sensor devices ([0173]).
Regarding Claim 17, an interpretation of Heneghan discloses an apparatus for automatically detecting sleep stages (abstract, [0125]-[0126], Figs. 2-5 see also [0127]-[0129], [0230]), comprising:
a processor ([0132]-[0133], Figs. 2-3 see also [0135], [0177], [0230]);
memory coupled with the processor ([0132]-[0133], Figs. 2-3 see also [0135], [0230]); and
instructions stored in the memory and executable by the processor ([0131]-[0133], Figs. 2-3 see also [0135], [0177], [0230]) to cause the apparatus to:
receive physiological data associated with a user from a wearable device, the physiological data collected via the wearable device throughout a time interval ([0125]-[0126] including “the sleep monitoring platform 200 comprises a wearable device 202, a secondary device 204, a network 206, and a backend system 208.”, [0127] including “a wearable device configured to be attached to the wearer's body, such as a wrist-worn band, watch, finger clip, ear-clip, chest-strap, ankle strap, or any other suitable device.”, [0131], [0134] including “collect pulse-related data and motion data during a sleep session for use in the techniques and systems discussed herein. . . . to obtain the motion data from the one or more motion sensors 410, obtain the pulse-related data from the one or more optical sensors 412, and to then transmit such data to other devices in the sleep monitoring platform.”, Figs. 2-5 see also [0005], [0116], [0129]-[0130], [0132]-[0133], [0135], [0230]; recites gathering physiological data during a sleep session (i.e., the time interval) on a wearable device and transmitting the data to/receiving the data at external devices);
input the physiological data into a machine learning classifier ([0118] including “various data-driven features for a given time interval of a sleep session may be derived from optical heart rate sensor data and/or accelerometer sensor data that is obtained in association with that time interval. These features may then be provided to a classifier that classifies that time interval into one of several classes or stages of sleep based on the values of one or more of the data-driven features.”, [0155]-[0156] see also [0230]);
classify the physiological data, using the machine learning classifier, into at least one sleep stage of a plurality of sleep stages for at least a portion of the time interval ([0116] including “a sleep session for a person's sleep may be divided into a number of intervals, which are often referred to as “epochs,” and each such interval may be evaluated to determine what sleep state or stage the person was in during that interval. For example, the American Academy of Sleep Medicine Guidelines define four classes of sleep: W, N1-N2, N3, and random eye movement (REM), which may be viewed as corresponding to Wake (or Awake), Light Sleep (which may include both N1 and N2), Deep Sleep, and REM Sleep, respectively. Other approaches may include fewer or greater numbers of sleep stages.”, [0118] including ”These features may then be provided to a classifier that classifies that time interval into one of several classes or stages of sleep based on the values of one or more of the data-driven features.”, [0155]-[0156] see also [0230]; data is input into a ML classifier which determines the sleep state/stage); and
cause a graphical user interface of a user device ([0122], [0132], [0180], [0215], Figs. 13, 15, 22, 24-25 see also [0181]-[0183], [0230]) to display an indication of the at least one sleep stage of the plurality of sleep stages based at least in part on classifying the physiological data ([0122], [0180]-[0181] including “Once a person's sleep session has been characterized using the sleep monitoring platform and its classifier, the resulting data set for the sleep session may be broken down into various formats or analyzed and the presented to the person”, [0182]-[0183] including “A more detailed presentation of the sleep data may be provided by way of a hypnogram 1326, which is a type of timeline in which time intervals during the timeline corresponding with particular sleep stages”, Figs. 13-14 see also [0230]).
An interpretation of Heneghan may not explicitly disclose wherein the wearable sensor device is a ring.
However, in the same field of endeavor (medical diagnostic systems), Kinnunen teaches wherein the wearable sensor device is a ring ([0067]-[0068], Figs. 1-2 see also [0027], [0073], [0085] including “the hypnogram is determined based on the measured motion data, photoplethysmogram waveform dynamics and heart rate dynamics using a classification algorithm, a decision tree, a neural network or equivalent.”).
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the wearable device recited by Heneghan to more specifically be a ring as recited by Kinnunen because measurement on the finger is optimal for gathering HR/HRV optically and the ring wearable is lighter in weight than other wearable sensor devices ([0173]).
Regarding Claim 18, an interpretation of Heneghan further discloses: classify the physiological data collected throughout the time interval into a plurality of sleep intervals within the time interval ([0116] including “a sleep session for a person's sleep may be divided into a number of intervals, which are often referred to as “epochs,” and each such interval may be evaluated to determine what sleep state or stage the person was in during that interval.”, [0118] including “various data-driven features for a given time interval of a sleep session may be derived from optical heart rate sensor data and/or accelerometer sensor data that is obtained in association with that time interval. These features may then be provided to a classifier that classifies that time interval into one of several classes or stages of sleep based on the values of one or more of the data-driven features.”, [0155]-[0156], [0175], Figs. 10, 12-14, 25-26 see also [0119], [0178], [0230]; the reference discloses that the sleep session is made up of a series of time intervals and classifying a sleep stage (aka state, etc.) for each time interval); and
classify each sleep interval of the plurality of sleep intervals into at least one of an awake sleep stage, a light sleep stage, a rapid eye movement sleep stage, or a deep sleep stage ([0116] including “For example, the American Academy of Sleep Medicine Guidelines define four classes of sleep: W, N1-N2, N3, and random eye movement (REM), which may be viewed as corresponding to Wake (or Awake), Light Sleep (which may include both N1 and N2), Deep Sleep, and REM Sleep, respectively. Other approaches may include fewer or greater numbers of sleep stages.”, [0118], [0155]-[0156], Figs. 10, 13-14 see also [0178], [0230]).
Regarding Claim 19, an interpretation of Heneghan further discloses cause the graphical user interface of the user device to display one or more sleep intervals of the plurality of sleep intervals ([0122], [0180]-[0181] including “Once a person's sleep session has been characterized using the sleep monitoring platform and its classifier, the resulting data set for the sleep session may be broken down into various formats or analyzed and the presented to the person”, [0182]-[0183] including “A more detailed presentation of the sleep data may be provided by way of a hypnogram 1326, which is a type of timeline in which time intervals during the timeline corresponding with particular sleep stages”, [0215], Figs. 13-14, 25-26 see also [0230]); and
cause the graphical user interface of the user device to display a classified sleep stage corresponding to each sleep interval of the one or more sleep intervals ([0122], [0180]-[0181] including “Once a person's sleep session has been characterized using the sleep monitoring platform and its classifier, the resulting data set for the sleep session may be broken down into various formats or analyzed and the presented to the person”, [0182]-[0183] including “A more detailed presentation of the sleep data may be provided by way of a hypnogram 1326, which is a type of timeline in which time intervals during the timeline corresponding with particular sleep stages”, Figs. 13-14, 24-26 see also [0230]).
Regarding Claim 20, an interpretation of Heneghan further discloses perform one or more normalization procedures on the physiological data ([0082] including “may be normalized before those one or more pulse data features are extracted”, [0140], [0160]-[0161] see also [0230]), wherein inputting the physiological data into the machine learning classifier comprises inputting the normalized physiological data into the machine learning classifier ([0082], [0140], [0160]-[0161] see also [0230]; extracted features may be normalized and the normalized features input into the classifier).
Claim Rejections - 35 USC § 103
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Heneghan in view of Kinnunen, and further in view of US 20180125418 to Haakma et al. (hereinafter Haakma).
Regarding Claim 7, an interpretation of Heneghan discloses causing the graphical user interface of the user device to display a maximum and minimum of the heart rate ([0199], Figs. 20, 22, 24).
An interpretation of Heneghan may not explicitly disclose that the max and min parameters are features of the plurality of features.
However, in the same field of endeavor (medical diagnostic systems), Haakma teaches the max and min parameters are features of the plurality of features ([0037]-[0038] including “a feature refers to any parameter that can be obtained from evaluating a signal and thus is also termed as a signal feature. For instance, a signal feature can be a maximum or minimum value of the sensor signal during a specified time period”, [0050]).
It would have been prima facie obvious to one of skill in the art before the effective filing date of the claimed invention to have modified the features of the sleep staging analysis of Heneghan in view of Kinnunen to include max/min values of parameters as recited by Haakma because doing so is "obvious to try": choosing from a finite number of identified, predictable solutions with a reasonable expectation of success. Haakma recites that "signal feature" is a known term in signal processing and identifies a finite number of predictable features of generic signals to serve as features in an ML model, each with a reasonable expectation of success.
Claim Rejections - 35 USC § 103
Claim(s) 11 is/are rejected under 35 U.S.C. 103 as being unpatentable over Heneghan in view of Kinnunen in further view of US 20200397366 to Guazzi et al. (hereinafter Guazzi).
Regarding Claim 11, an interpretation of Heneghan may not explicitly disclose inputting a circadian rhythm adjustment model into the machine learning classifier, wherein classifying the physiological data is based at least in part on the circadian rhythm adjustment model.
However, in the same field of endeavor (medical diagnostic systems), Guazzi teaches inputting a circadian rhythm adjustment model into the machine learning classifier, wherein classifying the physiological data is based at least in part on the circadian rhythm adjustment model ([0003]-[0004], [0011], [0028]-[0029]; sleep staging incorporating the circadian rhythm).
It would have been prima facie obvious to one of skill in the art before the effective filing date of the claimed invention to have modified the sleep staging analysis of Heneghan in view of Kinnunen to include the circadian rhythm model element as disclosed by Guazzi because it is combining prior art elements (the analysis for sleep staging as recited by Heneghan with the additional consideration of circadian rhythm as recited by Guazzi) according to known methods to yield the predictable result of sleep stages that are based on the physiological data as well as the circadian rhythm.
Claim Rejections - 35 USC § 103
Claim(s) 12 is/are rejected under 35 U.S.C. 103 as being unpatentable over Heneghan in view of Kinnunen in further view of US 20110230790 to Kozlov (hereinafter Kozlov).
Regarding Claim 12, an interpretation of Heneghan further discloses receiving additional physiological data associated with the user from the wearable device, the additional physiological data collected via the wearable device throughout a second time interval ([0125]-[0127], [0131], [0134], [0201], Figs. 2-5, 22 see also [0005], [0116], [0129]-[0130], [0132]-[0133], [0135], [0230]; recites gathering physiological data during a plurality of sleep sessions on a wearable device and transmitting the data to/receiving the data at external devices);
inputting the additional physiological data into the machine learning classifier ([0118], [0155]-[0156] see also [0230]);
classifying the additional physiological data, using the machine learning classifier, into at least one sleep stage of the plurality of sleep stages for at least a portion of the second time interval ([0116], [0118], [0155]-[0156] see also [0230]; data is input into a ML classifier which determines the sleep state/stage), wherein classifying the additional physiological data is based at least in part on inputting the additional physiological data ([0116], [0118], [0155]-[0156] see also [0230]; data is input into a ML classifier which determines the sleep state/stage); and
causing the graphical user interface of the user device ([0122], [0132], [0180], [0215], Figs. 13, 15, 22, 24-25 see also [0181]-[0183], [0230]) to display an indication of the at least one sleep stage of the plurality of sleep stages within the second time interval based at least in part on classifying the additional physiological data ([0122], [0180]-[0181], [0182]-[0183], Figs. 13-14 see also [0230]).
An interpretation of Heneghan may not explicitly disclose wherein the wearable sensor device is a ring.
However, in the same field of endeavor (medical diagnostic systems), Kinnunen teaches wherein the wearable sensor device is a ring with light sources ([0067]-[0068], [0077]-[0078], Figs. 1-2 see also [0027]).
It would have been prima facie obvious to one of skill in the art before the effective filing date of the claimed invention to have modified the wearable device recited by Heneghan to more specifically be a ring as recited by Kinnunen because measurement on the finger is optimal for gathering HR/HRV optically and the ring wearable is lighter weight than other wearable sensor devices ([0173]).
An interpretation of Heneghan may not explicitly disclose that the classification is based on prior data (i.e., the physiological data) and current data (i.e., the additional physiological data).
However, in the same field of endeavor (medical diagnostic systems), Kozlov teaches refining a model by incorporating a first set of data, which then increases the accuracy of the analysis for an additional second set of data (abstract, [0179] see also [0137]-[0138]).
It would have been prima facie obvious to one of skill in the art before the effective filing date of the claimed invention to have modified the sleep staging analysis of Heneghan in view of Kinnunen to include the updating of the model using previous data as recited by Kozlov because it provides more accurate determinations ([0179]).
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to final Office action, see 37 CFR 1.113(c). A request for reconsideration while not provided for in 37 CFR 1.113(c) may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA /25, or PTO/AIA /26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1 and 17 are rejected on the ground of nonstatutory double patenting as being unpatentable over claim 10 (which depends from claim 1) of U.S. Patent No. 12268530 (‘530) in view of Heneghan. Claim 10 recites the gathering of physiological data and classifying it into sleep stages using an ML model with input physiological data. An interpretation of ‘530 may not explicitly disclose displaying on a GUI as claimed; however, Heneghan discloses this element (see citations in the rejection above). Combining the determination of ‘530 with the displaying of the hypnogram recited by Heneghan is merely the use of a known technique (displaying the result) to improve similar devices (methods, or products) in the same way.
Claims 1-20 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of copending Application No. 17/733864 (reference application). Although the claims at issue are not identical, they are not patentably distinct from each other because ’864 discloses the elements of the claims of the current application. Claim 1 of ‘864 discloses the elements of claim 1 of the current application: receiving/measuring physiological data, inputting the data into an ML classifier, classifying the data using the ML model into a sleep stage, and displaying the sleep stage on a GUI. The analysis applies likewise to the other claims.
This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Claims 1 and 17 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 3-4 and 19 of copending Application No. 18/145645 (reference application) in view of Heneghan. Although the claims at issue are not identical, they are not patentably distinct from each other because ’645 discloses the elements of the claims of the current application. Claim 3 of ‘645 discloses the elements of claim 1 of the current application: receiving/measuring physiological data and inputting the data into an ML classifier for classifying the data using the ML model into a sleep stage/cycle. An interpretation of ‘645 may not explicitly disclose displaying on a GUI as claimed; however, Heneghan discloses this element (see citations in the rejection above). Combining the determination of ‘645 with the displaying of the hypnogram recited by Heneghan is merely the use of a known technique (displaying the result) to improve similar devices (methods, or products) in the same way.
This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Claims 1-20 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of copending Application No. 18/584745 (reference application) in view of Heneghan. Although the claims at issue are not identical, they are not patentably distinct from each other because ’745 discloses the elements of the claims of the current application. Claim 1 of ‘745 discloses the elements of claim 1 of the current application: receiving/measuring physiological data and inputting the data into a classifier for classifying the data using the model into a sleep stage/cycle. An interpretation of ‘745 may not explicitly disclose that the “classifier” is an ML classifier or the displaying on a GUI as claimed; however, Heneghan discloses these elements (see citations in the rejection above). Combining the particular type of classifier (an ML classifier) with the analysis of ‘745, and the determination of stages of ‘745 with the displaying of the hypnogram recited by Heneghan, is merely combining prior art elements (the particular classifier of Heneghan with the analysis disclosed in ‘745; and the display of the hypnogram of Heneghan with the determined stages of ‘745) according to known methods to yield predictable results.
This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Claims 1 and 17 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claim 1 of copending Application No. 18/455048 (reference application) in view of Heneghan. Although the claims at issue are not identical, they are not patentably distinct from each other because ’048 discloses the elements of the claims of the current application. Claim 1 of ‘048 discloses the elements of claim 1 of the current application: receiving/measuring physiological data, inputting the data into a classifier for classifying the data using the model into a sleep stage/cycle, and displaying the result on a GUI. An interpretation of ‘048 may not explicitly disclose that the “classifier” is an ML classifier; however, Heneghan discloses this element (see citations in the rejection above). Combining the particular type of classifier (the ML classifier recited by Heneghan) with the analysis of ‘048 is merely combining prior art elements according to known methods to yield predictable results.
This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Claims 1-20 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of copending Application No. 18/455514 (reference application) in view of Heneghan. Although the claims at issue are not identical, they are not patentably distinct from each other because claim 1 of ‘514 discloses the elements of claim 1 of the current application: receiving/measuring physiological data, inputting the data into an ML classifier, and classifying the data using the ML model into a sleep stage. An interpretation of ‘514 may not explicitly disclose displaying on a GUI as claimed; however, Heneghan discloses this element (see citations in the rejection above). Combining the determination of stages of ‘514 with the displaying of the hypnogram recited by Heneghan is merely combining prior art elements (the display of the hypnogram of Heneghan with the determined stages of ‘514) according to known methods to yield predictable results. The analysis applies likewise to the other claims.
This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Claims 1 and 17 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of copending Application No. 18/612879 (reference application) in view of Heneghan. Although the claims at issue are not identical, they are not patentably distinct from each other because claims 1 and 13 of ‘879 disclose the elements of claim 1 of the current application: receiving/measuring physiological data, inputting the data into an ML classifier, and classifying the data using the ML model into a sleep stage. An interpretation of ‘879 may not explicitly disclose displaying on a GUI as claimed; however, Heneghan discloses this element (see citations in the rejection above). Combining the determination of stages of ‘879 with the displaying of the hypnogram recited by Heneghan is merely combining prior art elements (the display of the hypnogram of Heneghan with the determined stages of ‘879) according to known methods to yield predictable results. The analysis applies likewise to the other claims.
This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAMES R MOSS whose telephone number is (571)272-3506. The examiner can normally be reached Monday - Friday (9:30 am - 5:30 pm).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, James Kish, can be reached at (571) 272-5554. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/James Moss/Examiner, Art Unit 3792