DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s amendments necessitate new grounds of rejection under 35 U.S.C. § 103 in view of Reifman (U.S. Patent Application Publication No. 2018/0289314).
The Reifman reference teaches generating a first set of weighted physiological data (¶[0071] variable weights) based at least in part on inputting the first set of physiological data (¶[0075] PVT performance data) and a circadian rhythm model (¶¶[0058-0059] circadian rhythm model C) into a machine learning model (¶[0057] using a model); and subsequently classifying, using the machine learning model, the weighted first set of physiological data into the circadian rhythm chronotype based at least in part on generating the weighted first set of physiological data using the machine learning model and the circadian rhythm model (¶[0090], ¶[0093] circadian amplitude and phase).
Applicant’s amendments and remarks (see pp. 11-12 of the Remarks dated 12/23/2025) overcome the rejections under 35 U.S.C. § 101, as the claimed abstract idea is considered integrated into a practical application.
Applicant’s arguments with respect to the combination of Kinnunen and Youngblood, see p. 14 of the Remarks filed 12/23/2025, have been fully considered but are not persuasive.
Applicant first states that the “circadian rhythm” of Kinnunen is not shown to be equivalent to circadian rhythm chronotype as claimed. This is not found persuasive as Applicant has not provided any particular definition for circadian rhythm chronotype, and the Examiner’s position is that the circadian rhythm chronotype of a user is any representation of a user’s specific circadian rhythm. Furthermore, Kinnunen teaches that an individual user’s day-night cycle is substantially equivalent to the user’s circadian rhythm in ¶[0191] and ¶[0193].
Applicant’s remaining arguments with respect to the newly amended claim limitations have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Kinnunen et al. (U.S. Patent Application Publication No. 2017/0132946), hereinafter referred to as Kinnunen, in view of Youngblood et al. (U.S. Patent Application Publication No. 2022/0105308), hereinafter referred to as Youngblood, and further in view of Reifman (U.S. Patent Application Publication No. 2018/0289314), hereinafter referred to as Reifman.
Regarding claim 1, Kinnunen teaches a method for determining a circadian rhythm (¶[0062]) chronotype (¶[0088] morningness-eveningness type) on an application running on an operating system of a user device (¶[0170] application installed in the mobile communication device) and associated with a wearable device (¶¶[0117-0118] ring wearable associated with smartphone), comprising:
receiving, from the wearable device, a first set of physiological data measured from a user by the wearable device collected over a period of time (¶[0057] wearable device typically includes sensor data), the first set of physiological data comprising at least nighttime (¶[0072] body temperature at night) temperature data (¶[0057] temperature), activity data (¶[0057] movement), and sleep pattern data (¶[0063] sleeping pattern);
receiving, from the wearable device, a second set of physiological data measured from the user by the wearable device collected over a previous sleep day, the second set of physiological data comprising at least sleep pattern data (¶[0063] and ¶[0068], ¶[0083] measured over a period of days, ¶[0119]);
classifying, using an algorithm, the first set of physiological data into the circadian rhythm chronotype based at least in part on inputting the first set of physiological data into the algorithm (¶[0162]);
comparing, by the application that is configured for processing data received from the wearable device, the determined circadian rhythm chronotype and the received second set of physiological data (¶[0175]); and
transmitting, to a graphical user interface of the device, signaling causing the graphical user interface of the user device to display a message associated with the comparison, the determined circadian rhythm chronotype, the received second set of physiological data, or a combination thereof (¶[0176]).
Kinnunen does not make explicit the details of its “deep data analysis” and therefore does not teach a machine learning model.
Attention is drawn to the Youngblood reference, which teaches a machine learning model for sleep monitoring and recommendations (¶¶[0188-0190] and ¶[0243]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the sleep monitoring of Kinnunen to include a machine learning model, as taught by Youngblood, because it allows the system to make better predictions about what is helpful to an individual user (Youngblood ¶[0200]).
Kinnunen as modified does not teach generating a first set of weighted physiological data based at least in part on inputting the first set of physiological data and a circadian rhythm model into a machine learning model; and subsequently classifying, using the machine learning model, the weighted first set of physiological data into the circadian rhythm chronotype based at least in part on generating the weighted first set of physiological data using the machine learning model and the circadian rhythm model.
Attention is brought to the Reifman reference, which teaches generating a first set of weighted physiological data (¶[0071] variable weights) based at least in part on inputting the first set of physiological data (¶[0075] PVT performance data) and a circadian rhythm model (¶¶[0058-0059] circadian rhythm model C) into a machine learning model (¶[0057] using a model); and subsequently classifying, using the machine learning model, the weighted first set of physiological data into the circadian rhythm chronotype based at least in part on generating the weighted first set of physiological data using the machine learning model and the circadian rhythm model (¶[0090], ¶[0093] circadian amplitude and phase).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the analysis of Kinnunen as modified to include the two-stage machine learning model of Reifman to provide a benefit of transforming physiological data into actionable information and allow users to achieve peak performance (Reifman, ¶¶[0009-0010]).
Regarding claim 2, Kinnunen as modified teaches the method of claim 1.
Kinnunen teaches further comprising: causing the graphical user interface of the user device to display a graphical representation of an averaging of the sleep pattern data of the first set of physiological data over the period of time (Fig. 4, ¶[0184]).
Regarding claim 3, Kinnunen as modified teaches the method of claim 2.
Kinnunen further teaches wherein the averaging of the sleep pattern data (¶[0119] sleep parameters average or normal/typical are monitored) comprises an average wake time that the user wakes up (¶[0088] awakening time), an average bedtime that the user goes to sleep (¶[0088] bedtime), an average sleep midpoint time (¶[0088] sleep midpoint), an average sleep duration (¶[0184]), or a combination thereof.
Regarding claim 4, Kinnunen as modified teaches the method of claim 2.
Kinnunen teaches further comprising: overlaying the graphical representation of the averaging of the sleep pattern data of the first set of physiological data over the period of time against a representation of a twenty-four hour timespan (Fig. 4, each column in “too short sleep time” represents one full 24 hr day, and the night’s sleep associated with said day).
Regarding claim 5, Kinnunen as modified teaches the method of claim 4.
Kinnunen teaches further comprising: causing the graphical user interface of the user device to display a segment of the representation of the twenty-four hour timespan that comprises the averaging of the sleep pattern data of the first set of physiological data over the period of time (Fig. 4, each column in “too short sleep time” represents one full 24 hr day, and the night’s sleep associated with said day, the bar represents the segment as claimed).
Regarding claim 6, Kinnunen as modified teaches the method of claim 5.
Kinnunen does not teach wherein the segment represents the averaging of the sleep pattern data of the first set of physiological data over the period of time as a shaped portion having a first side indicating an average time the user goes to sleep, a second side indicating an average time the user wakes up, and a midpoint that is positioned between the first side and the second side and indicates an average time of a sleep midpoint of the user.
Attention is brought to the Youngblood reference, which teaches segments comprising a 24-hr period, including an arrangement of the physiological data over the period of time, (Fig. 67) including a first side indicating an average time the user goes to sleep (Fig. 67, bedtime), a second side indicating an average time the user wakes up (Fig. 67, Wake-up on a different “side” of the S curve), and a midpoint that is positioned between the first side and the second side and indicates an average time of a sleep midpoint of the user (Fig. 67, between deep and REM sleep zones is a midpoint, further the sleep profile screen in Fig. 80 of Youngblood appears to comprise these elements as well and ¶[0282]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the mobile application of Kinnunen to include additional displays of user data, as taught by Youngblood, because Youngblood teaches documenting a user’s progress over time to allow a user to experience success with achieving a goal, improving user motivation (Youngblood ¶[0191]).
Regarding claim 7, Kinnunen as modified teaches the method of claim 1.
Kinnunen teaches further comprising: identifying a time of night associated with a nighttime temperature minimum based at least in part on receiving the first set of physiological data, wherein classifying the first set of physiological data into the circadian rhythm chronotype is based at least in part on identifying the time of night associated with the nighttime temperature minimum (¶[0075] lowest body temperature included in circadian rhythm calculation).
Regarding claim 8, Kinnunen as modified teaches the method of claim 1.
Kinnunen teaches further comprising: processing, by the application, the sleep pattern data of the first set of physiological data to extract at least a standard deviation of a sleep midpoint (¶[0092]), a median wake time that the user wakes up, a median bedtime that the user goes to sleep, or a combination thereof;
processing, by the application, the activity data of the first set of physiological data to extract at least an average metabolic equivalent of task (MET) value, a time that the user is active (¶[0088] before noon), or both; and
processing, by the application, the nighttime temperature data to extract at least an average skin temperature, an average skin temperature for a plurality of highest temperature values of a consecutive twenty-four hour timespan (¶[0187]), an average skin temperature for a plurality of lowest temperature values of a consecutive twenty-four hour timespan, or a combination thereof,
wherein classifying the first set of physiological data into the circadian rhythm chronotype is based at least in part on processing, by the application, the sleep pattern data, the activity data, and the nighttime temperature data (¶[0162], the measurement data includes sleep pattern, activity, and nighttime temperature data).
Regarding claim 9, Kinnunen as modified teaches the method of claim 1.
Kinnunen teaches further comprising: determining a misalignment between the received second set of physiological data and the determined circadian rhythm chronotype based at least in part on comparing the determined circadian rhythm chronotype and the received second set of physiological data (¶[0095]).
Regarding claim 10, Kinnunen as modified teaches the method of claim 1.
Kinnunen further teaches wherein the message comprises a recommended time of day that the user is active (¶[0191]), a recommended wake time that the user wakes up (¶[0090]), a recommended bedtime that the user goes to sleep (¶[0192]), a recommended sleep duration, a recommended time of day that the user rests (¶[0129] day napping), a recommended time of day that the user is focused, a sleep alignment message (¶[0092]), a sleep misalignment message, or a combination thereof (Fig. 4).
Regarding claim 11, Kinnunen as modified teaches the method of claim 1.
Kinnunen further teaches wherein the nighttime temperature data comprises continuous nighttime temperature data (¶[0072] body temperature at night, ¶[0057] temperature).
Regarding claim 12, Kinnunen as modified teaches the method of claim 1.
Kinnunen further teaches wherein the wearable device comprises a wearable ring device (¶[0117]).
Regarding claim 13, Kinnunen as modified teaches the method of claim 1.
Kinnunen further teaches wherein the wearable device collects the first set of physiological data and the second set of physiological data from the user based on arterial blood flow, capillary blood flow, arteriole blood flow, or a combination thereof (¶[0056] reflectance based photoplethysmography with an infrared light and photodetector is responsive to blood flow).
Regarding claims 14-17 and 18-20, the claims are directed to an apparatus and a non-transitory computer-readable medium comprising substantially the same subject matter as claims 1-4 and 1-3, respectively, and are rejected under substantially the same rationale over Kinnunen, Youngblood, and Reifman.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMANDA L STEINBERG whose telephone number is (303)297-4783. The examiner can normally be reached Mon-Fri 8-4.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Unsu Jung can be reached at (571) 272-8506. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/AMANDA L STEINBERG/ Examiner, Art Unit 3792