Prosecution Insights
Last updated: April 19, 2026
Application No. 18/145,640

TECHNIQUES FOR PROVIDING INSIGHTS ACCORDING TO APPLICATION DATA AND BIOMETRIC DATA

Status: Non-Final OA — §103
Filed: Dec 22, 2022
Examiner: NGUYEN, HIEP VAN
Art Unit: 3686
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Oura Health OY
OA Round: 5 (Non-Final)
Grant Probability: 55% (Moderate)
Expected OA Rounds: 5-6
Time to Grant: 4y 2m
With Interview: 84%

Examiner Intelligence

Career Allow Rate: 55% — grants 55% of resolved cases (564 granted / 1025 resolved; +3.0% vs TC avg)
Interview Lift: +29.3% — strong lift in allow rate for resolved cases with an interview
Typical Timeline: 4y 2m average prosecution; 47 applications currently pending
Career History: 1072 total applications across all art units
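The headline figures above follow from simple ratios. A minimal sketch in Python, using the counts reported on this page (the per-interview split in the second part is hypothetical, since the page reports only the lift, not the underlying counts):

```python
# Career allow rate from the dashboard's reported counts.
granted = 564
resolved = 1025
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # -> 55.0%

# Interview lift = allow rate among resolved cases with an interview
# minus the rate among cases without one. The arguments are counts;
# any specific numbers passed in would be hypothetical, since the page
# reports only the resulting lift (+29.3%).
def interview_lift(granted_iv, resolved_iv, granted_no, resolved_no):
    return granted_iv / resolved_iv - granted_no / resolved_no
```

The lift is expressed in percentage points, not as a relative increase, which is why it can be added directly to the baseline rate.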

Statute-Specific Performance

§101: 27.9% (-12.1% vs TC avg)
§103: 46.9% (+6.9% vs TC avg)
§102: 7.3% (-32.7% vs TC avg)
§112: 10.2% (-29.8% vs TC avg)
Tech Center averages are estimates. Based on career data from 1025 resolved cases.
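One way to read the table: subtracting each delta from the examiner's rate recovers the implied Tech Center baseline, which comes out to the same 40.0% for all four statutes — consistent with a single TC-wide average rather than per-statute baselines. A sketch, with the figures copied from the table above:

```python
# Examiner allowance rate and delta vs Tech Center average, per statute,
# copied from the Statute-Specific Performance table.
stats = {
    "§101": (0.279, -0.121),
    "§103": (0.469, +0.069),
    "§102": (0.073, -0.327),
    "§112": (0.102, -0.298),
}
for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta  # implied Tech Center baseline
    print(f"{statute}: examiner {rate:.1%}, implied TC avg {tc_avg:.1%}")
```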

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 1-15 and 17-20 have been examined. Claims 1, 17, and 20 have been amended. Claim 16 has been previously canceled.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-15 and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over James Proud (US 20140247146A1) in view of Youngblood et al. (US20200227160A1, hereinafter Youngblood) and further in view of Krishnapura (US20170124328A1).
With respect to claim 1, Proud teaches a method comprising: acquiring, via one or more optical components of a wearable ring device (‘146; Para 0081: A programming device 22 may be configured to transmit data to a sensor 14, also known as a user monitoring device, utilizing a variety of alternative transmission means, including, for example, RF, IR, optical, and the like, or a magnetic loop/induction system.), biometric data associated with a user, the one or more optical components comprising one or more light-transmitting components and one or more light-receiving components (‘146; Para 0284; Para 0291: light-receiving); wherein the biometric data is associated with health of the user (‘146; Para 0238: The database includes base standards for at least one of user, activities, behaviors, habit information and health information that is indicative of a healthy lifestyle of activities, behaviors, habit information, exercise programs and health condition); receiving, via a transceiver, the biometric data from the wearable ring device (‘146; Para 0111: by disclosure, Proud describes the monitoring device recognizes the wearer based on biometric information, previous data, movement pattern; Para 0158: The user monitoring device 10 can communicate with other devices via an RF transceiver 86, an IRDA transceiver 88, and/or an RF backscatter transceiver 90. Each of the components in the user monitoring device; Para 0205: a user's watch may have a biometric sensor that collects data throughout the day). Youngblood teaches receiving, from a utility or social media application running on a user device, application data associated with the user, wherein the application data is indicative of a previous activity in which the user engaged or an upcoming activity in which the user is to engage (‘160; Para 0122: The mobile application is also operable to manage exchanges between a user and their environment.
In one example, the mobile application notes that the user's commute time is negatively impacting their stress level. In another example, the mobile application notes that interaction with an individual raises their stress level (e.g., toxic relationship). In yet another example, the mobile application is operable to detect a negative impact of social media use on the user. The mobile application advises a user to minimize time on social media due to the negative impact (e.g., measured through stress responses by the EDA and/or heart sensors). The mobile application preferably identifies these exchanges and coaches the user to minimize stress. The mobile application is also operable to identify positive influences. In one example, the mobile application identifies at least one individual that positively impacts a user's stress level. When the user is stressed out, the mobile application suggests that the user contact the at least one individual for support. ); identifying content for the user via a machine learning model trained to identify the content for the user based at least in part on the pattern between the biometric data and the application data, wherein the content corresponds to the respective effectiveness for maintaining or affecting the biometric data to satisfy the one or more biometric data threshold (‘160; Para 0110: The mobile application uses machine learning to identify positive behaviors, negative behaviors, antecedents or causes of positive behaviors, antecedents or causes of negative behaviors, triggers, early or past experiences that impact current behavior, and/or core belief structures and patterns. The mobile application is also operable to use machine learning to identify timing of the positive behaviors, the negative behaviors, the antecedents or causes of positive behaviors, the antecedents or causes of negative behaviors, and/or the triggers. 
The timing is a daily, weekly, monthly, or other interval (e.g., two weeks, six weeks) basis; Para 0099: The stress reduction and sleep promotion system is monitored to determine if there is a change in status of the body sensors (e.g., change in body temperature), the environmental sensors (e.g., change in room temperature), the system components (e.g., change in temperature of mattress pad), or sleep stage of the user. If there is a change in status, the virtual model is updated to reflect the change in status. Predicted values are generated for the stress reduction and sleep promotion system. If a difference between the optimized values and the predicted values is greater than a threshold, a simulation is run on the simulation engine to optimize the stress reduction and sleep promotion system based on the real-time data. The simulation engine uses information including, but not limited to, global historical subjective data, global historical objective data, global historical environmental data, and/or global profile data to determine if a change in parameters is necessary to optimize the stress reduction and sleep promotion system. In one example, the temperature of the mattress pad is lowered to keep a user in Stage N3 sleep for a longer period of time. In another example, the mobile application provides recommendations of an activity to a user.); and causing a graphical user interface of the user device running a wearable application to display the content for the user based at least in part on the pattern between the biometric data and the application data (‘160; Para 0173: The combination module 1222 determines a final inferred user attribute. In one embodiment, the final inferred user attribute is displayed on a display module 1230 (e.g., graphical user interface). In another embodiment, the display module prompts the user to confirm the final inferred user attribute.
In yet another embodiment, the final inferred user attribute is sent to a search module 1232 to allow the final inferred user attribute to be searchable within the phenotype networking system. In still another embodiment, the final inferred user attribute is sent to one or more of the databases 1202 for storage (e.g., in user profile database 1206)). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Proud with the technique of health data platform as taught by Youngblood and the motivation is to generate the recommendation based at least in part on correlating the received biometric data and the received application data. Krishnapura discloses detecting a pattern between the biometric data and the application data, wherein the pattern is associated with a respective effectiveness for regulating the biometric data to satisfy one or more biometric data thresholds (‘328; Para 0013: usage patterns surrounding an authentication attempt can be exploited to make the biometric authentication experience more user-friendly and also more secure; Para 0026: the biometric system monitor software is a background process that interacts with matcher software to gather data and statistics about the authentication attempt and also the ambient information from the system during and post authentication. In some embodiments, the authentication events are associated with time. A time series analysis can be conducted to detect patterns in the authentication. When patterns are detected, they are marked with an associated strength in the pattern.; Para 0053: The authentication attempt is assigned a score by the biometric matcher 220. In some authentication schemes, an authentication attempt is compared to the enrolled biometric template and given a score corresponding to how closely the authentication attempt matches the template.
If the score satisfies a threshold, the authentication attempt is deemed to be successful and authentication is achieved. If the score does not satisfy the threshold, the authentication attempt is unsuccessful and authentication is denied.) It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Proud/Youngblood with the technique of biometric authentication as taught by Krishnapura and the motivation is to generate the pattern for regulating biometric data and the received application data.

Claims 17 and 20 are rejected for the same reasons as claim 1.

With respect to claim 2, the combined art teaches the method of claim 1, Proud discloses further comprising: determining that the biometric data satisfies a biometric threshold, wherein determining the content for the user is further based at least in part on the biometric data satisfying the biometric threshold (‘160; Para 0095: The global analytics engine 754 analyzes differences between the predicted values and optimized values. If the difference between the optimized values and the predicted values is greater than a threshold, then the simulation engine 758 determines optimized values of the monitored stress reduction and sleep promotion system based on the real-time data and user preferences. In one embodiment, the global analytics engine 754 determines whether a change in parameters of the system components 710 is necessary to optimize sleep based on the output of the simulation engine 758).
With respect to claim 3, the combined art teaches the method of claim 1, Proud discloses further comprising: scheduling at least one calendar event based at least in part on the pattern between the biometric data and the application data, the content comprising the at least one scheduled calendar event defined within an application associated with the wearable ring device and the user device, wherein causing the graphical user interface of the user device running the application to display the content for the user is further based at least in part on the at least one calendar event (‘146; Paras 0190-0192; Para 0252; Para 0257, Para 0260).

With respect to claim 4, the combined art teaches the method of claim 3, Proud discloses wherein the at least one calendar event comprises event information indicating an activity in which the user is to engage for regulating the biometric data, timing information indicating a duration of the activity in which the user is to engage for regulating the biometric data, location information indicating a location of the activity in which the user is to engage for regulating the biometric data, or any combination thereof (‘146; Para 0074; Para 0245).

With respect to claim 5, the combined art teaches the method of claim 3, Proud discloses further comprising: causing the graphical user interface of the user device running the application to prompt the user for feedback based at least in part on the at least one scheduled calendar event (‘146; Para 0191: if the activity manager 218 determines that a user has entered a new activity in the calendar 204, the activity manager 218 can prompt the user to determine if the user wants the activity manager 218 to monitor and manage that activity).
With respect to claim 6, the combined art teaches the method of claim 5, Proud discloses wherein the feedback comprises an acceptance of the at least one calendar event or a rejection of the at least one calendar event, and the feedback is selected from a set of feedback responses displayed via the graphical user interface with the prompt (‘146; Paras 0194, 0207).

With respect to claim 7, the combined art teaches the method of claim 5, Proud discloses further comprising: removing the at least one calendar event defined within the application associated with the wearable ring device and the user device based at least in part on the feedback; and adjusting one or more parameters of the machine learning model (‘146; Para 0094: Machine Learning-grade algorithms is used to identify the user's activities, behaviors, behaviors and perform analysis).

With respect to claim 8, the combined art teaches the method of claim 1, Proud discloses further comprising: identifying at least one calendar event based at least in part on the application data, wherein determining the content for the user is further based at least in part on the at least one calendar event, the content comprising a recommendation indicating one or more instructions for the user to maintain a threshold Readiness Score of the user (‘146; Paras 0042, 0159).

With respect to claim 9, the combined art teaches the method of claim 1, Proud discloses further comprising: receiving sensor data from one or both of the wearable ring device or the user device, the sensor data comprising sound data associated with a physical environment, the content comprising a recommendation indicating one or more instructions for the user to reduce a noise level during a duration to maintain a threshold Readiness Score of the user, wherein determining the content for the user is further based at least in part on the sensor data (‘146; Para 0074; Para 0168).
With respect to claim 10, the combined art teaches the method of claim 1, Proud discloses further comprising: receiving user data associated with the user from one or more applications associated with the wearable ring device or the user device, or any combination thereof, the one or more applications comprising a lifestyle application, the social media application, the utility application, an entertainment application, a productivity application, an information outlet application, or any combination thereof; and correlating the user data with the biometric data and the received application data to determine the content for the user, wherein causing the graphical user interface of the user device to display the content for the user is further based at least in part on correlating the user data with the biometric data and the received application data (‘146; Para 0069: lifestyle; Para 0126: social networking).

With respect to claim 11, the combined art teaches the method of claim 1, Proud discloses further comprising: detecting the pattern between the biometric data and the application data (‘146; Para 0093: a heart rate monitor 14 detects the user's heart rate in order to accurately determine the user's activity level, behavioral patterns).
With respect to claim 12, the combined art teaches the method of claim 11, Proud discloses further comprising: determining a recurrency of an activity over a set of occasions comprising a first occasion and a second occasion; and determining that at least one value of the biometric data satisfies a threshold during a third occasion subsequent to the first occasion or the second occasion, wherein determining the content for the user is further based at least in part on the determining that the at least one value of the biometric data satisfies the threshold during the third occasion subsequent to the first occasion or the second occasion (‘146; Para 0092; Para 0093: a heart rate monitor 14 detects the user's heart rate in order to accurately determine the user's activity level, behavioral patterns).

With respect to claim 13, the combined art teaches the method of claim 12, Proud discloses wherein the content comprises a user-specific notification indicating a Readiness Score or a recommendation indicating one or more instructions for the user to increase the Readiness Score of the user (‘146; Para 0272).

With respect to claim 14, the combined art teaches the method of claim 1, Proud discloses wherein the application data comprises event information indicating an event in which the user is to engage, timing information indicating a duration of the event associated with the user, location information indicating a location of the event associated with the user, temperature data associated with the user, or any combination thereof (‘146; Paras 0219, 0300).

With respect to claim 15, the combined art teaches the method of claim 1, Proud discloses wherein the biometric data comprises heart rate data associated with the user, respiratory rate data associated with the user, sleep data associated with the user, activity data associated with the user, or any combination thereof (‘146; Paras 0239-0240).
With respect to claim 18, the combined art teaches the apparatus of claim 17, Proud discloses wherein the instructions are further executable by the processor to cause the apparatus to: determine that the biometric data satisfies a biometric threshold, wherein the instructions to determine the content for the user are further executable by the processor based at least in part on the biometric data satisfying the biometric threshold (‘146; Para 0216).

With respect to claim 19, the combined art teaches the apparatus of claim 17, Proud discloses wherein the instructions are further executable by the processor to cause the apparatus to: schedule at least one calendar event based at least in part on the pattern between the biometric data and the application data, the content comprising the at least one calendar event defined within an application associated with the wearable ring device and the user device, wherein the instructions to cause the graphical user interface of the apparatus running the application to display the content for recommending to the user are further executable by the processor based at least in part on the at least one calendar event (‘146; Para 0257).

Response to Arguments

Applicant's arguments with respect to claim amendments have been considered but are moot because the arguments do not apply to the Krishnapura reference as used in the current rejection.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. US20200367807A1, Nov. 26, 2020; Lassoued et al.; Intelligent monitoring of a health state of a user engaged in operation of a computer device. Any inquiry concerning this communication or earlier communications from the examiner should be directed to HIEP VAN NGUYEN whose telephone number is (571) 270-5211. The examiner can normally be reached Monday through Friday between 8:00AM and 5:00PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jason B Dunham, can be reached at 571-272-8109. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /HIEP V NGUYEN/ Primary Examiner, Art Unit 3686

Prosecution Timeline

Dec 22, 2022: Application Filed
Jul 25, 2024: Non-Final Rejection — §103
Oct 29, 2024: Response Filed
Feb 11, 2025: Final Rejection — §103
Apr 02, 2025: Applicant Interview (Telephonic)
Apr 03, 2025: Examiner Interview Summary
May 02, 2025: Request for Continued Examination
May 06, 2025: Response after Non-Final Action
May 12, 2025: Non-Final Rejection — §103
Jul 24, 2025: Applicant Interview (Telephonic)
Jul 24, 2025: Examiner Interview Summary
Aug 15, 2025: Response Filed
Oct 12, 2025: Final Rejection — §103
Dec 15, 2025: Response after Non-Final Action
Jan 15, 2026: Request for Continued Examination
Feb 17, 2026: Response after Non-Final Action
Feb 17, 2026: Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12592322 — MULTI-MODAL DIGITAL COMMUNICATION ARCHITECTURE FOR PATIENT ENGAGEMENT
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12592323 — TARGETED GENERATION OF MESSAGES FOR DIGITAL THERAPEUTICS USING GENERATIVE TRANSFORMER MODELS
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12580067 — SYSTEM AND METHOD FOR DISPENSING A CUSTOMIZED NUTRACEUTICAL PRODUCT
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12573478 — SYSTEM AND METHOD FOR COMMUNICATING MEDICAL DATA
Granted Mar 10, 2026 (2y 5m to grant)

Patent 12541784 — ARTIFICIAL INTELLIGENCE BASED SYSTEM AND METHODS FOR PREDICTING SKIN ANALYTICS OF INDIVIDUALS
Granted Feb 03, 2026 (2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 55%
With Interview: 84% (+29.3%)
Median Time to Grant: 4y 2m
PTA Risk: High
Based on 1025 resolved cases by this examiner. Grant probability derived from career allow rate.
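The with-interview figure is consistent with treating the interview lift as additive percentage points on top of the baseline allow rate (an assumption; the page does not state the formula):

```python
baseline = 0.55   # career allow rate, used as the baseline grant probability
lift = 0.293      # interview lift, in percentage points
with_interview = baseline + lift
print(f"With interview: {with_interview:.0%}")  # -> 84%
```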
