DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of the Claims
The status of the claims as of the response filed 2/16/2026 is as follows: Claims 1, 15, and 20 are currently amended. Claims 2-14 and 16-19 are original. Claims 1-20 are currently pending in the application and have been considered below.
Response to Amendment
Double Patenting Rejection
The independent claims have been amended such that the scope sufficiently differs from that of the claims of copending application 18/732,830, and thus the corresponding double patenting rejections are withdrawn.
Rejection Under 35 USC 101
The claims have been amended, but the rejections of claims 1-20 under 35 U.S.C. 101 are maintained.
Rejection Under 35 USC 103
The amendments made to the claims introduce limitations that are not fully addressed in the previous office action (e.g. wherein training the machine learning model comprises applying a failure time and a censor variable to a time to event loss function that predicts probability of survival/failure within a specified timeframe), and thus the corresponding 35 USC 103 rejections are withdrawn. However, Examiner will consider the amended claims in light of an updated prior art search and address their patentability with respect to prior art below.
Response to Arguments
Rejection Under 35 USC 101
On pages 7-8 of the response filed 2/16/2026, Applicant argues that the claims are not directed to certain methods of organizing human activity and instead use “specific operations to address a technical challenge in training machine learning models to predict an onset of a physiological event.” Applicant further analogizes the instant claims to those found eligible in Examples 37 and 39, asserting that they similarly “require machine-only actions” and “relate to training neural networks.” Applicant’s arguments have been fully considered but are not persuasive. Examiner respectfully disagrees that the claims do not recite an abstract idea; as presently drafted, the claims recite methods of accessing data and using the data to train/fit a predictive model using certain mathematical functions and variables, which fall into both the “mathematical concepts” and “certain methods of organizing human activity” groupings of abstract ideas. For example, training a predictive model by applying a failure time and a censor variable to a time to event loss function that predicts probabilities of survival/failure within a specified timeframe recites a mathematical concept, while accessing data and using a fitted/trained predictive model to make predictions describe steps that a human actor could take to manage their personal behavior, e.g. a clinician making risk calculations using a trained model by observing readouts of a patient’s ECG sensor.
The claims do not have the same fact patterns as the claims found eligible in Examples 37 and 39. The eligible claim in Example 37 utilized a processor to access computer memory indicative of application usage and rearrange positions of icons on a GUI in accordance with usage determinations. In contrast, the instant claims merely utilize training data to train a predictive model, steps that describe both mathematical concepts and actions that a human actor such as a clinician or researcher could take when managing their personal behavior. Though the steps are performed with computing components like a processor, they are not inherently “machine-only actions,” and the computing components merely act as tools with which to digitize and/or automate the otherwise-abstract functions, such that they do not confer eligibility.
The eligible claim in Example 39 was directed to a method of training a neural network for facial detection that involved the collection and transformation of digital facial images to create two training sets for the neural network trained in two stages. Though this claim was not found to recite an abstract idea at all, it differs from the claims of the instant application due to the nature of the collected and transformed data as well as the increased specificity of the resulting trained model. That is, the claim of Example 39 recited the collection and transformation of digital facial images to create training sets, which could not reasonably be accomplished by a human actor mentally or by following instructions for managing personal behavior, whereas the instant claim recites training a model with physiological data, a physiological status indicator, a failure time, and a censor variable, which are data types that a human actor would be capable of collecting and utilizing to fit/train a predictive model. Further, Example 39 is directed to the training of specifically a neural network using two distinct training sets in two separate training stages, whereas the instant claim merely recites training a “machine learning model,” which could encompass a wide array of model architectures, training techniques, methods of iterating or tuning the system, etc. Finally, Example 39 was directed only to a method of training a neural network via the creation and use of training datasets, and did not include further steps that could be characterized as aspects of a certain method of organizing human activity. In contrast, the instant claims do include steps that can be reasonably characterized as mathematical concepts and/or certain methods of organizing human activity (as explained above, and in the updated 35 USC 101 rejections below). Accordingly, the claim of Example 39 does not recite an abstract idea, while the independent claims of the instant application do.
On pages 8-9, Applicant argues that the instant claims “improve the technological process of a machine learning based physiological event prediction process through an improved machine learning process, as opposed to merely reciting an abstract process, and improve the accuracy of machine learning models performing the compliance processing.” Applicant further alleges similarities with claims found eligible in McRO, Ex parte Carmody, and Ex parte Desjardins. Applicant’s arguments have been fully considered but are not persuasive. Examiner respectfully disagrees that the instant claims provide improvements to the technical field of machine learning or to methods of training machine learning models themselves; instead, they appear to apply known machine learning techniques to the clinical field of cardiac arrhythmia prediction. The steps listed on page 8 of the response as providing the technical improvement to machine learning are instead part of the abstract idea itself, because steps for accessing training data and physiological status indicators, training a model using the training data and physiological status indicators, and applying a failure time and censor variable to a time to event loss function that predicts probabilities of survival/failure within a timeframe describe mathematical operations and steps that a human actor could take to manage their personal behavior for the purpose of training/fitting and using a clinical predictive model. Because these functions are part of the abstract idea itself, they do not integrate the abstract idea into a practical application and thus do not confer eligibility (see MPEP 2106.05(a): “It is important to note, the judicial exception alone cannot provide the improvement. The improvement can be provided by one or more additional elements.” See also MPEP 2106.05(a)(II): “it is important to keep in mind that an improvement in the abstract idea itself… is not an improvement in technology”).
The claims do not have the same fact patterns as the claims found eligible in McRO, Carmody, and Desjardins. In McRO, the claimed invention recited a very specific set of rules that allowed a computer to perform animation in a manner previously performable only by human animators. The very fact that the animation could not previously be performed by computers, and that the rules applied by the claimed invention solved this problem, was the reason the claimed invention in McRO was found not to be directed to an abstract idea: it improved an existing technological process. Here, there is no evidence of record establishing that the claimed invention performs a process that was previously performable only by humans, as in McRO. The claimed invention thus does not provide an analogous technological improvement, and is instead directed to the abstract idea of fitting and using predictive clinical models to determine onset of clinical events, which is not understood to be a technological problem because it describes mathematical concepts and data analysis operations in the business field of medical diagnostics.
Regarding Applicant’s arguments with respect to Carmody and Desjardins, Examiner notes that these cases both described specific technical improvements to methods of training machine learning models that resulted in technical advantages described in their respective specifications. For example, in Desjardins, the claims reflected an improvement to how a machine learning model itself is trained and operates to address the technical problem of ‘catastrophic forgetting’ encountered in continual learning systems, which was identified and explained as a technical problem in the specification. In contrast, the instant specification does not outline a specific technical problem in machine learning technology whose solution is reflected in the claims. As noted above, the use of high-level computerized machine learning techniques to digitize and/or automate the prediction of clinical events from physiological data does not provide a technical solution to a technical problem (as in Desjardins and Carmody), and instead amounts to instructions to “apply” the exception with a computer.
For the reasons outlined above, the 35 USC 101 rejections are maintained for claims 1-20.
Rejection Under 35 USC 103
Applicant’s arguments on pages 9-11 directed to alleged deficiencies of the cited prior art references with respect to the newly-introduced limitation of independent claims 1, 15, and 20 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 2/16/2026 is in accordance with the provisions of 37 CFR 1.97 and is considered by the Examiner.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1
In the instant case, claims 1-14 are directed to a device (i.e. a machine), claims 15-19 are directed to a method (i.e. a process), and claim 20 is directed to a non-transitory computer storage medium (i.e. a manufacture). Thus, each of the claims falls within one of the four statutory categories. Nevertheless, the claims fall within the judicial exception of an abstract idea.
Step 2A – Prong 1
Independent claims 1, 15, and 20 recite steps that, under their broadest reasonable interpretations, cover certain methods of organizing human activity, e.g. managing personal behavior, relationships, or interactions between people, as well as mathematical concepts. Specifically, claim 1 (as representative) recites:
An electronic device for monitoring physiological signals of a user, the electronic device comprising:
an adhesive assembly comprising a housing that encloses a circuit board;
a sensor in electrical communication with the circuit board and configured to be positioned in conformal contact with the surface of the user to detect the physiological signals of the user; and
a hardware processor configured to apply the physiological signals to a machine learning model, wherein the machine learning model is configured to predict an onset of a physiological event based on the physiological signals of the user, and wherein the machine learning model is trained by:
accessing training data for a plurality of users separate from the user, wherein the training data comprises physiological data of the plurality of users;
for each user of the plurality of users, accessing a physiological status indicator, wherein the physiological status indicator is determined based on the training data; and
training the machine learning model based on the training data and the physiological status indicator to predict the onset of the physiological event, wherein training the machine learning model comprises applying a failure time and a censor variable to a time to event loss function that predicts probabilities of survival/failure within a specified timeframe.
But for the recitation of generic computer components like a hardware processor and a high-level machine learning model, the recited functions, when considered as a whole, describe a clinical event detection and model fitting operation that could be achieved via mathematical operations and by a human actor such as a clinician or other medical professional managing their personal behavior and/or interactions with others. For example, a clinician could look at sensor readouts of physiological signals for a patient and apply the signals to a predictive model fitted with labelled data from other patients to predict an onset of a physiological event. The application of a failure time and a censor variable to a time to event loss function that predicts probabilities of survival/failure within a specified timeframe describes mathematical processes for training the model. Accordingly, claim 1 recites an abstract idea in the form of mathematical concepts and a certain method of organizing human activity. Claims 15 and 20 recite substantially similar subject matter as claim 1 and are found to recite an abstract idea under the same analysis.
Dependent claims 2-14 and 16-19 inherit the limitations that recite an abstract idea from their dependence on claims 1 and 15, respectively, and thus these claims also recite an abstract idea under the Step 2A – Prong 1 analysis. In addition, claims 2-14 and 16-19 recite additional limitations that further describe the abstract idea identified in the independent claims.
Specifically, claims 2 and 16 specify that the physiological signals do not indicate an ongoing occurrence of the physiological event, which a clinician would be capable of ascertaining by looking at the physiological signals of a patient not currently undergoing an acute physiological event.
Claims 3 and 16 specify that predicting the onset of the physiological event comprises predicting a future occurrence of the physiological event, which a clinician could achieve by looking at the physiological signals of a patient not currently undergoing an acute physiological event but indicative of the potential for a future acute event.
Claims 4 and 17 specify that predicting the onset of the physiological event comprises determining a risk group within a risk stratification, which a clinician could achieve by using their medical expertise to classify a patient into a high, medium, or low risk group based on their physiological signals.
Claim 17 further recites predicting a future occurrence of the physiological event based at least in part on the risk group, which a clinician could achieve by considering the risk group when predicting a future event (e.g. finding a higher probability of a future event if the patient is in a high risk group).
Claim 5 specifies that determining the risk group comprises determining treatment guidance associated with the risk group, which a clinician could achieve by using their medical expertise to recommend a treatment type for each group, e.g. continued monitoring for a low risk group, preventative medication for a medium risk group, and surgical intervention for a high risk group.
Claim 6 recites generating a risk score corresponding to a risk of an occurrence of the physiological event at a future time, which is a mathematical operation; such an operation could also be achieved by a clinician using their medical expertise to calculate a risk score.
Claim 7 recites transmitting an alert to the user or a healthcare provider in response to the risk score satisfying a threshold score, which a clinician could achieve by comparing the calculated risk score to a threshold and then communicating an alert (e.g. verbally, in writing, etc.) to the patient or a colleague if the risk score exceeds the threshold.
Claims 8-9 and 18 recite adjusting a function based on a prediction of the onset of the physiological event, e.g. adjusting a window of physiological signals used to predict the onset of the physiological event. A clinician could achieve these functions by making a note to increase or decrease the time duration of patient data evaluated for making event predictions.
Claim 10 recites outputting a recommendation of a treatment or intervention based on the predicted onset of the event, which a clinician could achieve by using their medical expertise to recommend a treatment for a patient.
Claims 11 and 19 recite identifying a patient cluster of a plurality of patient clusters based on similarity of a physiological characteristic and selecting the treatment recommendation based on the patient cluster, which a clinician could achieve by evaluating several potential patient clusters each associated with certain characteristics or traits and corresponding treatment recommendations, and selecting the cluster most similar to the current patient’s physiological characteristics so that the corresponding treatment recommendation of that cluster may be selected.
Claim 12 recites that each patient cluster corresponds to a centroid of a plurality of centroids and that the recommendation is selected based in part on a distance in latent space between a representation of the user and corresponding centroid of the identified patient cluster, which a clinician could achieve by utilizing mathematical graphing and distance concepts to identify the similar patient cluster and relevant corresponding treatment recommendation.
Claim 13 specifies that the physiological characteristic comprises one or more of a status of cardiac arrhythmia, an intervention type, or an outcome corresponding to an instance of an intervention of the intervention type. A clinician would be capable of evaluating these types of patient characteristics to identify an appropriate cluster of patients with similar characteristics as in claim 11.
Claim 14 specifies that the event is cardiac arrhythmia, which is a type of event that a clinician would be capable of assessing or predicting from physiological signal data.
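For illustration of the centroid-based cluster selection described with respect to claims 11-12, a minimal sketch follows. This is an illustrative example only, not a characterization of Applicant’s disclosed implementation; the latent-space dimensionality, embedding values, and recommendation labels are hypothetical:

```python
import numpy as np

def select_treatment_recommendation(user_embedding, centroids, recommendations):
    """Pick the recommendation tied to the nearest cluster centroid.

    user_embedding: latent-space representation of the user, shape (d,)
    centroids: one centroid per patient cluster, shape (k, d)
    recommendations: list of k treatment recommendations, one per cluster
    """
    # Euclidean distance in latent space from the user to each centroid
    distances = np.linalg.norm(centroids - user_embedding, axis=1)
    nearest = int(np.argmin(distances))  # index of the closest cluster
    return recommendations[nearest], distances[nearest]

# Hypothetical 2-D latent space with three patient clusters
centroids = np.array([[0.0, 0.0], [5.0, 5.0], [10.0, 0.0]])
recs = ["continued monitoring", "preventative medication", "surgical intervention"]
rec, dist = select_treatment_recommendation(np.array([4.2, 5.1]), centroids, recs)
# rec == "preventative medication" (the user embedding is closest to [5.0, 5.0])
```

As the sketch shows, the claimed selection reduces to a nearest-centroid lookup: the recommendation follows directly from whichever cluster centroid lies at the smallest latent-space distance from the user’s representation.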
However, recitation of an abstract idea is not the end of the analysis. Each of the claims must be analyzed for additional elements that indicate the abstract idea is integrated into a practical application to determine whether the claim is considered to be “directed to” an abstract idea.
Step 2A – Prong 2
The judicial exception is not integrated into a practical application. In particular, independent claims 1, 15, and 20 do not include additional elements that integrate the abstract idea into a practical application. The additional elements of claim 1 include an electronic device comprising an adhesive assembly comprising a housing that encloses a circuit board, a sensor in electrical communication with the circuit board and configured to be positioned in conformal contact with the surface of the user to detect the physiological signals of the user, and a hardware processor configured to perform the various steps of the invention, as well as specifying that the model is a machine learning model. Claims 15 and 20 recite substantially similar additional elements. The adhesive assembly and sensor additional elements amount to insignificant extra-solution activity in the form of data gathering because they merely serve as means of obtaining the physiological signals needed for the main data analysis steps of the invention (see MPEP 2106.05(g)). The use of a hardware processor (e.g. executing computer-executable instructions stored in a non-transitory computer storage medium as in claim 20) to perform the various steps of the invention amounts to instructions to “apply” the abstract idea using generic computer components because this element merely serves as a tool with which the otherwise-abstract functions of applying signals to a model, accessing various types of data, and training a model are digitized and/or automated (see MPEP 2106.05(f)). Similarly, specifying that the model is a machine learning model merely invokes this high-level type of computerized element as a tool with which to automate and/or digitize the otherwise-abstract functions of mathematical model fitting and execution to analyze data. Accordingly, claims 1, 15, and 20 as a whole are each directed to an abstract idea without integration into a practical application.
The judicial exception recited in dependent claims 2-14 and 16-19 is also not integrated into a practical application under a similar analysis as above. Claims 2-6, 10-14, and 16-19 merely further describe the abstract idea of the independent claims without introducing any new additional elements of their own, and thus do not provide integration into a practical application. Claim 7 recites that the hardware processor transmits an alert to a device of the user or a healthcare provider, which merely digitizes/automates the otherwise-abstract function of sharing data between human actors such that this element also amounts to instructions to “apply” the exception in a computer environment. Claims 8-9 specify that the adjusted function is a function of the electronic device, which again merely utilizes a high-level digital device to “apply” the otherwise-abstract function of adjusting data processing windows, with no details about how such an adjustment is actually facilitated or controlled with any specific components of the electronic device.
Accordingly, the additional elements of claims 1-20 do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Claims 1-20 are directed to an abstract idea.
Step 2B
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of an electronic device with a processor executing instructions to perform the applying, predicting, accessing, training, and machine learning model aspects of the invention amount to mere instructions to apply the exception using generic computer components. As evidence of the generic nature of the above recited additional elements, Examiner notes paras. [0155]-[0156] & [0161] of Applicant’s specification, where the computer elements (e.g. microprocessor) are described in terms of known, conventional components and device types (e.g. server, laptop computer, mobile device, etc.). See also paras. [0157]-[0158], noting various generic types of I/O devices for interaction with a user (e.g. encompassing a device of a user or healthcare provider as in claim 7) such as a touchpad or touchscreen, display device, GUI, etc.
Regarding the machine learning aspect of the event detection model, Examiner notes that it is well-understood, routine, and conventional to utilize machine learning models like neural networks for the purpose of physiological event detection, as evidenced by at least Thakur et al. (US 20200297230 A1) paras. [0078]-[0079]; Boleyn et al. (US 20190090769 A1) paras. [0025]-[0026] & [0106]; and Fornwalt et al. (US 20210076960 A1) paras. [0003] & [0180].
Regarding the adhesive assembly and sensor elements of the electronic device, as noted above, these additional elements amount to insignificant extra-solution activity in the form of a means of data gathering. Examiner notes that use of an adhesive assembly comprising a housing that encloses a circuit board and a sensor in electrical communication with the circuit board and configured to be positioned in conformal contact with the surface of the user to detect physiological signals is well-understood, routine, and conventional in the art, as evidenced by at least Hughes et al. (US 20160120433 A1) paras. [0007]-[0008] & [0084]-[0085]; Thakur paras. [0048]-[0050]; and Boleyn paras. [0007]-[0008] & [0043].
Further, the combination of these additional elements is not expanded upon in the specification as a unique arrangement; rather, one of ordinary skill in the art would understand the combination of components as a well-known and generic combination for automating an abstract idea that could otherwise be performed as a certain method of organizing human activity, and the combination thus does not provide an inventive concept. Additionally, the combination of adhesive sensor devices, computer processing hardware, and machine learning models to achieve physiological event monitoring and prediction is well-understood, routine, and conventional, as evidenced by at least Thakur Fig. 2 and paras. [0048]-[0050] & [0078]-[0079]; Boleyn Fig. 3 and paras. [0025]-[0026], [0043], & [0106]; and Fornwalt abstract and paras. [0153] & [0180].
Thus, when considered as a whole and in combination, claims 1-20 are not patent eligible.
Claim Rejections - 35 USC § 103
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-6, 8-10, 14-18, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Thakur et al. (US 20200297230 A1) in view of Gensheimer et al. (Reference V on the accompanying PTO-892).
Claim 1
Thakur teaches an electronic device for monitoring physiological signals of a user (Thakur Fig. 2, [0061]-[0062], noting arrhythmic risk stratification system monitors a user’s physiological signals to predict onset of a cardiac arrhythmia), the electronic device comprising:
an adhesive assembly comprising a housing that encloses a circuit board (Thakur [0050], noting the ambulatory medical device includes a hermetically sealed housing that contains various circuits; the AMD may be a patch-based device as noted in [0048], indicating that it may be embodied with an adhesive (i.e. patch-based) assembly);
a sensor in electrical communication with the circuit board and configured to be positioned in conformal contact with the surface of the user to detect the physiological signals of the user (Thakur [0048]-[0050], [0066], noting the ambulatory medical device includes sensor devices in communication with the various circuits, e.g. a cardiac sensor that senses cardiac information from electrodes positioned on a patient’s body surface); and
a hardware processor configured to apply the physiological signals to a machine learning model, wherein the machine learning model is configured to predict an onset of a physiological event based on the physiological signals of the user (Thakur [0069], [0078]-[0079], noting arrhythmic risk stratifier is embodied as part of a microprocessor circuit or other processor hardware and uses a machine learning model to determine an arrhythmia risk indication (e.g. onset timing) based on input physiologic information), and wherein the machine learning model is trained by:
accessing training data for a plurality of users separate from the user, wherein the training data comprises physiological data of the plurality of users (Thakur [0078], noting the machine learning model is trained using sensor data from a patient population (i.e. training data comprising physiological data of a plurality of users separate from the user));
training the machine learning model based on the training data (Thakur [0078], noting the machine learning model is trained using sensor data from a patient population).
In summary, Thakur teaches a system for applying non-invasively collected physiological data to a machine learning model that has been trained from patient population data to predict future onset of a physiological event such as a cardiac arrhythmia. The machine learning model can take many forms, including linear regression, decision tree, Naïve Bayes, support vector machine, neural network, etc. (see [0078]). However, this reference fails to explicitly disclose that the machine learning model training process includes, for each user of the plurality of users, accessing a physiological status indicator, wherein the physiological status indicator is determined based on the training data; training the machine learning model based on the training data and the physiological status indicator; and wherein training the machine learning model comprises applying a failure time and a censor variable to a time to event loss function that predicts probabilities of survival/failure within a specified timeframe, as required by the instant claim. Gensheimer, however, teaches a specific method of training a clinical prediction machine learning model such as a neural network that includes accessing training data of a plurality of patients that each have known outcome indicators including failure time and censor variables (also considered equivalent to physiological status indicators) and applying the known outcome indicators in the training data to a loss function to train the model to predict probabilities of survival or failure within a specified timeframe (Gensheimer abstract & “Implementation” section on Pg 5).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the machine learning model training method of Thakur to include use of known physiological status indicators like failure time and censor variables in a loss function for training the model to predict probabilities of survival/failure in a time period, as in Gensheimer, in order to utilize a modeling approach that avoids information loss when training the model and enables the generation of predicted survival curves (as suggested by Gensheimer abstract & “Implementation” section on Pg 5).
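For context on the type of loss function at issue in the newly added limitation, a minimal sketch of a generic discrete-time survival loss follows. This is an illustrative example only, not Gensheimer’s exact implementation and not a characterization of the claimed training method; the function name, interval granularity, and hazard values are hypothetical:

```python
import numpy as np

def discrete_time_survival_loss(hazards, event_interval, censored):
    """Negative log-likelihood for one subject in a discrete-time survival model.

    hazards: predicted conditional hazard per time interval, shape (n_intervals,)
    event_interval: index of the interval containing the failure time
                    (for a censored subject, the last interval observed)
    censored: True if the subject left observation without the event occurring
    """
    eps = 1e-7  # numerical floor to keep log() finite
    h = np.clip(hazards, eps, 1.0 - eps)
    # Log-probability of surviving every interval before the failure/censoring time
    log_surv = np.sum(np.log(1.0 - h[:event_interval]))
    if censored:
        # Censored subject contributes only survival through the observed intervals
        return -log_surv
    # Uncensored subject survived the prior intervals, then failed in this one
    return -(log_surv + np.log(h[event_interval]))

# Hypothetical hazards for three intervals; loss for an uncensored failure in interval 1
loss_uncensored = discrete_time_survival_loss(np.array([0.1, 0.2, 0.3]), 1, False)
# Loss for a subject censored after surviving the first two intervals
loss_censored = discrete_time_survival_loss(np.array([0.1, 0.2, 0.3]), 2, True)
```

In a model of this kind, the predicted probability of survival within a specified timeframe is the product of (1 - hazard) over the intervals in that timeframe, and the censor variable determines which terms of the likelihood a subject contributes, which is how censored subjects are retained rather than discarded during training.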
Claim 2
Thakur in view of Gensheimer teaches the electronic device of claim 1, and the combination further teaches wherein the physiological signals do not indicate an ongoing occurrence of the physiological event (Thakur [0043], [0079], noting the arrhythmia is predicted for a future onset rather than being currently indicated in the physiological signals).
Claim 3
Thakur in view of Gensheimer teaches the electronic device of claim 2, and the combination further teaches wherein predicting the onset of the physiological event comprises predicting a future occurrence of the physiological event (Thakur abstract, [0079], noting prediction of future onset of an arrhythmia).
Claim 4
Thakur in view of Gensheimer teaches the electronic device of claim 1, and the combination further teaches wherein predicting the onset of the physiological event comprises determining a risk group within a risk stratification (Thakur [0079], noting the arrhythmia risk indication can stratify a patient into categorical risk groups such as high, medium, or low).
Claim 5
Thakur in view of Gensheimer teaches the electronic device of claim 4, and the combination further teaches wherein determining the risk group comprises determining treatment guidance associated with the risk group (Thakur [0083], noting the system can deliver a therapy (i.e. determine treatment guidance) based on the user’s risk indication being categorized as high risk).
Claim 6
Thakur in view of Gensheimer teaches the electronic device of claim 1, and the combination further teaches wherein predicting the onset of the physiological event comprises generating a risk score corresponding to a risk of an occurrence of the physiological event at a time subsequent to detection by the sensor of the physiological signals of the user (Thakur [0076], [0079], noting the arrhythmia risk indication includes a risk score indicating the risk of the patient developing an arrhythmia in the future).
Claim 8
Thakur in view of Gensheimer teaches the electronic device of claim 1, and the combination further teaches wherein the hardware processor is further configured to adjust a function of the electronic device based on a prediction of the onset of the physiological event (Thakur [0086], noting the system can tune or adjust aspects of the sensors such as sensor data acquisition time, schedule, frequency, duration, sampling rate, etc. based on a current estimate of arrhythmia risk for a patient).
Claim 9
Thakur in view of Gensheimer teaches the electronic device of claim 8, and the combination further teaches wherein adjusting the function of the electronic device comprises adjusting a window of physiological signals being processed by the electronic device to predict the onset of the physiological event (Thakur [0086], noting the system can tune or adjust aspects of the sensors such as sensor data acquisition time, schedule, frequency, duration (i.e. window), sampling rate, etc. based on a current estimate of arrhythmia risk for a patient).
Claim 10
Thakur in view of Gensheimer teaches the electronic device of claim 1, and the combination further teaches wherein the hardware processor is further configured to output a recommendation of a treatment or an intervention based on a prediction of the onset of the physiological event (Thakur [0082], [0099], noting the system can generate recommendations for treatments or interventions such as more aggressive arrhythmia monitoring, further testing to be performed, initiating or adjusting patient medication or other types of treatment, etc. based on the patient risk).
Claim 14
Thakur in view of Gensheimer teaches the electronic device of claim 1, and the combination further teaches wherein the physiological event comprises cardiac arrhythmia (Thakur abstract).
Claim 15
Thakur teaches a method comprising:
detecting physiological signals of a user using a sensor of a physiological signal monitor, wherein the sensor is configured to be placed in conformal contact with a surface of the user (Thakur [0048]-[0050], [0066], noting the ambulatory medical device includes patch-based or wearable sensor devices that collect physiological signals of a patient, e.g. a cardiac sensor that senses cardiac information from electrodes positioned on a patient’s body surface); and
applying the physiological signals detected by the sensor to a machine learning model, wherein the machine learning model is configured to predict an onset of a physiological event based on the physiological signals of the user (Thakur [0069], [0078]-[0079], noting arrhythmic risk stratifier uses a machine learning model to determine an arrhythmia risk indication (e.g. onset timing) based on input physiologic information), wherein the machine learning model is trained by:
accessing training data for a plurality of users, wherein the training data comprises historical physiological data of the plurality of users (Thakur [0078], noting the machine learning model is trained using sensor data from a patient population (i.e. training data comprising historical physiological data of a plurality of users));
training the machine learning model based on the training data (Thakur [0078], noting the machine learning model is trained using sensor data from a patient population).
In summary, Thakur teaches a method for applying non-invasively collected physiological data to a machine learning model that has been trained from patient population data to predict future onset of a physiological event such as a cardiac arrhythmia. However, this reference fails to explicitly disclose that the machine learning model training process includes: for each user of the plurality of users, accessing a physiological status indicator, wherein the physiological status indicator is determined based on the training data; training the machine learning model based on the training data and the physiological status indicator; and wherein training the machine learning model comprises applying a failure time and a censor variable to a time to event loss function that predicts probabilities of survival/failure within a specified timeframe, as required by the instant claim. Gensheimer, however, teaches a specific method of training a clinical prediction machine learning model, such as a neural network, that includes accessing training data of a plurality of patients that each have known outcome indicators including failure time and censor variables (also considered equivalent to physiological status indicators) and applying the known outcome indicators in the training data to a loss function to train the model to predict probabilities of survival or failure within a specified timeframe (Gensheimer abstract & “Implementation” section on Pg 5).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the machine learning model training method of Thakur to include use of known physiological status indicators, such as failure time and censor variables, in a loss function for training the model to predict probabilities of survival/failure in a time period, as in Gensheimer, in order to utilize a modeling approach that avoids information loss when training the model and enables the generation of predicted survival curves (as suggested by Gensheimer abstract & “Implementation” section on Pg 5).
Claim 16
Thakur in view of Gensheimer teaches the method of claim 15, and the combination further teaches wherein the physiological signals do not indicate an ongoing occurrence of the physiological event, and wherein the method further comprises predicting a future occurrence of the physiological event based on the physiological signals of the user (Thakur abstract, [0043], [0079], noting the arrhythmia is predicted for a future onset rather than being currently indicated in the physiological signals).
Claim 17
Thakur in view of Gensheimer teaches the method of claim 15, and the combination further teaches determining a risk group of the user within a risk stratification; and predicting a future occurrence of the physiological event based at least in part on the risk group (Thakur [0079], noting the arrhythmia risk indication can stratify a patient into categorical risk groups such as high, medium, or low and predict corresponding onset timing or timeframe of a future cardiac event).
Claim 18
Thakur in view of Gensheimer teaches the method of claim 15, and the combination further teaches modifying a monitoring window of the physiological signal monitor based on a prediction of the onset of the physiological event (Thakur [0086], noting the system can tune or adjust aspects of the sensors such as sensor data acquisition time, schedule, frequency, duration (i.e. window), sampling rate, etc. based on a current estimate of arrhythmia risk for a patient).
Claim 20
Thakur teaches a non-transitory computer storage medium storing computer-executable instructions that, when executed by a processor, cause the processor to perform operations (Thakur [0069]-[0070], noting arrhythmic risk stratifier is embodied with processing hardware that can carry out operations encoded in a computer readable medium) comprising:
detecting physiological signals of a user using a sensor of a physiological signal monitor, wherein the sensor is configured to be placed in conformal contact with a surface of the user (Thakur [0048]-[0050], [0066], noting the ambulatory medical device includes patch-based or wearable sensor devices that collect physiological signals of a patient, e.g. a cardiac sensor that senses cardiac information from electrodes positioned on a patient’s body surface); and
applying the physiological signals detected by the sensor to a machine learning model, wherein the machine learning model is configured to predict an onset of a physiological event based on the physiological signals of the user (Thakur [0069], [0078]-[0079], noting arrhythmic risk stratifier uses a machine learning model to determine an arrhythmia risk indication (e.g. onset timing) based on input physiologic information), wherein the machine learning model is trained by:
accessing training data for a plurality of users, wherein the training data comprises historical physiological data of the plurality of users (Thakur [0078], noting the machine learning model is trained using sensor data from a patient population (i.e. training data comprising historical physiological data of a plurality of users));
training the machine learning model based on the training data (Thakur [0078], noting the machine learning model is trained using sensor data from a patient population).
In summary, Thakur teaches a system for applying non-invasively collected physiological data to a machine learning model that has been trained from patient population data to predict future onset of a physiological event such as a cardiac arrhythmia. However, this reference fails to explicitly disclose that the machine learning model training process includes: for each user of the plurality of users, accessing a physiological status indicator, wherein the physiological status indicator is determined based on the training data; training the machine learning model based on the training data and the physiological status indicator; and wherein training the machine learning model comprises applying a failure time and a censor variable to a time to event loss function that predicts probabilities of survival/failure within a specified timeframe, as required by the instant claim. Gensheimer, however, teaches a specific method of training a clinical prediction machine learning model, such as a neural network, that includes accessing training data of a plurality of patients that each have known outcome indicators including failure time and censor variables (also considered equivalent to physiological status indicators) and applying the known outcome indicators in the training data to a loss function to train the model to predict probabilities of survival or failure within a specified timeframe (Gensheimer abstract & “Implementation” section on Pg 5).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the machine learning model training method of Thakur to include use of known physiological status indicators, such as failure time and censor variables, in a loss function for training the model to predict probabilities of survival/failure in a time period, as in Gensheimer, in order to utilize a modeling approach that avoids information loss when training the model and enables the generation of predicted survival curves (as suggested by Gensheimer abstract & “Implementation” section on Pg 5).
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Thakur and Gensheimer as applied to claims 1 and 6 above, and further in view of Fornwalt et al. (US 20210076960 A1).
Claim 7
Thakur in view of Gensheimer teaches the electronic device of claim 6, and the combination further teaches wherein the hardware processor is further configured to transmit an alert to a device of a healthcare provider (Thakur [0045], [0056], [0082], claim 10, noting the system may generate an alert to notify a system user (e.g. a clinician or other healthcare personnel) at an output device about the risk or predicted future event).
In summary, the present combination teaches transmitting an alert to a device of a healthcare provider based on the generated risk or predicted event. However, the present combination fails to explicitly disclose that the alert is transmitted specifically in response to the risk score satisfying a threshold score. However, Fornwalt teaches generating a report (equivalent to an alert) for output at medical personnel user devices specifically responsive to the risk score exceeding a predetermined threshold (Fornwalt [0014], [0165]). It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the alerting function of the combination such that it occurs specifically responsive to the risk score satisfying a threshold as in Fornwalt in order to alert a clinician only in cases where the risk score indicates that the patient will suffer from an event within a predetermined time period (as suggested by Fornwalt [0165]), thereby facilitating immediate intervention and reducing unnecessary alerts.
Claims 11-13 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Thakur and Gensheimer as applied to claims 1 and 10 or 15 above, and further in view of Van Berkel et al. (US 20220084669 A1).
Claim 11
Thakur in view of Gensheimer teaches the electronic device of claim 10, but the combination fails to explicitly disclose wherein the hardware processor is further configured to identify a patient cluster based at least in part on a similarity of a physiological characteristic between the user and patients of the patient cluster, wherein the patient cluster is one of a plurality of patient clusters, and wherein the recommendation of the treatment or the intervention is selected based in part on the patient cluster.
However, Van Berkel teaches selecting an appropriate treatment for a user by: identifying a patient cluster based at least in part on a similarity of a physiological characteristic between the user and patients of the patient cluster, wherein the patient cluster is one of a plurality of patient clusters (Van Berkel [0060], noting the system selects the most similar virtual patient for a new patient (i.e. user), where a virtual patient represents a near-homogeneous cluster of patients grouped based on clinical, demographic, socioeconomic, utilization, etc. features/characteristics as noted in [0052]-[0054] & [0062]; thus, selecting a most similar virtual patient out of a plurality of virtual patients is considered equivalent to identifying a patient cluster of a plurality of patient clusters based at least in part on a similarity of a physiological characteristic or feature of the new patient (user) and the virtual patient (cluster)); and wherein the recommendation of the treatment or the intervention is selected based in part on the patient cluster (Van Berkel [0060], noting once the system has selected the most similar virtual patient (i.e. cluster) for a given new patient (i.e. user), it selects a care plan for the new patient based on the care plan that is associated with the virtual patient). It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the treatment recommendation function of the combination to include the specific patient cluster identification and selection of associated treatment plans as in Van Berkel in order to match new patients to optimized care plans that are likely to benefit those patients the most based on known features and outcomes of similar patients (as suggested by Van Berkel [0049]), thereby improving patient care.
Claim 12
Thakur in view of Gensheimer and Van Berkel teaches the electronic device of claim 11, and the combination further teaches wherein each patient cluster of the plurality of patient clusters corresponds to a medoid of a plurality of medoids (Van Berkel [0054], noting each virtual patient represents a medoid of its associated patient cluster), and wherein the hardware processor is further configured to select the recommendation based in part on a distance in latent space between a representation of the user and a corresponding medoid of the patient cluster identified based on the similarity of the physiological characteristic between the user and the patients of the patient cluster (Van Berkel Fig. 8, [0060]-[0061], noting the most similar virtual patient (i.e. medoid representation of a patient cluster) is identified for a new patient based on distance between a representation of the new patient and the virtual patient (i.e. medoid of a patient cluster) as mapped in a patient space such that the associated treatment plan is selected based on the shortest distance from a patient to a virtual patient medoid).
In summary, the present combination teaches a system that may select an appropriate treatment plan for a user by identifying a most similar patient cluster (represented by a virtual patient representing the medoid of the cluster) based on shortest distance in a mapped patient space between representations of the new patient and the virtual patient. Though the virtual patient represents a composite metric of an entire patient cluster, such as a medoid, the present combination fails to explicitly disclose that the composite metric of a patient cluster is a centroid. However, Van Berkel also contemplates that a centroid may be a cluster parameter used for distance and similarity comparisons (Van Berkel [0059]). It therefore would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the virtual patient to be a representation of a given patient cluster’s centroid rather than its medoid, because centroid is also shown to be a known cluster parameter that can be used for distance and similarity comparisons, and the simple substitution of one known element for another to produce a predictable result (i.e. selection of the cluster’s centroid instead of its medoid as the virtual patient representation) renders the claim obvious.
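For clarity of record, the medoid/centroid distinction relied on in the substitution rationale above can be illustrated with the following minimal sketch. This is an illustrative sketch only, not drawn from Van Berkel; the function names and example data are hypothetical. It shows that either composite metric serves interchangeably as the cluster representative for shortest-distance assignment of a new patient.

```python
import math

def medoid(points):
    # The medoid is the actual cluster member minimizing the total
    # distance to all other members of the cluster.
    return min(points, key=lambda p: sum(math.dist(p, q) for q in points))

def centroid(points):
    # The centroid is the coordinate-wise mean of the cluster and need
    # not coincide with any actual member.
    n = len(points)
    return tuple(sum(coord) / n for coord in zip(*points))

def nearest_cluster(user, clusters, representative=medoid):
    # Assign the new patient (user) to the cluster whose representative
    # point (medoid or centroid) lies at the shortest distance in the
    # mapped feature space.
    reps = [representative(c) for c in clusters]
    return min(range(len(clusters)), key=lambda i: math.dist(user, reps[i]))
```

Because `representative` is a swappable parameter, substituting `centroid` for `medoid` changes only which composite point stands in for each cluster, while the distance-based selection proceeds identically — the predictable result underlying the substitution rationale.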
Claim 13
Thakur in view of Gensheimer and Van Berkel teaches the electronic device of claim 12, and the combination further teaches wherein the physiological characteristic comprises one or more of a status of cardiac arrhythmia, an intervention type, or an outcome corresponding to an instance of an intervention of the intervention type (Van Berkel [0050]-[0052], noting patients are grouped as “similar” based on clinical, demographic, socioeconomic, utilization (i.e. intervention), etc. features/characteristics, considered to include clinical features like cardiac arrhythmia risk or status when considered in the context of the combination with Thakur and Gensheimer).
Claim 19
Thakur in view of Gensheimer teaches the method of claim 15, and the combination further teaches outputting a recommendation of a treatment or an intervention based on a prediction of the onset of the physiological event (Thakur [0082], [0099], noting the system can generate recommendations for treatments or interventions such as more aggressive arrhythmia monitoring, further testing to be performed, initiating or adjusting patient medication or other types of treatment, etc. based on the patient risk).
Though the present combination teaches generating treatment recommendations responsive to the predicted patient onset of a cardiac event, it fails to explicitly disclose identifying a patient cluster from a plurality of patient clusters based at least in part on a similarity of a physiological characteristic between the user and patients of the patient cluster and determining the treatment recommendation based at least in part on the patient cluster. However, Van Berkel teaches selecting an appropriate treatment for a user by: identifying a patient cluster from a plurality of patient clusters based at least in part on a similarity of a physiological characteristic between the user and patients of the patient cluster (Van Berkel [0060], noting the system selects the most similar virtual patient for a new patient (i.e. user), where a virtual patient represents a near-homogeneous cluster of patients grouped based on clinical, demographic, socioeconomic, utilization, etc. features/characteristics as noted in [0052]-[0054] & [0062]; thus, selecting a most similar virtual patient out of a plurality of virtual patients is considered equivalent to identifying a patient cluster of a plurality of patient clusters based at least in part on a similarity of a physiological characteristic or feature of the new patient (user) and the virtual patient (cluster)); and determining a recommendation of a treatment based at least in part on the patient cluster (Van Berkel [0060], noting once the system has selected the most similar virtual patient (i.e. cluster) for a given new patient (i.e. user), it selects a care plan for the new patient based on the care plan that is associated with the virtual patient). 
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the treatment recommendation function of the combination to include the specific patient cluster identification and selection of associated treatment plans as in Van Berkel in order to match new patients to optimized care plans that are likely to benefit those patients the most based on known features and outcomes of similar patients (as suggested by Van Berkel [0049]), thereby improving patient care.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Chapfuwa et al. (Reference U on the accompanying PTO-892) describes methods for training machine learning time-to-event models using censored events and cost functions.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KAREN A HRANEK whose telephone number is (571)272-1679. The examiner can normally be reached M-F 8:00-4:00 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Shahid Merchant can be reached at 571-270-1360. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KAREN A HRANEK/ Primary Examiner, Art Unit 3684