DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 10/28/2025 has been entered.
Response to Arguments
Rejection Under 101
Applicant's arguments filed 10/28/2025 have been fully considered. Applicant argues that the amended claims are no longer directed to an abstract idea and that, even if they could be considered to recite an abstract idea, any abstract idea is integrated into a practical application. In response to Applicant's conclusory arguments, the claims encompass monitoring patient data in order to provide the user with a recommendation based on the patient's medical incidents and to produce summary reports of the incidents. The claims fall under management of personal behavior or interactions by following rules or instructions to process the data in order to provide the recommendation and summary report, and thus fall within the grouping of certain methods of organizing human activity. The additional elements (bolded in the rejection below) do not integrate the abstract idea into a practical application because they merely invoke the use of a computer as a tool to carry out the abstract idea. See the updated rejection below.
Rejection Under 103
Applicant's arguments filed 10/28/2025 have been fully considered. Applicant argues that the claims have been amended and that the prior art does not disclose the amended features. In response, the argument is directed toward the amended limitations and is therefore moot in view of the new ground of rejection. However, Kiani teaches the amended limitation; see the updated rejection below for further clarification.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1, 3-11, 13-21, 23-30 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., an abstract idea) without significantly more.
Step 1 of the Alice/Mayo Test
Claims 1 and 3-10 are drawn to a method, which is within the four statutory categories (i.e., process). Claims 11 and 13-20 are drawn to a computer program product, which is within the four statutory categories (i.e., manufacture). Claims 21 and 23-30 are drawn to a system, which is within the four statutory categories (i.e., machine).
Step 2A of the Alice/Mayo Test - Prong One
The independent claims recite an abstract idea. For example, claim 11 (which is substantially similar to independent claims 1 and 21) recites:
A computer program product residing on a computer readable medium having a plurality of instructions stored thereon which, when executed by a processor, cause the processor to perform operations comprising:
monitoring a plurality of data signals associated with a patient within a medical environment;
detecting one or more incidents defined within one or more of the data signals, wherein the one or more incidents each include the occurrence of a sequence of events occurring within a defined time period, and wherein each of the events is indicated by a respective alarm condition of at least a portion of the plurality of data signals, and wherein detecting one or more incidents includes predicting the one or more incidents when a defined portion of the sequence of events has occurred in a defined order;
providing a recommendation in response to the detected one or more incidents; and
processing the one or more incidents defined within the one or more data signals to produce a summary of the one or more incidents so that a user may be quickly knowledgeable of these one or more incidents.
These underlined elements recite an abstract idea that, under its broadest reasonable interpretation, covers the management of personal behavior or interactions (i.e., following rules or instructions), but for the recitation of generic computer components. For example, but for the computer program product residing on a computer readable medium having a plurality of instructions, the processor, and the memory, the limitations in the context of this claim encompass monitoring patient data in order to provide the user with a recommendation based on the patient's medical incidents and to produce summary reports of the incidents. If a claim limitation, under its broadest reasonable interpretation, covers the management of personal behavior or interactions but for the recitation of generic computer components, then the limitation falls within the “Certain Methods of Organizing Human Activity” grouping of abstract ideas. See MPEP § 2106.04(a).
Dependent claims recite additional subject matter which further narrows or defines the abstract idea embodied in the claims (such as claims 3-10, 13-20, and 23-30 reciting particular aspects of the abstract idea).
Step 2A of the Alice/Mayo Test - Prong Two
For example, claim 11 (which is substantially similar to independent claims 1 and 21) recites:
A computer program product residing on a computer readable medium having a plurality of instructions stored thereon which, when executed by a processor, cause the processor to perform operations comprising: (merely invokes use of computer and other machinery as a tool as noted below, see MPEP 2106.05(f))
monitoring a plurality of data signals associated with a patient within a medical environment;
detecting one or more incidents defined within one or more of the data signals, wherein the one or more incidents each include the occurrence of a sequence of events occurring within a defined time period, and wherein each of the events is indicated by a respective alarm condition of at least a portion of the plurality of data signals, and wherein detecting one or more incidents includes predicting the one or more incidents when a defined portion of the sequence of events has occurred in a defined order;
providing a recommendation in response to the detected one or more incidents; and
processing the one or more incidents defined within the one or more data signals to produce a summary of the one or more incidents so that a user may be quickly knowledgeable of these one or more incidents.
The judicial exception is not integrated into a practical application. In particular, the additional elements, other than the abstract idea per se, do not integrate the abstract idea into a practical application because they amount to no more than limitations which:
amount to mere instructions to apply an exception (such as recitations of a processor coupled to memory configured to perform the steps, and use of the computer program residing on a computer readable medium having a plurality of instructions, a processor, and memory, thereby invoking computers as a tool to perform the abstract idea; see Applicant's specification [00513]-[00515]; see MPEP 2106.05(f))
Dependent claims recite additional subject matter which amounts to limitations consistent with the additional elements in the independent claims (such as claims 3-10, 13-20, and 23-30 reciting additional limitations which amount to invoking computers as a tool to perform the abstract idea, and claims 3-10, 13-20, and 23-30 reciting additional limitations which generally link the abstract idea to a particular technological environment or field of use). Looking at the limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely provide conventional computer implementation and do not impose a meaningful limit that would integrate the abstract idea into a practical application.
Step 2B of the Alice/Mayo Test for Claims
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements amount to no more than mere instructions to apply an exception. Additionally, the additional elements, other than the abstract idea per se, amount to no more than elements which:
amount to elements that have been recognized as well-understood, routine, and conventional activity in particular fields (such as using the computer program residing on a computer readable medium having a plurality of instructions, a processor, and memory; e.g., Applicant's specification describes the computer system in a manner indicating that these additional elements are sufficiently well known that the specification need not describe their particulars to satisfy 35 U.S.C. 112(a), demonstrating that they are well-understood, routine, and conventional (see Applicant's Spec. [00513]-[00515]); and using a computer program residing on a computer readable medium having a plurality of instructions, a processor, and memory amounts to merely adding a generic computer, generic computer components, or a programmed computer to perform generic computer functions, Alice Corp. Pty. Ltd. v. CLS Bank Int'l, 134 S. Ct. 2347, 2358-59, 110 USPQ2d 1976, 1983-84 (2014)).
Dependent claims recite additional subject matter which, as discussed above with respect to integration of the abstract idea into a practical application, amounts to invoking computers as a tool to perform the abstract idea and generally links the abstract idea to a particular technological environment or field of use. Looking at the limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely provide conventional computer implementation. Therefore, the claims are not patent eligible and are rejected under 35 U.S.C. § 101.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3-7, 11, 13-17, 21, 23-27 are rejected under 35 U.S.C. 103 as being unpatentable over Katra (US PG Pub. 2020/0357513) in view of Kiani et al. (WO 2010/102069).
Regarding claim 1, Katra discloses a computer-implemented method, executed on a computing device, comprising: (Par. [0044], “This disclosure presents systems and methods for remote post-IMD monitoring for implantation site infections.”)
monitoring a plurality of data signals associated with a patient within a medical environment; (Par. [0048], “The computing device executing the app (e.g., a virtual check-in process) may perform various functionalities described below, whether via local computing resources provided by the computing device, via cloud-based backend systems, or both. In some examples, the computing device may implement the app via a web browser. In some examples, the computing device may perform a device check. In such examples, the computing device may implement one or more interrogations of one or more medical devices (e.g., IMDs, CIEDs, etc.). In addition, the computing device may analyze medical device settings, parameters, and performance metrics.” Par. [0084], “In a non-limiting example, medical device(s) 6 may include wearable devices (e.g., smart watches, headsets, etc.) configured to obtain physiological data (e.g., activity data, heart rate, etc.) and transfer such data to computing device(s) 2, network 10, edge device(s) 12, etc. for subsequent utilization, in accordance with one or more of the various techniques of this disclosure.”)
detecting one or more incidents defined within one or more of the data signals, wherein each of the events is indicated by a respective alarm condition of at least a portion of the plurality of data signals, (Par. [0081], “In some examples, medical device(s) 6 may include diagnostic medical devices. In an example, medical device(s) 6 may include a device that predicts heart failure events or that detects worsening heart failure of patient 4. In a non-limiting and illustrative example, system 100 may be configured to measure impedance fluctuations of patient 4 and process impedance data to accumulate evidence of worsening heart failure. In any case, medical device(s) 6 may be configured to determine a health status relating to patient 4. Medical device(s) 6 may transmit the diagnostic data or health status to computing device(s) 2 as interrogation data, such that computing device(s) 2 may correlate the interrogation data with image data to determine whether an abnormality present with a particular one of medical device(s) 6 (e.g., an IMD) or patient 4 (e.g., infection at an implantation site).” Par. [0084], “In a non-limiting example, medical device(s) 6 may include wearable devices (e.g., smart watches, headsets, etc.) configured to obtain physiological data (e.g., activity data, heart rate, etc.) and transfer such data to computing device(s) 2, network 10, edge device(s) 12, etc. for subsequent utilization, in accordance with one or more of the various techniques of this disclosure.” Par. [0298], “In some examples, processing circuitry 20 may provide an alert, such as a text- or graphics-based notification, a visual notification, etc. In some examples, processing circuitry 20 may cause an audible alarm to sound or cause a tactile alarm, alerting patient 4 of a determined abnormality. In other examples, computing device(s) 2 may provide a visual light indication, such as emitting a red light for high severity or a yellow light for medium severity. The alert may indicate a potential, possible or predicted abnormality event (e.g., a potential infection).”)
providing a recommendation in response to the detected one or more incidents; and (Katra Par. [0196], “In an illustrative example, when processing circuitry 20 determines that anyone or any combination of subsessions yielded an abnormal result (or abnormal outside an acceptable margin of normal), processing circuitry 20 may use the mobile device app to output a prompt to patient 4; [0195] In some examples, processing circuitry, e.g., processing circuitry 20 of computing device(s) 2, processing circuitry 64 of edge device(s) 12, processing circuitry 98 of server(s) 94, or processing circuitry 40 of medical device(s) 17, may output the post-implant report by outputting the post-implant report to a device of a HCP via network 10 (e.g., via a bidirectional communication protocol). In one example, processing circuitry 20 may output, via network 10, the post-implant report to another device of the user of UI 22 and/or the interactive session, where the user may or may not necessarily be patient 4)
processing the one or more incidents defined within the one or more data signals to produce a summary of the one or more incidents so that a user may be quickly knowledgeable of these one or more incidents. (Par. [0151], “In one example, computing device 2 may determine, from physiological parameters obtained via a second subsession, an ECG change that indicates device migration and as such, increases the likelihood that a potential abnormality is being detected from the image data. In such instances, computing device 2 may analyze the image data using a bias toward detecting an abnormality or may include, in a post-implant report, a heightened likelihood (e.g., probability, confidence interval) that is based on the likelihood of a potential abnormality determined from the first and second set data items.” Par. [0196], “In an illustrative example, when processing circuitry 20 determines that anyone or any combination of subsessions yielded an abnormal result (or abnormal outside an acceptable margin of normal), processing circuitry 20 may use the mobile device app to output a prompt to patient 4. In some examples, the prompt may indicate an abnormality. In other examples, the prompt may include a recommendation or instruction to schedule a follow-up visit with the HCP.” Par. [0298], “Processing circuitry, e.g., processing circuitry 20 of computing device(s) 2, processing circuitry 64 of edge device(s) 12, processing circuitry 98 of server(s) 94, or processing circuitry 40 of medical device(s) 17, may determine instructions for medical intervention based on the health condition status of patient 4 (2304). For example, where processing circuitry 20 determines the presence of an abnormality at the implantation site of IMD(s) 6, processing circuitry 20 may determine instructions for medical intervention based on the abnormality. In another example, where processing circuitry 20 determines the presence of an abnormality at the implantation site of IMD(s) 6 and an abnormality in a physiological parameter of patient 4, processing circuitry 20 may determine instructions for medical intervention based on a post-implant report that processing circuitry 20 may generate based at least in part on the multiple abnormalities. In some examples, processing circuitry 20 may determine different instructions for different severity levels or abnormality categories.”)
Katra does not appear to disclose the following; however, Kiani teaches that it is old and well known in the art of healthcare data processing wherein:
the one or more incidents each include the occurrence of a sequence of events occurring within a defined time period (Kiani [0086] time stamps for events occurring in the physiological monitoring system 100, environmental conditions such as changes to the state of the network and usage statistics of the network interface module 106 [0125] The escalation rules module 518 has a rules engine that actuates an escalation policy defined by the hospital. The escalation rules module 518 provides alternative routing of alarms to alternative and additional clinical users in the event an alarm is not responded to or persists for a predefined (e.g., by a policy) period of time. [0184] This correlation may include reconstructing a timeline of medical events, with values of physiological parameters (optionally including waveforms) provided in the correct time sequence on the timeline)
wherein detecting one or more incidents includes predicting the one or more incidents when a defined portion of the sequence of events has occurred in a defined order; (Kiani Fig. 31 and corresponding text; [0378] At block 3104, the physiological parameter data is analyzed to identify alarm conditions based upon a first set of alarm criteria. The alarm criteria can be configurable so as to modify the physiological conditions that will trigger an alarm. In some embodiments, the analysis of the physiological parameter data is performed in substantially real-time by, for example, the bedside patient monitoring devices in order to detect alarm conditions as they occur. The alarm criteria will generally depend upon the particular physiological parameter being monitored. In some embodiments, the alarm criteria is a single threshold value. In some embodiments, the alarm criteria includes multiple threshold values that define, for example, an enclosed range of safe or normal values for the physiological parameter. Other types of alarm criteria can also be used. [0380] At block 3108, the physiological parameter data that was previously stored can be analyzed to identify alarm conditions based on a second alarm criteria that is different from the first criteria used at block 3104. This analysis can be performed by, for example, the reporting module described herein. If, for example, in the case of blood oxygen saturation monitoring, detected pulse oximetry signals were analyzed at the actual time of monitoring using an alarm threshold of 94% oxygen saturation, then later at block 3108, the pulse oximetry signals can be re-analyzed using an alarm threshold of 93% oxygen saturation, or 92% oxygen saturation, etc. [0382] At block 3110, the reporting module can analyze the effect of the simulated alarm criteria on alarm conditions that are detected. For example, the reporting module can analyze the change, if any, in the number of detected alarm conditions using the new simulated alarm criteria… {construed as sequence of events in a defined order})
Therefore, it would have been obvious to one of ordinary skill in the art of healthcare data processing, before the effective filing date of the claimed invention, to modify Katra to incorporate wherein the one or more incidents each include the occurrence of a sequence of events occurring within a defined time period, and wherein detecting one or more incidents includes predicting the one or more incidents when a defined portion of the sequence of events has occurred in a defined order, as taught by Kiani, in order to gather all contextual information surrounding the events for further analysis to improve clinician performance and to refine alarm thresholds in a riskless manner during patient monitoring. See Kiani [0156] and [0380].
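For purposes of illustration only, the following minimal sketch shows one way the sequence-based incident prediction recited in the claim could operate: an incident defined as a sequence of alarm-indicated events occurring in a defined order within a defined time period, predicted once a defined portion of that sequence has occurred in order. This sketch is not drawn from Katra, Kiani, or Applicant's specification; all names (e.g., IncidentDefinition, predict_incident) and the example signal identifiers are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class AlarmEvent:
    """A single event indicated by an alarm condition of a monitored data signal."""
    signal_name: str   # hypothetical alarm identifier, e.g., "heart_rate_high"
    timestamp: float   # seconds since the start of monitoring

@dataclass
class IncidentDefinition:
    """An incident: a defined sequence of events within a defined time period."""
    event_sequence: List[str]  # required order of alarm-indicated events
    time_window: float         # defined time period, in seconds
    predict_after: int         # defined portion of the sequence that triggers prediction

def predict_incident(definition: IncidentDefinition, events: List[AlarmEvent]) -> bool:
    """Return True once the defined portion of the event sequence has occurred,
    in the defined order, within the defined time period."""
    matched = 0
    window_start: Optional[float] = None
    for event in sorted(events, key=lambda e: e.timestamp):
        # Discard a partial match once the defined time period has elapsed.
        if window_start is not None and event.timestamp - window_start > definition.time_window:
            matched, window_start = 0, None
        # Advance only when the next expected event in the defined order occurs.
        if matched < len(definition.event_sequence) and event.signal_name == definition.event_sequence[matched]:
            if window_start is None:
                window_start = event.timestamp
            matched += 1
            if matched >= definition.predict_after:
                return True  # incident predicted before the full sequence completes
    return False

# Hypothetical usage: predict the incident once the first two of three ordered
# alarm events occur within a ten-minute window.
definition = IncidentDefinition(
    event_sequence=["heart_rate_high", "resp_rate_high", "spo2_low"],
    time_window=600.0,
    predict_after=2,
)
observed = [AlarmEvent("heart_rate_high", 0.0), AlarmEvent("resp_rate_high", 120.0)]
print(predict_incident(definition, observed))  # True
```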
Regarding claim 3, Katra-Kiani teaches the computer-implemented method of claim 1 wherein detecting one or more incidents defined within one or more of the data signals includes: monitoring the data signals associated with a medical device utilized on a patient within the medical environment to detect the occurrence of the one or more alarms. (Par. [0298], “Processing circuitry, e.g., processing circuitry 20 of computing device(s) 2, processing circuitry 64 of edge device(s) 12, processing circuitry 98 of server(s) 94, or processing circuitry 40 of medical device(s) 17, may determine instructions for medical intervention based on the health condition status of patient 4 (2304). For example, where processing circuitry 20 determines the presence of an abnormality at the implantation site of IMD(s) 6, processing circuitry 20 may determine instructions for medical intervention based on the abnormality. In another example, where processing circuitry 20 determines the presence of an abnormality at the implantation site of IMD(s) 6 and an abnormality in a physiological parameter of patient 4, processing circuitry 20 may determine instructions for medical intervention based on a post-implant report that processing circuitry 20 may generate based at least in part on the multiple abnormalities. In some examples, processing circuitry 20 may determine different instructions for different severity levels or abnormality categories. For example, processing circuitry 20 may determine a first set of instructions for one abnormality that processing circuitry 20 determines is likely less severe than another abnormality. In some examples, processing circuitry 20 may not determine intervention instructions where processing circuitry 20 determines that the abnormality level does not satisfy a predefined threshold. In some examples, processing circuitry 20 may provide an alert, such as a text- or graphics-based notification, a visual notification, etc. In some examples, processing circuitry 20 may cause an audible alarm to sound or cause a tactile alarm, alerting patient 4 of a determined abnormality. In other examples, computing device(s) 2 may provide a visual light indication, such as emitting a red light for high severity or a yellow light for medium severity. The alert may indicate a potential, possible or predicted abnormality event (e.g., a potential infection).”)
Regarding claim 4, Katra-Kiani teaches the computer-implemented method of claim 1 wherein the one or more incidents define an event. (Par. [0103], “A trained ML model 30 and/or AI engine 28 may be configured to process and analyze the user input (e.g., images of the implantation site, patient status data, etc.), device parameters (e.g., accelerometer data), historical data of medical device (e.g., medical device 6), and/or physiological parameters, in accordance with certain examples of this disclosure where ML models are considered advantageous (e.g., predictive modeling, inference detection, contextual matching, natural language processing, etc.)… In another example, these models and engines may be trained to synthesize data in order to identify abnormalities of patient 4 or medical device(s) 17 and to identify abnormalities of patient 4 or medical device(s) 17 from individual data items.” Par. [0180], “In some examples, computing device 502 may identify the follow-up schedule by receiving a physiological parameter that indicates an abnormality (e.g., an ECG abnormality) and determining a first time period for the follow-up schedule based on the physiological parameter abnormality. That is, computing device(s) 2 may determine a triggering event for identifying a follow-up schedule that includes a trigger based on an amount of time that has passed, a particular signal received from one of medical device(s) 17 (e.g., IMD 6), such as an activity level, an ECG, etc., or based on a trigger received via network 10 (e.g., a computing device 2 of an HCP).”)
Regarding claim 5, Katra-Kiani teaches the computer-implemented method of claim 1 wherein processing the one or more incidents defined within the one or more data signals to produce a summary of the one or more incidents so that a user may be quickly knowledgeable of these one or more incidents includes: utilizing massive data sets processed by ML to produce the summary of the one or more incidents so that the user may be quickly knowledgeable of these one or more incidents. ( Par. [0103], “A trained ML model 30 and/or AI engine 28 may be configured to process and analyze the user input (e.g., images of the implantation site, patient status data, etc.), device parameters (e.g., accelerometer data), historical data of medical device (e.g., medical device 6), and/or physiological parameters, in accordance with certain examples of this disclosure where ML models are considered advantageous (e.g., predictive modeling, inference detection, contextual matching, natural language processing, etc.). Examples of ML models and/or AI engines that may be so configured to perform aspects of this disclosure include classifiers and non-classification ML models, artificial neural networks (“NNs”), linear regression models, logistic regression models, decision trees, support vector machines (“SVM”), Naïve or a non-Naïve Bayes network, k-nearest neighbors (“KNN”) models, deep learning (DL) models, k-means models, clustering models, random forest models, or any combination thereof. Depending on the implementation, the ML models may be supervised, unsupervised or in some instances, a hybrid combination (e.g., semi-supervised). These models may be trained based on data indicating how users (e.g., patient 4) interact with computing device(s) 2. For example, certain aspects of the disclosure will be described using events or behaviors (such as clicking, viewing, or watching) with respect to items (e.g., wound images, cameras, videos, physiological parameters, etc.), for purposes of illustration only. In another example, these models and engines may be trained to synthesize data in order to identify abnormalities of patient 4 or medical device(s) 17 and to identify abnormalities of patient 4 or medical device(s) 17 from individual data items.” Par. [0151], “In one example, computing device 2 may determine, from physiological parameters obtained via a second subsession, an ECG change that indicates device migration and as such, increases the likelihood that a potential abnormality is being detected from the image data. In such instances, computing device 2 may analyze the image data using a bias toward detecting an abnormality or may include, in a post-implant report, a heightened likelihood (e.g., probability, confidence interval) that is based on the likelihood of a potential abnormality determined from the first and second set data items.”)
Regarding claim 6, Katra-Kiani teaches the computer-implemented method of claim 1 wherein processing the one or more incidents defined within the one or more data signals to produce a summary of the one or more incidents so that a user may be quickly knowledgeable of these one or more incidents includes: utilizing a generative AI model to produce the summary of the one or more incidents so that the user may be quickly knowledgeable of these one or more incidents. (Par. [0103], “A trained ML model 30 and/or AI engine 28 may be configured to process and analyze the user input (e.g., images of the implantation site, patient status data, etc.), device parameters (e.g., accelerometer data), historical data of medical device (e.g., medical device 6), and/or physiological parameters, in accordance with certain examples of this disclosure where ML models are considered advantageous (e.g., predictive modeling, inference detection, contextual matching, natural language processing, etc.). Examples of ML models and/or AI engines that may be so configured to perform aspects of this disclosure include classifiers and non-classification ML models, artificial neural networks (“NNs”), linear regression models, logistic regression models, decision trees, support vector machines (“SVM”), Naïve or a non-Naïve Bayes network, k-nearest neighbors (“KNN”) models, deep learning (DL) models, k-means models, clustering models, random forest models, or any combination thereof. Depending on the implementation, the ML models may be supervised, unsupervised or in some instances, a hybrid combination (e.g., semi-supervised). These models may be trained based on data indicating how users (e.g., patient 4) interact with computing device(s) 2. For example, certain aspects of the disclosure will be described using events or behaviors (such as clicking, viewing, or watching) with respect to items (e.g., wound images, cameras, videos, physiological parameters, etc.), for purposes of illustration only. In another example, these models and engines may be trained to synthesize data in order to identify abnormalities of patient 4 or medical device(s) 17 and to identify abnormalities of patient 4 or medical device(s) 17 from individual data items.” Par. [0151], “In one example, computing device 2 may determine, from physiological parameters obtained via a second subsession, an ECG change that indicates device migration and as such, increases the likelihood that a potential abnormality is being detected from the image data. In such instances, computing device 2 may analyze the image data using a bias toward detecting an abnormality or may include, in a post-implant report, a heightened likelihood (e.g., probability, confidence interval) that is based on the likelihood of a potential abnormality determined from the first and second set data items.”)
Regarding claim 7, Katra-Kiani teaches the computer-implemented method of claim 6 wherein utilizing a generative AI model to produce the summary of the one or more incidents so that the user may be quickly knowledgeable of these one or more incidents includes: utilizing prompt engineering and the generative AI model to produce the summary of the one or more incidents so that the user may be quickly knowledgeable of these one or more incidents. (Par. [0103]-[0104] describe the AI models that can be used. Par. [0104], “In a non-limiting example, patient 4 may exhibit difficulty with capturing images of the implantation site from various angles. This helpful data may be shared across a health monitoring or computing network so that optimal results may be presented to more than one user based on similar queries and user reactions to those queries.” Par. [0173], “In an illustrative example, processing circuitry 20 may receive user input, via UI 22, indicating a patient name of patient 4. Processing circuitry 20 may query the database and based on a result of the query, identify patient 4 as a known patient of system 100 (e.g., system 300).” Par. [0174], “Processing circuitry 20 may reference patient data (e.g., patient identifiers) such that the same computing device(s) 2 (or the same algorithm base) can be shared across multiple patients (e.g., in a clinic). As described herein, computing device(s) 2 may adjust, based on patient data, the base of the site-check algorithm in order to tailor the process and UI visualizations in order to accommodate each respective patient. In another example, a common site-check algorithm may be deployed to accommodate all patients of a certain class (e.g., nursing home patients, patients of a particular nursing home, etc.). In this way, the site-check algorithms may maintain and provide a particular level of uniformity for the various users of the interactive session, where those users may be part of a common class.”)
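For purposes of illustration only, the following minimal sketch shows one way prompt engineering could be combined with a generative model to produce an incident summary of the kind recited in claims 6 and 7. It is not drawn from Katra, Kiani, or Applicant's specification; the generate callable stands in for any hypothetical text-generation interface, and all function names and example data are hypothetical.

```python
from typing import Callable, List

def build_summary_prompt(incidents: List[dict]) -> str:
    """Prompt engineering: assemble detected incident records into an instruction-style prompt."""
    lines = ["Summarize the following patient incidents for a clinician in two sentences:"]
    for i, incident in enumerate(incidents, start=1):
        lines.append(
            f"{i}. {incident['type']}: events {', '.join(incident['events'])} "
            f"occurred between {incident['start']} and {incident['end']}."
        )
    return "\n".join(lines)

def summarize_incidents(incidents: List[dict], generate: Callable[[str], str]) -> str:
    """Send the engineered prompt to a generative text model and return its summary."""
    return generate(build_summary_prompt(incidents))

# Hypothetical usage with example incident data.
example_incidents = [{
    "type": "Predicted desaturation incident",
    "events": ["heart_rate_high", "resp_rate_high"],
    "start": "02:14",
    "end": "02:21",
}]
print(build_summary_prompt(example_incidents))
```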
Regarding claim 11, the claim recites substantially similar limitations as those recited in the rejection of claim 1, and, as such, is rejected for similar reasons as given above. Additionally, Katra discloses a computer program product residing on a computer readable medium having a plurality of instructions stored thereon which, when executed by a processor, cause the processor to perform operations (Par. [0156], “In some examples, storage device 50 includes computer-readable instructions that, when executed by processing circuitry 40, cause medical device(s) 17, including processing circuitry 40, to perform various functions attributed to medical device(s) 17 and processing circuitry 40 herein.”)
Regarding claim 13, the claim recites substantially similar limitations as those recited in the rejection of claim 3, and, as such, is rejected for similar reasons as given above.
Regarding claim 14, the claim recites substantially similar limitations as those recited in the rejection of claim 4, and, as such, is rejected for similar reasons as given above.
Regarding claim 15, the claim recites substantially similar limitations as those recited in the rejection of claim 5, and, as such, is rejected for similar reasons as given above.
Regarding claim 16, the claim recites substantially similar limitations as those recited in the rejection of claim 6, and, as such, is rejected for similar reasons as given above.
Regarding claim 17, the claim recites substantially similar limitations as those recited in the rejection of claim 7, and, as such, is rejected for similar reasons as given above.
Regarding claim 21, the claim recites substantially similar limitations as those recited in the rejection of claim 11, and, as such, is rejected for similar reasons as given above.
Regarding claim 23, the claim recites substantially similar limitations as those recited in the rejection of claim 3, and, as such, is rejected for similar reasons as given above.
Regarding claim 24, the claim recites substantially similar limitations as those recited in the rejection of claim 4, and, as such, is rejected for similar reasons as given above.
Regarding claim 25, the claim recites substantially similar limitations as those recited in the rejection of claim 5, and, as such, is rejected for similar reasons as given above.
Regarding claim 26, the claim recites substantially similar limitations as those recited in the rejection of claim 6, and, as such, is rejected for similar reasons as given above.
Regarding claim 27, the claim recites substantially similar limitations as those recited in the rejection of claim 7, and, as such, is rejected for similar reasons as given above.
Claims 8-10, 18-20, 28-30 are rejected under 35 U.S.C. 103 as being unpatentable over Katra (US PG Pub. 2020/0357513) in view of Kiani et al. (WO 2010/102069) and Aranke (US Pat. 11,830,623).
Regarding claim 8, Katra-Kiani teaches the computer-implemented method of claim 1 and Katra further discloses:
wherein the plurality of data signals include one or more of: one or more data signals associated with a medical device utilized on a patient within the medical environment; (Par. [0048], “In such examples, the computing device may implement one or more interrogations of one or more medical devices (e.g., IMDs, CIEDs, etc.). In addition, the computing device may analyze medical device settings, parameters, and performance metrics.”)
one or more data signals associated with drugs administered to the patient within the medical environment; (Par. [0209], “FIG. 10 illustrates a non-limiting example of a patient status questionnaire that either the backend system or the mobile device application of this disclosure may generate. The patient status questionnaire may be used to gauge information on a general health picture of patient 4, on implant-recovery-specific symptoms, medications that the patient has taken recently or will take soon, etc.”)
one or more data signals associated with clinical assessments performed on the patient within the medical environment; (Par. [0177], “In another example, IMD information may include historical data relating to the implantation site of the IMD and/or historical data relating to the IMD. In some examples, the historical data may include images of the implantation site following the implantation procedure, wound characteristics, shape and size of the implantation site (e.g., the wound size), incision information, history of any complications during surgery, date of the implant, etc. In some examples, IMD information may further include IMD type information, IMD communication protocol information, an estimated orientation of medical device(s) 17 (e.g., IMD 6), data regarding one or more HCPs (e.g., surgeons, clinicians) responsible for implanting (e.g., inserting) medical device(s) 17 (e.g., IMD 6), information relating to one or more methods employed by the one or more HCPs when sealing the implantation site, images of the implantation site over time, etc.”)
one or more data signals associated with clinical procedures performed on the patient within the medical environment; (Par. [0177], “In another example, IMD information may include historical data relating to the implantation site of the IMD and/or historical data relating to the IMD. In some examples, the historical data may include images of the implantation site following the implantation procedure, wound characteristics, shape and size of the implantation site (e.g., the wound size), incision information, history of any complications during surgery, date of the implant, etc. In some examples, IMD information may further include IMD type information, IMD communication protocol information, an estimated orientation of medical device(s) 17 (e.g., IMD 6), data regarding one or more HCPs (e.g., surgeons, clinicians) responsible for implanting (e.g., inserting) medical device(s) 17 (e.g., IMD 6), information relating to one or more methods employed by the one or more HCPs when sealing the implantation site, images of the implantation site over time, etc.”)
one or more data signals associated with electronic health records and/or electronic medical records of the patient within the medical environment; and (Par. [0094], “In some examples, system 100 may include one or more databases (e.g., storage device 96) that store various medical data records, cohort data, image data. In such examples, server(s) 94 (e.g., one or more databases) may be managed or controlled by one or more separate entities (e.g., internet service providers (ISPs), etc.).”)
one or more data signals associated with a medical history of the patient within the medical environment. (Par. [0186], “In another example, computing device 502 may generate historical reports that include an aggregation of any one or more of the reports, such that in response to detecting a selection of an interactive-session tracker tile or other tracker tile, computing device 502 may retrieve historical reports and compile and/or summarize reports from the past in order to produce a single post-implant historical report for export and/or display (e.g., via a pop-up interface).”)
Katra-Kiani does not appear to explicitly teach the following; however, Aranke teaches that it is old and well known in the art of healthcare data processing wherein:
one or more data signals associated with lab work performed on the patient within the medical environment; (Col. 1, ln. 39-49, “Referring back to FIG. 2A, the remaining components of the integrated healthcare application 110a will be described. The condition management engine 208 may include software and/or logic to provide functionality to facilitate user management of one or more of their health conditions. Health conditions may include, for example, diabetes, pregnancy, osteoarthritis, cancer, mental health, chronic conditions, episodic conditions, etc. The condition management engine 208 uses the total health index score and diverse set of data about the user as input to one or more trained machine learning models to facilitate the user with managing their health conditions and risks. For example, the diverse set of data may include encounters, diagnoses, laboratory test results, user wearable device data, medication and refill history, user's application preferences, etc. The condition management engine 208 personalizes the user experience with condition management and generates personalized, real-time recommendations of actionable interventions.” Col. 15, ln. 12-42, “…the data processing engine 202 may query the plurality of data sources 135 for collecting the clinical data and the retail pharmacy data. Clinical data may include, but is not limited to, EMR data, EHR data, patient-physician encounter data (e.g., name, age, phone number, temperature, pulse, weight, height, encounter date, encounter type, encounter reason, encounter location, encounter time, encounter notes, etc.), patient demographics (e.g., gender, birth date, address, preferred language, ethnicity, marital status, religion, etc.), zip-code level demographics data, individual social determinant data (e.g., financial information, education level, mobility, race, alcohol use, tobacco use, drug use, etc.), patient surveys, patient problem list, patient procedures (e.g., ICD procedure codes, procedure date, procedure results, etc.), physician appointment history, upcoming appointment data, patient diagnoses, medical laboratory test results data, hospitalization data, rehabilitation data, health plan data, Center for Medicare & Medicaid Services Hierarchical Condition Category (CMS-HCC) risk score, census data and other public data (e.g., geography-level), etc. Pharmacy data may include, but is not limited to, prescription data, medication data (e.g., medication ID, date ordered, start and end date, etc.), pharmaceutical data, medication history, medication refills (e.g., number of refills, etc.), medication adherence data, proportion of days covered (PDC) score, polypharmacy data, drug interaction data, drug dosage, etc.”)
Therefore, it would have been obvious to one of ordinary skill in the art of healthcare data processing, before the effective filing date of the claimed invention, to modify Katra-Kiani, as modified above, to incorporate one or more data signals associated with lab work performed on the patient within the medical environment, as taught by Aranke, in order to improve healthcare providers’ ability to track and evaluate their patients by identifying lab results as important data that can be used in tracking and patient care. See Aranke, Col. 1, ln. 25-49.
Regarding claim 9, Katra-Kiani-Aranke teaches the computer-implemented method of claim 8, and Katra further discloses wherein the one or more data signals associated with a medical device utilized on a patient within the medical environment concern one or more details of the medical device and/or uses of the medical device. (Par. [0048], “In such examples, the computing device may implement one or more interrogations of one or more medical devices (e.g., IMDs, CIEDs, etc.). In addition, the computing device may analyze medical device settings, parameters, and performance metrics.”)
Regarding claim 10, Katra-Kiani-Aranke teaches the computer-implemented method of claim 8, and Katra further discloses wherein the medical device includes one or more sub-medical devices. (Par. [0142], “One or more of electrodes 16 may be coupled to at least one lead. In some examples, medical device(s) 17 may employ electrodes 16 in order to provide sensing and/or pacing functionalities. The configurations of electrodes 16 may be unipolar or bipolar. Sensing circuitry 52 may be selectively coupled to electrodes 16 via switching circuitry 58, e.g., to select the electrodes 16 and polarity, referred to as the sensing vector, used to sense impedance and/or cardiac signals, as controlled by processing circuitry 40. Sensing circuitry 52 may sense signals from electrodes 16, e.g., to produce a cardiac EGM or subcutaneous ECG, in order to facilitate monitoring the post-implant status of IMD 6. Sensing circuitry 52 also may monitor signals from sensors 54, which may include one or more accelerometers, pressure sensors, temperature sensors, and/or optical sensors, as examples. In some examples, sensing circuitry 52 may include one or more filters and amplifiers for filtering and amplifying signals received from electrodes 16 and/or sensors 54.”)
Regarding claim 18, the claim recites substantially similar limitations as those recited in the rejection of claim 8, and, as such, is rejected for similar reasons as given above.
Regarding claim 19, the claim recites substantially similar limitations as those recited in the rejection of claim 9, and, as such, is rejected for similar reasons as given above.
Regarding claim 20, the claim recites substantially similar limitations as those recited in the rejection of claim 10, and, as such, is rejected for similar reasons as given above.
Regarding claim 28, the claim recites substantially similar limitations as those recited in the rejection of claim 8, and, as such, is rejected for similar reasons as given above.
Regarding claim 29, the claim recites substantially similar limitations as those recited in the rejection of claim 9, and, as such, is rejected for similar reasons as given above.
Regarding claim 30, the claim recites substantially similar limitations as those recited in the rejection of claim 10, and, as such, is rejected for similar reasons as given above.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMANDA R COVINGTON whose telephone number is (303) 297-4604. The examiner can normally be reached Monday - Friday, 10 - 5 MT.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jason B. Dunham can be reached on (571) 272-8109. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/AMANDA R. COVINGTON/Examiner, Art Unit 3686
/RACHELLE L REICHERT/Primary Examiner, Art Unit 3686