Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Change in Examiner
Examiner Gregory Moseley is no longer continuing prosecution of application number 18/798,632. The application has been transferred to Examiner Sara Morice de Vargas.
Status of Claims
Claims 1, 3-11, 13-21, and 23-30 are currently pending and have been examined.
Claims 1, 3, 11, 13, 21, and 23 have been amended.
Claims 2, 12, and 22 have been canceled.
Claims 1, 3-11, 13-21, and 23-30 have been rejected.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1, 3-11, 13-21, and 23-30 of the instant Application 18/798,632 (‘632) are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 3-11, 13-21, and 23-30 of co-pending Application No. 18/798,613 (‘613) in view of Gupta (US PG Pub 2024/0029848 A1) and Rusin (US PG Pub 2015/0137968 A1). Although the claims at issue are not identical, they are not patentably distinct from each other because the claims are directed toward similar statutory categories and they recite substantially similar limitations as indicated by the following table. This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not been patented.
Claims 1, 3-11, 13-21, 23-30 of the instant Application ‘632 and claims 1, 3-11, 13-21, 23-30 of co-pending Application ‘613 are shown in the tables below for exemplary purposes. The commonalities are bolded.
Regarding Claim 1:
Instant Application ‘632 Claim 1
Co-pending Application ‘613 Claim 1
A computer-implemented method, executed on a computing device, comprising: monitoring a plurality of data signals associated with a patient within a medical environment;
A computer-implemented method, executed on a computing device, comprising: monitoring a plurality of data signals associated with a patient within a medical environment;
detecting one or more incidents defined within one or more of the data signals,
detecting one or more incidents defined within one or more of the data signals,
wherein the one or more incidents each include a specific combination of alarms occurring in a temporally proximate fashion;
wherein the one or more incidents each include the occurrence of a sequence of events occurring within a defined time period,
utilizing a generative AI model to generate a predictive report concerning the one or more incidents, including analyzing patterns in a medical history and current health metrics associated with the patient, including iteratively refining a prompt based upon an initial output of the generative AI model;
and wherein each of the events is indicated by a respective alarm condition of at least a portion of the plurality of data signals, and wherein detecting one or more incidents includes predicting the one or more incidents when a defined portion of the sequence of events has occurred in a defined order;
and processing the one or more incidents defined within the one or more data signals to produce a recommendation based upon the one or more incidents so that a user may gain more insight into these one or more incidents,
providing a recommendation in response to the detected one or more incidents; and processing the one or more incidents defined within the one or more data signals to produce a summary of the one or more incidents so that a user may be quickly knowledgeable of these one or more incidents.
including: providing a historical timeline view of at least a portion of the plurality of signals; receiving a selection of one or more incidents within the historical timeline view; and providing supplemental information regarding the selected one or more incidents including an analysis of the selected one or more incidents, and the recommendation, wherein the recommendation concerns the selected one or more incidents.
Wherein para 238 of both application specifications describes “temporally-proximate fashion” as defining the incident as the occurrence of a plurality of required alarms within a defined period of time; therefore, these limitations are substantially similar.
Wherein paras 341-342 and Fig. 18C of both application specifications (‘632 and ‘613) disclose a summary and recommending an alarm limit update; therefore, under the broadest reasonable interpretation of the claims, the limitations of producing a recommendation and producing a summary are substantially similar as applied to claims 1 and 5-7.
Further, “gain more insight into these one or more incidents” and “be quickly knowledgeable of these one or more incidents” are broadly interpreted to disclose substantially similar limitations as applied to claims 1 and 5-7. These limitations are simply disclosing similar benefits of utilizing the same invention.
While Claim 1 of co-pending Application ‘613 does not recite, “utilizing a generative AI model to generate a predictive report concerning the one or more incidents, including analyzing patterns in a medical history and current health metrics associated with the patient, including iteratively refining a prompt based upon an initial output of the generative AI model” Gupta discloses this limitation:
utilizing a generative AI model to generate a predictive report concerning the one or more incidents, including analyzing patterns in a medical history and current health metrics associated with the patient, including iteratively refining a prompt based upon an initial output of the generative AI model; (Gupta Para 203 and FIG. 5 disclose a system for simulating the healthcare journey of one or more patients. The system 500 comprises a database 502 and a processor 504 communicably coupled with a communication network. Para 204 discloses the system 500 comprises the database 502 configured to store patient data related to the one or more patients. The system 500 comprises the processor 504 configured to receive patient data, wherein the patient data is accessed from the database 502 and/or from an external source. The external source may include the one or more patients, wherein the external source provides the patient data via a patient input using a user-interface. The processor 504 is further configured to create a simulation model of the one or more patients, using the received patient data and employing machine learning. The simulation model is then executed by the processor to predict one or more health variables. In response to the one or more health variables, one or more treatment variables are generated. The processor 504 provides the predicted one or more health variables, the generated one or more treatment variables, and one or more clinician inputs to the simulation model for continuous learning of the simulation model. Furthermore, a final outcome including the patient's healthcare journey and the patient's disease diagnosis and treatment is provided to the one or more patients by the simulation model. Para 224 discloses the system employs Generative Large-Language-Models (G-LLMs) for generating responses to the queries received from the clinicians. The Generative Large-Language-Models (G-LLMs) are deep learning algorithms trained on a large amount of data.
The Generative Large-Language-Models (G-LLMs) assist the clinician at each step of the patient journey and generate summaries of the patient journey. In this regard, the patient journey comprises steps that are normally followed by clinicians in evaluating patients, such as patient history evaluation, conducting a physical examination, differential diagnosis, targeted clinical investigations, diagnosis, evaluation of the diagnosis, and then defining a plan and monitoring the patient. For example, the Generative Large-Language-Models (G-LLMs) assist the clinicians in identifying the clinical features related to a target disease, in diagnostic evaluation, or in generating a patient report based on test reports.)
including iteratively refining a prompt based upon an initial output of the generative AI model (This step only requires repeating the steps previously recited. Therefore, it is a duplication of the steps of providing input to generate the reports. This is akin to the duplication of parts, under which “mere duplication of parts has no patentable significance unless a new and unexpected result is produced.” (MPEP § 2144.04(VI)(B)). Because the purpose of iterating the steps is to “refin[e]” the results by tweaking the inputs, the duplication of this step does not produce “a new and unexpected result.” Therefore, this limitation will be given no patentable significance in accordance with the MPEP.)
It would have been obvious to one having ordinary skill in the art before the effective filing date of this application to modify the ‘613 application to include the ability to utilize a generative AI model to generate a predictive report concerning the one or more incidents, including analyzing patterns in a medical history and current health metrics associated with the patient, as taught by Gupta, because the algorithm used by Gupta “allows the processing arrangement to become more accurate in generating the document, without being explicitly programmed.” (Gupta, par. [0225]).
Further, Claim 1 of co-pending Application ‘613 does not recite, “providing a historical timeline view of at least a portion of the plurality of signals; receiving a selection of one or more incidents within the historical timeline view; and providing supplemental information regarding the selected one or more incidents including an analysis of the selected one or more incidents, and the recommendation, wherein the recommendation concerns the selected one or more incidents.” Rusin discloses these limitations:
providing a historical timeline view of at least a portion of the plurality of signals; (Paras 106-107 and FIGS. 8 and 9 disclose example screenshots illustrating patient alarm data monitoring in the system 100 according to one embodiment. In this example, a clinician assigns a patient alarm data monitor to the patient. The system 100 then displays a screen as illustrated in FIG. 8… Line 810 contains patient and bed information such as the patient's name, unit and bed number, patient identification number, and date of birth, in this example “Bed: PICU-RM012-012” and “MRN 5458246623.” Area 820 is a patient's alarm summary indicating the period from admission to the present.)
receiving a selection of one or more incidents within the historical timeline view; and providing supplemental information regarding the selected one or more incidents including an analysis of the selected one or more incidents, and the recommendation, wherein the recommendation concerns the selected one or more incidents. (Paras 108-109 disclose when the clinician selects an alarm, for example, “SpO2 LOW,” the system 100 displays a recommendation screen illustrated in FIG. 9. Area 910 is the Standard Recommended for Unit with the recommended Limit set by Hospital policy for typical patients in that unit… Area 920 displays “Observed Vitals for Unit Population,” which is a histogram of this unit's typical patient alarm values for a sample of the past 25 hours, excluding the last hour. The first percentile value, the 50th percentile value, and the 99th percentile value are marked in the histogram. In this example the typical patient in this unit had an SpO2 Low alarm value at or below 83 in only 1% of the sample, a value of 100 at or below in 50% of the sample (the mean), and a value at or below 103 in 99% of the sample [analysis of the one or more incidents]. Area 930 displays “Patient-specific Recommendations” as a chart displaying a calculated matrix showing: for each threshold value the % of time in alarm (for the most recent 25 hours, excluding the last hour) and an estimated % reductions possible, if the alarm limits are set at that threshold. [wherein para 341-342 of the Applicant’s specification discloses the recommendation can be an alarm limit update].)
It would have been obvious to one having ordinary skill in the art before the effective filing date of this application to further modify the ‘613 application, as modified by Gupta to utilize a generative AI model to generate a predictive report concerning the one or more incidents, with the alarm management system taught by Rusin in order to visualize an alarm summary for a specific patient and to recommend an alarm limit that avoids non-medically actionable alarms.
Regarding Claim 3:
Instant Application ‘632 Claim 3
Co-pending Application ‘613 Claim 3
The computer-implemented method of claim 1 wherein detecting one or more incidents defined within one or more of the data signals includes: monitoring the data signals associated with a medical device utilized on a patient within the medical environment to detect the occurrence of the one or more alarms.
The computer-implemented method of claim 1 wherein detecting one or more incidents defined within one or more of the data signals includes: monitoring the data signals associated with a medical device utilized on a patient within the medical environment to detect the occurrence of the one or more alarms.
Regarding Claim 4:
Instant Application ‘632 Claim 4
Co-pending Application ‘613 Claim 4
The computer-implemented method of claim 1 wherein the one or more incidents define an event.
The computer-implemented method of claim 1 wherein the one or more incidents define an event.
Regarding Claim 5:
Instant Application ‘632 Claim 5
Co-pending Application ‘613 Claim 5
The computer-implemented method of claim 1 wherein processing the one or more incidents defined within the one or more data signals to produce a recommendation based upon the one or more incidents so that a user may gain more insight into these one or more incidents includes: utilizing massive data sets processed by MLI to produce the recommendation based upon the one or more incidents so that the user may gain more insight into these one or more incidents.
The computer-implemented method of claim 1 wherein processing the one or more incidents defined within the one or more data signals to produce a summary of the one or more incidents so that a user may be quickly knowledgeable of these one or more incidents includes: utilizing massive data sets processed by ML to produce the summary of the one or more incidents so that the user may be quickly knowledgeable of these one or more incidents.
Regarding Claim 6:
Instant Application ‘632 Claim 6
Co-pending Application ‘613 Claim 6
The computer-implemented method of claim 1 wherein processing the one or more incidents defined within the one or more data signals to produce a recommendation based upon the one or more incidents so that a user may gain more insight into these one or more incidents includes: utilizing a generative Al model to produce the recommendation based upon the one or more incidents so that the user may gain more insight into these one or more incidents.
The computer-implemented method of claim 1 wherein processing the one or more incidents defined within the one or more data signals to produce a summary of the one or more incidents so that a user may be quickly knowledgeable of these one or more incidents includes: utilizing a generative Al model to produce the summary of the one or more incidents so that the user may be quickly knowledgeable of these one or more incidents.
Regarding Claim 7:
Instant Application ‘632 Claim 7
Co-pending Application ‘613 Claim 7
The computer-implemented method of claim 6 wherein utilizing a generative Al model to produce the recommendation based upon the one or more incidents so that the user may gain more insight into these one or more incidents includes: utilizing prompt engineering and the generative Al model to produce the recommendation based upon the one or more incidents so that the user may gain more insight into these one or more incidents.
The computer-implemented method of claim 6 wherein utilizing a generative Al model to produce the summary of the one or more incidents so that the user may be quickly knowledgeable of these one or more incidents includes: utilizing prompt engineering and the generative Al model to produce the summary of the one or more incidents so that the user may be quickly knowledgeable of these one or more incidents.
Regarding Claim 8:
Instant Application ‘632 Claim 8
Co-pending Application ‘613 Claim 8
The computer-implemented method of claim 1 wherein the plurality of data signals include one or more of: one or more data signals associated with a medical device utilized on a patient within the medical environment; one or more data signals associated with drugs administered to the patient within the medical environment; one or more data signals associated with lab work performed on the patient within the medical environment; one or more data signals associated with clinical assessments performed on the patient within the medical environment; one or more data signals associated with clinical procedures performed on the patient within the medical environment; one or more data signals associated with electronic health records and/or electronic medical records of the patient within the medical environment; and one or more data signals associated with a medical history of the patient within the medical environment.
The computer-implemented method of claim 1 wherein the plurality of data signals include one or more of: one or more data signals associated with a medical device utilized on a patient within the medical environment; one or more data signals associated with drugs administered to the patient within the medical environment; one or more data signals associated with lab work performed on the patient within the medical environment; one or more data signals associated with clinical assessments performed on the patient within the medical environment; one or more data signals associated with clinical procedures performed on the patient within the medical environment; one or more data signals associated with electronic health records and/or electronic medical records of the patient within the medical environment; and one or more data signals associated with a medical history of the patient within the medical environment.
Regarding Claim 9:
Instant Application ‘632 Claim 9
Co-pending Application ‘613 Claim 9
The computer-implemented method of claim 8 wherein the one or more data signals associated with a medical device utilized on a patient within the medical environment concern one or more details of the medical device and/or uses of the medical device.
The computer-implemented method of claim 8 wherein the one or more data signals associated with a medical device utilized on a patient within the medical environment concern one or more details of the medical device and/or uses of the medical device.
Regarding Claim 10:
Instant Application ‘632 Claim 10
Co-pending Application ‘613 Claim 10
The computer-implemented method of claim 8 wherein the medical device includes one or more sub-medical devices.
The computer-implemented method of claim 8 wherein the medical device includes one or more sub-medical devices.
Wherein paras 341-342 and Fig. 18C of both application specifications (‘632 and ‘613) disclose a summary and recommending an alarm limit update; therefore, under the broadest reasonable interpretation of the claims, the limitations of producing a recommendation and producing a summary are substantially similar as applied to claims 1 and 5-7.
Further, “gain more insight into these one or more incidents” and “be quickly knowledgeable of these one or more incidents” are broadly interpreted to disclose substantially similar limitations as applied to claims 1 and 5-7. These limitations are simply disclosing similar benefits of utilizing the same invention.
With regard to claims 11, 13-21, and 23-30 of each application, these claims recite the computer program product residing on a [non-transitory, in ‘632] computer readable medium having a plurality of instructions stored thereon which, when executed by a processor, cause the processor to perform operations (claims 11 and 13-20) and the computing system including a processor and memory configured to perform operations (claims 21 and 23-30) implementing the method of claims 1 and 3-10, and as such are similarly rejected on the ground of nonstatutory double patenting.
Therefore, Claims 1, 3-11, 13-21, and 23-30 of the instant Application 18/798,632 (‘632) are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 3-11, 13-21, and 23-30 of co-pending Application No. 18/798,613 (‘613) in view of Gupta (US PG Pub 2024/0029848 A1) and Rusin (US PG Pub 2015/0137968 A1).
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1, 3-11, 13-21, and 23-30 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claimed invention is directed to an abstract idea without significantly more. Claims 1, 3-11, 13-21, and 23-30 are directed to a system, method, or product which are one of the statutory categories of invention. (Step 1: YES).
Independent Claim 1 discloses a computer-implemented method, executed on a computing device, comprising: monitoring a plurality of data signals associated with a patient within a medical environment; detecting one or more incidents defined within one or more of the data signals, wherein the one or more incidents each include a specific combination of alarms occurring in a temporally proximate fashion; utilizing a generative AI model to generate a predictive report concerning the one or more incidents, including analyzing patterns in a medical history and current health metrics associated with the patient, including iteratively refining a prompt based upon an initial output of the generative AI model; and processing the one or more incidents defined within the one or more data signals to produce a recommendation based upon the one or more incidents so that a user may gain more insight into these one or more incidents, including: providing a historical timeline view of at least a portion of the plurality of signals; receiving a selection of one or more incidents within the historical timeline view; and providing supplemental information regarding the selected one or more incidents including an analysis of the selected one or more incidents, and the recommendation, wherein the recommendation concerns the selected one or more incidents.
Independent Claim 11 discloses a computer program product residing on a non-transitory computer readable medium having a plurality of instructions stored thereon which, when executed by a processor, cause the processor to perform operations comprising: monitoring a plurality of data signals associated with a patient within a medical environment; detecting one or more incidents defined within one or more of the data signals, wherein the one or more incidents each include a specific combination of alarms occurring in a temporally proximate fashion; utilizing a generative AI model to generate a predictive report concerning the one or more incidents, including analyzing patterns in a medical history and current health metrics associated with the patient, including iteratively refining a prompt based upon an initial output of the generative AI model; and processing the one or more incidents defined within the one or more data signals to produce a recommendation based upon the one or more incidents so that a user may gain more insight into these one or more incidents, including: providing a historical timeline view of at least a portion of the plurality of signals; receiving a selection of one or more incidents within the historical timeline view; and providing supplemental information regarding the selected one or more incidents including an analysis of the selected one or more incidents, and the recommendation, wherein the recommendation concerns the selected one or more incidents.
Independent Claim 21 discloses a computing system including a processor and memory configured to perform operations comprising: monitoring a plurality of data signals associated with a patient within a medical environment; detecting one or more incidents defined within one or more of the data signals, wherein the one or more incidents each include a specific combination of alarms occurring in a temporally proximate fashion; utilizing a generative AI model to generate a predictive report concerning the one or more incidents, including analyzing patterns in a medical history and current health metrics associated with the patient, including iteratively refining a prompt based upon an initial output of the generative AI model; and processing the one or more incidents defined within the one or more data signals to produce a recommendation based upon the one or more incidents so that a user may gain more insight into these one or more incidents, including: providing a historical timeline view of at least a portion of the plurality of signals; receiving a selection of one or more incidents within the historical timeline view; and providing supplemental information regarding the selected one or more incidents including an analysis of the selected one or more incidents, and the recommendation, wherein the recommendation concerns the selected one or more incidents.
The examiner is interpreting the above bolded limitations as additional elements, as further discussed below. The remaining un-bolded limitations are merely directed to providing patient recommendations. These steps represent certain methods of organizing human activity because they manage the behavior of an individual by providing rules or instructions in the form of recommendations (MPEP § 2106.04(a)(2), subsection II.C). The steps reciting monitoring the data signals, and any limitations that may be related to the outputting of the recommendations, are considered part of the abstract idea (“We have recognized that ‘information as such is an intangible’ and that collecting, analyzing, and displaying that information, without more, is an abstract idea.” Elec. Power Grp., 830 F.3d at 1353-54; see also id. at 1355 (noting claim requirement of “displaying concurrent visualization” of two or more types of information was insufficient to confer patent eligibility); Interval Licensing LLC v. AOL, Inc., 896 F.3d 1335, 1344 (Fed. Cir. 2018); see MPEP § 2106.04(a)(2)(II)(C)).
Further, the series of steps recited above, given the broadest reasonable interpretation, are merely directed to a mental process because they recite a process that could be practically performed in the human mind (i.e., observations, evaluations, judgments, and/or opinions). In this case, “monitoring a plurality of data signals associated with a patient within a medical environment,” “detecting one or more incidents defined within one or more of the data signals, wherein the one or more incidents each include a specific combination of alarms occurring in a temporally proximate fashion,” “generate a predictive report concerning the one or more incidents, including analyzing patterns in a medical history and current health metrics associated with the patient,” and “processing the one or more incidents defined within the one or more data signals to produce a recommendation based upon the one or more incidents so that a user may gain more insight into these one or more incidents” are reasonably interpreted as at least observations, evaluations, judgments, and/or opinions that could be performed by a human mentally or using pen and paper.
The two abstract ideas are considered together as a single abstract idea for further analysis. (Step 2A- Prong 1: YES. The claims are abstract).
This judicial exception is not integrated into a practical application. Limitations that are not indicative of integration into a practical application include: (1) Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea (MPEP 2106.05.f), (2) Adding insignificant extra-solution activity to the judicial exception (MPEP 2106.05.g), (3) Generally linking the use of the judicial exception to a particular technological environment or field of use (MPEP 2106.05.h).
Independent Claim 1 discloses the following additional elements:
a computer-implemented method, executed on a computing device
utilizing a generative AI model
including iteratively refining a prompt based upon an initial output of the generative AI model
Independent Claim 11 discloses the following additional elements:
a computer program product residing on a non-transitory computer readable medium having a plurality of instructions stored thereon which, when executed by a processor, cause the processor to perform operations
utilizing a generative AI model
including iteratively refining a prompt based upon an initial output of the generative AI model
Independent Claim 21 discloses the following additional elements:
a computing system including a processor and memory configured to perform operations
utilizing a generative AI model
including iteratively refining a prompt based upon an initial output of the generative AI model
In particular, a computer-implemented method, executed on a computing device (of claim 1), the computer program product residing on a non-transitory computer readable medium having a plurality of instructions stored thereon which, when executed by a processor, cause the processor to perform operations (of claim 11), the computing system including a processor and memory configured to perform operations (of claim 21), and the generative AI model and iteratively refining a prompt based upon an initial output of the generative AI model (of claims 1, 11, and 21) are recited at a high level of generality such that they amount to no more than mere instructions to implement an abstract idea by adding the words “apply it” (or an equivalent) with the judicial exception. Mere instructions to apply the abstract idea using a computer are not sufficient to integrate the abstract idea into a practical application or amount to significantly more than the abstract idea (MPEP 2106.05(f)).
The recited computer components (e.g., the processor and memory) are all generically recited components (see specification, par. [0513]-[0515]). Commercially available components, generic computer components, and specially-programmed computer components performing the functions of a generic computer are not considered to amount to significantly more than the abstract idea (MPEP 2106.05(b)).
When considered as a whole, the components do not provide anything that is not present when the component parts are considered individually. Using the broadest reasonable interpretation, the system as a whole is a general purpose computer analyzing patient data and providing recommendations. This is a general purpose computer performing the abstract idea through these generically described devices performing well-understood, routine, and conventional functions of a generic computer (MPEP 2106.05(d)(II)).
Accordingly, claims 1, 11, and 21 are directed to an abstract idea without a practical application. (Step 2A-Prong 2: NO. The additional claimed elements are not integrated into a practical application).
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of the computer-implemented method, executed on a computing device (of claim 1), the computer program product residing on a non-transitory computer readable medium having a plurality of instructions stored thereon which, when executed by a processor, cause the processor to perform operations (of claim 11), the computing system including a processor and memory configured to perform operations (of claim 21), and the generative AI model and iteratively refining a prompt based upon an initial output of the generative AI model (of claims 1, 11, and 21) amount to no more than mere instructions to apply the exception using a generic computer component. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept (“significantly more”). MPEP 2106.05(I)(A) indicates that merely saying “apply it” or an equivalent with the abstract idea cannot provide an inventive concept (“significantly more”). Accordingly, even in combination, these additional elements do not provide significantly more. As such, independent claims 1, 11, and 21 are not patent eligible. (Step 2B: NO. The claims do not provide significantly more).
Dependent claims 3-10, 13-20, and 23-30 are similarly rejected because they either further define/narrow the abstract idea and/or do not further limit the claims to a practical application or provide an inventive concept, and thus the claims are not subject matter eligible even when considered individually or as an ordered combination.
Dependent claims 5-10, 15-20, and 25-30 do further disclose the additional element(s) as follows:
Claims 8-9, 18-19, and 28-29 all recite additional limitations that serve to select by type or source the data to be manipulated, by describing either the specific types of data to be used as part of the patient monitoring or the sources of the data signals to be used as part of the patient monitoring. Selecting by type or source the data to be manipulated is considered an insignificant extra-solution activity that is not sufficient to integrate the abstract idea into a practical application or amount to significantly more than the abstract idea (MPEP 2106.05(g)).
Claims 5-7, 10, 15-17, 20, 25-27, and 30 recite limitations that amount to mere instructions to apply the abstract idea because they simply describe the types of devices and/or computer programs used to perform the identified abstract idea (utilizing massive data sets processed by ML, utilizing a generative AI model, utilizing prompt engineering and the generative AI model, and sub-medical devices). Mere instructions to apply the abstract idea are not considered sufficient to integrate the abstract idea into a practical application or amount to significantly more than the abstract idea (MPEP 2106.05(f)).
Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application or amount to significantly more than the abstract idea.
Therefore, the dependent claims are also directed to an abstract idea.
Thus, Claims 1, 3-11, 13-21, and 23-30 are rejected under 35 U.S.C. 101 as being directed to non-statutory subject matter.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1, 3-11, 13-21, and 23-30 are rejected under 35 U.S.C. 103 as being unpatentable over Katra (US PG Pub. 2020/0357513) in view of Gupta (US PG Pub. 2024/0029848), further in view of Rusin (US PG Pub. 2015/0137968).
Regarding Claim 1, Katra discloses:
A computer-implemented method, executed on a computing device, comprising: (Para 44 discloses that this disclosure presents systems and methods for remote post-IMD monitoring for implantation site infections.)
monitoring a plurality of data signals associated with a patient within a medical environment (Para 48 discloses the computing device executing the app (e.g., a virtual check-in process) may perform various functionalities described below, whether via local computing resources provided by the computing device, via cloud-based backend systems, or both. In some examples, the computing device may implement the app via a web browser. In some examples, the computing device may perform a device check. In such examples, the computing device may implement one or more interrogations of one or more medical devices (e.g., IMDs, CIEDs, etc.). In addition, the computing device may analyze medical device settings, parameters, and performance metrics. Para 84 discloses medical device(s) 6 may include wearable devices (e.g., smart watches, headsets, etc.) configured to obtain physiological data (e.g., activity data, heart rate, etc.) and transfer such data to computing device(s) 2, network 10, edge device(s) 12, etc. for subsequent utilization, in accordance with one or more of the various techniques of this disclosure.)
detecting one or more incidents defined within one or more of the data signals, (Para 81 discloses medical device(s) 6 may include diagnostic medical devices. In an example, medical device(s) 6 may include a device that predicts heart failure events or that detects worsening heart failure of patient 4. In a non-limiting and illustrative example, system 100 may be configured to measure impedance fluctuations of patient 4 and process impedance data to accumulate evidence of worsening heart failure. In any case, medical device(s) 6 may be configured to determine a health status relating to patient 4. Medical device(s) 6 may transmit the diagnostic data or health status to computing device(s) 2 as interrogation data, such that computing device(s) 2 may correlate the interrogation data with image data to determine whether an abnormality present with a particular one of medical device(s) 6 (e.g., an IMD) or patient 4 (e.g., infection at an implantation site).)
wherein the one or more incidents each include a specific combination of alarms occurring in a temporally proximate fashion; (Para 115 discloses In an illustrative example, computing device(s) 2 may determine a collective abnormality based on an identified potential abnormality at the implantation site and based on an abnormality of a physiological parameter (e.g., an ECG abnormality). In some instances, computing device(s) 2 may nevertheless determine the presence of a potentially health-threatening abnormality when no abnormality is identified in one subsession (e.g., an implantation site abnormality) but that an abnormality is identified in another subsession (e.g., an IMD abnormality), such that an analysis of the at least two subsession indicates that there is or is not an abnormality that warrants a follow-up appointment advisement [detecting incident]. In addition, computing device(s) 2 may provide different audible notifications following each subsession [specific combination of alarms] and then again, following the completion of all proscribed subsessions (e.g., two subsessions, three subsessions, etc.). Para 298 discloses processing circuitry 20 may provide an alert, such as a text- or graphics-based notification, a visual notification, etc. In some examples, processing circuitry 20 may cause an audible alarm to sound or cause a tactile alarm, alerting patient 4 of a determined abnormality. In other examples, computing device(s) 2 may provide a visual light indication, such as emitting a red light for high severity or a yellow light for medium severity. The alert may indicate a potential, possible or predicted abnormality event (e.g., a potential infection).)
processing the one or more incidents defined within the one or more data signals to produce a recommendation based upon the one or more incidents so that a user may gain more insight into these one or more incidents (Para 151 discloses computing device 2 may determine, from physiological parameters obtained via a second subsession, an ECG change that indicates device migration and as such, increases the likelihood that a potential abnormality is being detected from the image data. In such instances, computing device 2 may analyze the image data using a bias toward detecting an abnormality or may include, in a post-implant report, a heightened likelihood (e.g., probability, confidence interval) that is based on the likelihood of a potential abnormality determined from the first and second set data items. Para 196 discloses in an illustrative example, when processing circuitry 20 determines that any one or any combination of subsessions yielded an abnormal result (or abnormal outside an acceptable margin of normal), processing circuitry 20 may use the mobile device app to output a prompt to patient 4. In some examples, the prompt may indicate an abnormality. In other examples, the prompt may include a recommendation or instruction to schedule a follow-up visit with the HCP. Para 298 discloses processing circuitry, e.g., processing circuitry 20 of computing device(s) 2, processing circuitry 64 of edge device(s) 12, processing circuitry 98 of server(s) 94, or processing circuitry 40 of medical device(s) 17, may determine instructions for medical intervention based on the health condition status of patient 4 (2304). For example, where processing circuitry 20 determines the presence of an abnormality at the implantation site of IMD(s) 6, processing circuitry 20 may determine instructions for medical intervention based on the abnormality. 
In another example, where processing circuitry 20 determines the presence of an abnormality at the implantation site of IMD(s) 6 and an abnormality in a physiological parameter of patient 4, processing circuitry 20 may determine instructions for medical intervention based on a post-implant report that processing circuitry 20 may generate based at least in part on the multiple abnormalities. In some examples, processing circuitry 20 may determine different instructions for different severity levels or abnormality categories.)
While Katra discloses the above limitations, it does not fully disclose the following limitations that Gupta discloses:
utilizing a generative AI model to generate a predictive report concerning the one or more incidents, including analyzing patterns in a medical history and current health metrics associated with the patient (Para 203 and FIG. 5 disclose a system for simulating healthcare journey of one or more patients. The system 500 comprises a database 502 and a processor 504 communicably coupled with a communication network. Para 204 discloses the system 500 comprises the database 502 configured to store patient data related to the one or more patients. The system 500 comprises the processor 504 configured to receive a patient data wherein the patient data is accessed from the database 502 and/or from an external source. The external source may include the one or more patients, wherein the external source provides the patient data via a patient input using a user-interface. The processor 504 is further configured to create a simulation model of the one or more patients, using the received patient data, and employing a machine learning. The simulation model is then executed by the processor to predict one more health variables. In response to the one or more health variables, one or more treatment variables are generated. The processor 504 provides the predicted one or more health variables, the generated one or more treatment variables and one or more clinician inputs to the simulation model for continuous learning of the simulation model. Furthermore, a final outcome including patient's healthcare journey and patient's disease diagnosis and treatment is provided to the one or more patient by the simulation model. Para 224 discloses the system employs Generative Large-Language-Models (G-LLMs) for generating responses on the queries received from the clinicians. The Generative Large-Language-Models (G-LLMs) are deep learning algorithms trained on a large amount of data. 
The Generative Large-Language-Models (G-LLMs) assist the clinician at each step of patient journey and generate the summaries of patient journey. In this regard, the patient journey comprises some steps that are normally followed by the clinicians in evaluating the patients such as patient history evaluation, conducting physical examination, differential diagnosis, targeted clinical investigations, diagnosis, evaluation of diagnosis and then define a plan and monitoring the patient. For example, the Generative Large-Language-Models (G-LLMs) assist the clinicians in identifying the clinical features related to a target disease, in diagnostic evaluation or in generating a patient report based on test reports.)
including iteratively refining a prompt based upon an initial output of the generative AI model (This step only requires repeating the steps previously recited. Therefore, it is a duplication of the steps of providing input to generate the reports. This is akin to the duplication of parts, which says that “mere duplication of parts has no patentable significance unless a new and unexpected result is produced.” (MPEP 2144.VI.B). Because the purpose of iterating the steps is to “refin[e]” the results by tweaking the inputs, the duplication of this step does not produce “a new and unexpected result”. Therefore, this limitation will be given no patentable significance in accordance with the MPEP.)
It would have been obvious to one having ordinary skill in the art before the effective filing date of this application to add to the system of Katra the ability to utilize a generative AI model to generate a predictive report concerning the one or more incidents, including analyzing patterns in a medical history and current health metrics associated with the patient, as taught by Gupta, because the algorithm used by Gupta “allows the processing arrangement to become more accurate in generating the document, without being explicitly programmed.” (Gupta, par. [0225]).
While the combination of Katra and Gupta discloses the above limitations, it does not fully disclose the following limitation that Rusin discloses:
providing a historical timeline view of at least a portion of the plurality of signals; (Para 106-107 and FIGS. 8 and 9 disclose example screenshots illustrating patient alarm data monitoring in the system 100 according to one embodiment. In this example, a clinician assigns a patient alarm data monitor to the patient. The system 100 then displays a screen as illustrated in FIG. 8… Line 810 contains patient and bed information such as the patient's name, unit and bed number, patient identification number, and date of birth, in this example “Bed: PICU-RM012-012” and “MRN 5458246623.” Area 820 is a patient's alarm summary indicating the period from admission to the present.)
receiving a selection of one or more incidents within the historical timeline view; and providing supplemental information regarding the selected one or more incidents including an analysis of the selected one or more incidents, and the recommendation, wherein the recommendation concerns the selected one or more incidents. (Paras 108-109 disclose when the clinician selects an alarm, for example, “SpO2 LOW,” the system 100 displays a recommendation screen illustrated in FIG. 9. Area 910 is the Standard Recommended for Unit with the recommended Limit set by Hospital policy for typical patients in that unit… Area 920 displays “Observed Vitals for Unit Population,” which is a histogram of this unit's typical patient alarm values for a sample of the past 25 hours, excluding the last hour. The first percentile value, the 50th percentile value, and the 99th percentile value are marked in the histogram. In this example the typical patient in this unit had an SpO2 Low alarm value at or below 83 in only 1% of the sample, a value of 100 at or below in 50% of the sample (the mean), and a value at or below 103 in 99% of the sample [analysis of the one or more incidents]. Area 930 displays “Patient-specific Recommendations” as a chart displaying a calculated matrix showing: for each threshold value the % of time in alarm (for the most recent 25 hours, excluding the last hour) and an estimated % reductions possible, if the alarm limits are set at that threshold. [wherein para 341-342 of the Applicant’s specification discloses the recommendation can be an alarm limit update].)
It would have been obvious to one having ordinary skill in the art before the effective filing date of this application to modify the combination of the system of Katra and the ability to utilize a generative AI model to generate a predictive report concerning the one or more incidents, as taught by Gupta, with the alarm management system as taught by Rusin, in order to visualize an alarm summary of a specific patient and to recommend an alarm limit to avoid non-medically actionable alarms.
Regarding Claim 3, this claim recites the limitations of Claim 1 and as to those limitations is rejected for the same basis and reasons as disclosed above. The combination of Katra, Gupta, and Rusin discloses the following limitations that Katra further discloses:
The computer-implemented method of claim [[2]] 1, wherein detecting one or more incidents defined within one or more of the data signals includes: monitoring the data signals associated with a medical device utilized on a patient within the medical environment to detect the occurrence of the one or more alarms (Para 298 discloses processing circuitry, e.g., processing circuitry 20 of computing device(s) 2, processing circuitry 64 of edge device(s) 12, processing circuitry 98 of server(s) 94, or processing circuitry 40 of medical device(s) 17, may determine instructions for medical intervention based on the health condition status of patient 4 (2304). For example, where processing circuitry 20 determines the presence of an abnormality at the implantation site of IMD(s) 6, processing circuitry 20 may determine instructions for medical intervention based on the abnormality. In another example, where processing circuitry 20 determines the presence of an abnormality at the implantation site of IMD(s) 6 and an abnormality in a physiological parameter of patient 4, processing circuitry 20 may determine instructions for medical intervention based on a post-implant report that processing circuitry 20 may generate based at least in part on the multiple abnormalities. In some examples, processing circuitry 20 may determine different instructions for different severity levels or abnormality categories. For example, processing circuitry 20 may determine a first set of instructions for one abnormality that processing circuitry 20 determines is likely less severe than another abnormality. In some examples, processing circuitry 20 may not determine intervention instructions where processing circuitry 20 determines that the abnormality level does not satisfy a predefined threshold. In some examples, processing circuitry 20 may provide an alert, such as a text- or graphics-based notification, a visual notification, etc. 
In some examples, processing circuitry 20 may cause an audible alarm to sound or cause a tactile alarm, alerting patient 4 of a determined abnormality. In other examples, computing device(s) 2 may provide a visual light indication, such as emitting a red light for high severity or a yellow light for medium severity. The alert may indicate a potential, possible or predicted abnormality event (e.g., a potential infection).)
Regarding Claim 4, this claim recites the limitations of Claim 1 and as to those limitations is rejected for the same basis and reasons as disclosed above. The combination of Katra, Gupta, and Rusin discloses the following limitations that Katra further discloses:
The computer-implemented method of claim 1 wherein the one or more incidents define an event (Para 103 discloses a trained ML model 30 and/or AI engine 28 may be configured to process and analyze the user input (e.g., images of the implantation site, patient status data, etc.), device parameters (e.g., accelerometer data), historical data of medical device (e.g., medical device 6), and/or physiological parameters, in accordance with certain examples of this disclosure where ML models are considered advantageous (e.g., predictive modeling, inference detection, contextual matching, natural language processing, etc.)… In another example, these models and engines may be trained to synthesize data in order to identify abnormalities of patient 4 or medical device(s) 17 and to identify abnormalities of patient 4 or medical device(s) 17 from individual data items. Para 180 discloses computing device 502 may identify the follow-up schedule by receiving a physiological parameter that indicates an abnormality (e.g., an ECG abnormality) and determining a first time period for the follow-up schedule based on the physiological parameter abnormality. That is, computing device(s) 2 may determine a triggering event for identifying a follow-up schedule that includes a trigger based on an amount of time that has passed, a particular signal received from one of medical device(s) 17 (e.g., IMD 6), such as an activity level, an ECG, etc., or based on a trigger received via network 10 (e.g., a computing device 2 of an HCP).)
Regarding Claim 5, this claim recites the limitations of Claim 1 and as to those limitations is rejected for the same basis and reasons as disclosed above. The combination of Katra, Gupta, and Rusin discloses the following limitations that Katra further discloses:
The computer-implemented method of claim 1 wherein processing the one or more incidents defined within the one or more data signals to produce a recommendation based upon the one or more incidents so that a user may gain insight into these one or more incidents includes: utilizing massive data sets processed by ML to produce the recommendation based upon the one or more incidents so that the user may gain more insight into these one or more incidents (Para 103 discloses a trained ML model 30 and/or AI engine 28 may be configured to process and analyze the user input (e.g., images of the implantation site, patient status data, etc.), device parameters (e.g., accelerometer data), historical data of medical device (e.g., medical device 6), and/or physiological parameters, in accordance with certain examples of this disclosure where ML models are considered advantageous (e.g., predictive modeling, inference detection, contextual matching, natural language processing, etc.). Examples of ML models and/or AI engines that may be so configured to perform aspects of this disclosure include classifiers and non-classification ML models, artificial neural networks (“NNs”), linear regression models, logistic regression models, decision trees, support vector machines (“SVM”), Naïve or a non-Naïve Bayes network, k-nearest neighbors (“KNN”) models, deep learning (DL) models, k-means models, clustering models, random forest models, or any combination thereof. Depending on the implementation, the ML models may be supervised, unsupervised or in some instances, a hybrid combination (e.g., semi-supervised). These models may be trained based on data indicating how users (e.g., patient 4) interact with computing device(s) 2. For example, certain aspects of the disclosure will be described using events or behaviors (such as clicking, viewing, or watching) with respect to items (e.g., wound images, cameras, videos, physiological parameters, etc.), for purposes of illustration only. 
In another example, these models and engines may be trained to synthesize data in order to identify abnormalities of patient 4 or medical device(s) 17 and to identify abnormalities of patient 4 or medical device(s) 17 from individual data items.)
Regarding Claim 6, this claim recites the limitations of Claim 1 and as to those limitations is rejected for the same basis and reasons as disclosed above. The combination of Katra, Gupta, and Rusin discloses the following limitations that Katra further discloses:
The computer-implemented method of claim 1 wherein the processing the one or more incidents defined within the one or more data signals to produce a recommendation based upon the one or more incidents so that a user may gain insight into these one or more incidents includes: utilizing a generative AI model to produce the recommendation based upon the one or more incidents so that the user may gain more insight into these one or more incidents (Para 103 discloses the possible AI models that can be used. Para 104 discloses in some examples, processing circuitry 40 may use ML algorithms (e.g., DL algorithms) to, for example, monitor a progression of a wound that is healing or predict that a potential infection is occurring, for example, with respect to an implant site of one of medical device(s) 17. In an illustrative and non-limiting example, AI engine(s) 28 and/or ML model(s) 30 may utilize a deep-neural network to localize an implantation site in an image and classify an abnormality status. In another example, AI engine(s) 28 and/or ML model(s) 30 may utilize Naïve Bayes and/or decision trees to synthesize (e.g., combine) data items and the analysis thereof (e.g., image analysis and ECG analysis) in order to obtain a comprehensive abnormality determination for patient 4 and include such comprehensive determinations in a report, such as for patient 4. [This shows that the AI models can include the determined information in a report.])
Regarding Claim 7, this claim recites the limitations of Claim 6 and as to those limitations is rejected for the same basis and reasons as disclosed above. The combination of Katra, Gupta, and Rusin discloses the following limitations that Katra further discloses:
The computer-implemented method of claim 6, wherein the processing the one or more incidents defined within the one or more data signals to produce a recommendation based upon the one or more incidents so that a user may gain insight into these one or more incidents includes: utilizing prompt engineering and the generative AI model to produce the recommendation based upon the one or more incidents so that the user may gain more insight into these one or more incidents (Para 103-104 describe the AI models that can be used. Para 104 discloses patient 4 may exhibit difficulty with capturing images of the implantation site from various angles. This helpful data may be shared across a health monitoring or computing network so that optimal results may be presented to more than one user based on similar queries and user reactions to those queries. Para 173 discloses processing circuitry 20 may receive user input, via UI 22, indicating a patient name of patient 4. Processing circuitry 20 may query the database and based on a result of the query, identify patient 4 as a known patient of system 100 (e.g., system 300). Para 174 discloses processing circuitry 20 may reference patient data (e.g., patient identifiers) such that the same computing device(s) 2 (or the same algorithm base) can be shared across multiple patients (e.g., in a clinic). As described herein, computing device(s) 2 may adjust, based on patient data, the base of the site-check algorithm in order to tailor the process and UI visualizations in order to accommodate each respective patient. In another example, a common site-check algorithm may be deployed to accommodate all patients of a certain class (e.g., nursing home patients, patients of a particular nursing home, etc.). 
In this way, the site-check algorithms may maintain and provide a particular level of uniformity for the various users of the interactive session, where those users may be part of a common class. [These show that the system has the ability to utilize queries to the system to tailor the data that is analyzed and output to the user.])
Regarding Claim 8, this claim recites the limitations of Claim 1 and as to those limitations is rejected for the same basis and reasons as disclosed above. The combination of Katra, Gupta, and Rusin discloses the following limitations that Katra further discloses:
The computer-implemented method of claim 1 wherein the plurality of data signals include one or more of: one or more data signals associated with a medical device utilized on a patient within the medical environment (Para 48 discloses the computing device may implement one or more interrogations of one or more medical devices (e.g., IMDs, CIEDs, etc.). In addition, the computing device may analyze medical device settings, parameters, and performance metrics.)
One or more data signals associated with drugs administered to the patient within the medical environment (Para 209 and FIG. 10 discloses a non-limiting example of a patient status questionnaire that either the backend system or the mobile device application of this disclosure may generate. The patient status questionnaire may be used to gauge information on a general health picture of patient 4, on implant-recovery-specific symptoms, medications that the patient has taken recently or will take soon, etc.)
One or more data signals associated with clinical assessments performed on the patient within the medical environment (Para 177 discloses IMD information may include historical data relating to the implantation site of the IMD and/or historical data relating to the IMD. In some examples, the historical data may include images of the implantation site following the implantation procedure, wound characteristics, shape and size of the implantation site (e.g., the wound size), incision information, history of any complications during surgery, date of the implant, etc. In some examples, IMD information may further include IMD type information, IMD communication protocol information, an estimated orientation of medical device(s) 17 (e.g., IMD 6), data regarding one or more HCPs (e.g., surgeons, clinicians) responsible for implanting (e.g., inserting) medical device(s) 17 (e.g., IMD 6), information relating to one or more methods employed by the one or more HCPs when sealing the implantation site, images of the implantation site over time, etc.)
One or more data signals associated with clinical procedures performed on the patient within the medical environment (Para 177 discloses IMD information may include historical data relating to the implantation site of the IMD and/or historical data relating to the IMD. In some examples, the historical data may include images of the implantation site following the implantation procedure, wound characteristics, shape and size of the implantation site (e.g., the wound size), incision information, history of any complications during surgery, date of the implant, etc. In some examples, IMD information may further include IMD type information, IMD communication protocol information, an estimated orientation of medical device(s) 17 (e.g., IMD 6), data regarding one or more HCPs (e.g., surgeons, clinicians) responsible for implanting (e.g., inserting) medical device(s) 17 (e.g., IMD 6), information relating to one or more methods employed by the one or more HCPs when sealing the implantation site, images of the implantation site over time, etc.)
One or more data signals associated with electronic health records and/or electronic medical records of the patient within the medical environment (Para 94 discloses system 100 may include one or more databases (e.g., storage device 96) that store various medical data records, cohort data, and image data. In such examples, server(s) 94 (e.g., one or more databases) may be managed or controlled by one or more separate entities (e.g., internet service providers (ISPs), etc.).)
One or more data signals associated with a medical history of the patient within the medical environment (Para 186 discloses computing device 502 may generate historical reports that include an aggregation of any one or more of the reports, such that in response to detecting a selection of an interactive-session tracker tile or other tracker tile, computing device 502 may retrieve historical reports and compile and/or summarize reports from the past in order to produce a single post-implant historical report for export and/or display (e.g., via a pop-up interface).)
Regarding Claim 9, this claim recites the limitations of Claim 8 and, as to those limitations, is rejected on the same basis and for the same reasons as set forth above. The combination of Katra, Gupta, and Rusin discloses the following limitations, with Katra further disclosing:
The computer-implemented method of claim 8, wherein the one or more data signals associated with a medical device utilized on a patient within the medical environment concern one or more details of the medical device and/or uses of the medical device (Para 48 discloses the computing device may implement one or more interrogations of one or more medical devices (e.g., IMDs, CIEDs, etc.). In addition, the computing device may analyze medical device settings, parameters, and performance metrics.)
Regarding Claim 10, this claim recites the limitations of Claim 8 and, as to those limitations, is rejected on the same basis and for the same reasons as set forth above. The combination of Katra, Gupta, and Rusin discloses the following limitations, with Katra further disclosing:
The computer-implemented method of claim 8, wherein the medical device includes one or more sub-medical devices (Para 142 discloses one or more of electrodes 16 may be coupled to at least one lead. In some examples, medical device(s) 17 may employ electrodes 16 in order to provide sensing and/or pacing functionalities. The configurations of electrodes 16 may be unipolar or bipolar. Sensing circuitry 52 may be selectively coupled to electrodes 16 via switching circuitry 58, e.g., to select the electrodes 16 and polarity, referred to as the sensing vector, used to sense impedance and/or cardiac signals, as controlled by processing circuitry 40. Sensing circuitry 52 may sense signals from electrodes 16, e.g., to produce a cardiac EGM or subcutaneous ECG, in order to facilitate monitoring the post-implant status of IMD 6. Sensing circuitry 52 also may monitor signals from sensors 54, which may include one or more accelerometers, pressure sensors, temperature sensors, and/or optical sensors, as examples. In some examples, sensing circuitry 52 may include one or more filters and amplifiers for filtering and amplifying signals received from electrodes 16 and/or sensors 54.)
As to claims 11 and 13-20, the claims are directed to the computer program product residing on a non-transitory computer readable medium having a plurality of instructions stored thereon which, when executed by a processor, cause the processor to perform operations implementing the method of claims 1 and 3-7 and further recite a computer program product, a computer readable medium having a plurality of instructions, and a processor (e.g., see Katra Para 156 teaching storage device 50 includes computer-readable instructions that, when executed by processing circuitry 40, cause medical device(s) 17, including processing circuitry 40, to perform various functions attributed to medical device(s) 17 and processing circuitry 40 herein) and are similarly rejected.
As to claims 21 and 23-30, the claims are directed to the computer system implementing the method of claims 1 and 3-7 and further recite a computing system including a processor and memory configured to perform operations (e.g., see Katra Para 15) and are similarly rejected.
Response to Arguments
Applicant's arguments filed 09/03/2025 with respect to 35 U.S.C. § 101 have been fully considered but are not persuasive. Applicant makes mere assertions regarding the patent eligibility of the claimed invention without rebutting the specifics of the rejection or pointing to specific limitations of the claimed invention that would render it eligible.
Therefore, the arguments are not persuasive, and the 35 U.S.C. § 101 rejections, based on the claimed invention being directed to a judicial exception, are maintained.
Applicant’s arguments filed 09/03/2025 with respect to 35 U.S.C. § 103 have been fully
considered and are persuasive regarding the newly added limitations (regarding the historical timeline view). Therefore, the previous 35 U.S.C. § 103 rejection has been withdrawn. However, upon further consideration, a new ground of rejection under 35 U.S.C. § 103, necessitated by Applicant's amendments, has been set forth above.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SARA J MORICE DE VARGAS whose telephone number is (703)756-4608. The examiner can normally be reached Monday through Friday, 8:30 am to 5:30 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Peter H. Choi can be reached at (469)295-9171. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SARA JESSICA MORICE DE VARGAS/Examiner, Art Unit 3681
/PETER H CHOI/Supervisory Patent Examiner, Art Unit 3681