DETAILED ACTION
Response to Amendment
This action is in response to the amendment filed on February 2, 2026. Claims 1-2, 6, 11-12, 16, 21-22, and 26 have been amended. Claims 1-30 have been examined and are currently pending.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Inventorship
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Information Disclosure Statement
The Information Disclosure Statements filed on November 24, 2025 and February 2, 2026 have been considered. An initialed copy of the Form 1449 is enclosed herewith.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-30 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter.
ALICE/MAYO: TWO-PART ANALYSIS
2A. First, a determination whether the claim is directed to a judicial exception (i.e., abstract idea).
Prong 1: A determination whether the claim recites a judicial exception (i.e., abstract idea).
Groupings of abstract ideas enumerated in the 2019 Revised Patent Subject Matter Eligibility Guidance.
Mathematical concepts: mathematical relationships, mathematical formulas or equations, mathematical calculations.
Certain methods of organizing human activity: fundamental economic principles or practices (including hedging, insurance, mitigating risk); commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations); managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions).
Mental processes: concepts performed in the human mind (including an observation, evaluation, judgment, opinion).
Prong 2: A determination whether the judicial exception (i.e., abstract idea) is integrated into a practical application.
Considerations indicative of integration into a practical application enumerated in the 2019 Revised Patent Subject Matter Eligibility Guidance.
Improvement to the functioning of a computer, or an improvement to any other technology or technical field
Applying or using a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition
Applying the judicial exception with, or by use of a particular machine.
Effecting a transformation or reduction of a particular article to a different state or thing
Applying or using the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception
Considerations that are not indicative of integration into a practical application enumerated in the 2019 Revised Patent Subject Matter Eligibility Guidance.
Merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea.
Adding insignificant extra-solution activity to the judicial exception.
Generally linking the use of the judicial exception to a particular technological environment or field of use.
2B. Second, a determination whether the claim provides an inventive concept (i.e., whether the claim(s) include additional elements, or combinations of elements, that are sufficient to amount to significantly more than the judicial exception (i.e., abstract idea)).
Considerations indicative of an inventive concept (aka “significantly more”) enumerated in the 2019 Revised Patent Subject Matter Eligibility Guidance.
Improvement to the functioning of a computer, or an improvement to any other technology or technical field
Applying the judicial exception with, or by use of a particular machine.
Effecting a transformation or reduction of a particular article to a different state or thing
Applying or using the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception. NOTE: This is the only consideration that does not overlap with the considerations indicative of integration into a practical application associated with Step 2A, Prong 2.
Considerations that are not indicative of an inventive concept (aka “significantly more”) enumerated in the 2019 Revised Patent Subject Matter Eligibility Guidance.
Merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea.
Adding insignificant extra-solution activity to the judicial exception.
Generally linking the use of the judicial exception to a particular technological environment or field of use.
Simply appending well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception. NOTE: This is the only consideration that does not overlap with the considerations that are not indicative of integration into a practical application associated with Step 2A, Prong 2.
See also 2019 Revised Patent Subject Matter Eligibility Guidance, Federal Register, Vol. 84, No. 4 (January 7, 2019).
Claims 1-30 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
1: Statutory Category
Applicant’s claimed invention falls within a statutory category: independent claim 1 is directed to a method, independent claim 11 is directed to a computer program product, and independent claim 21 is directed to a system.
2(A): The claim(s) are directed to a judicial exception (i.e., an abstract idea).
PRONG 1: The claim(s) recite a judicial exception (i.e., an abstract idea).
Mental Processes
Independent claims 1, 11, and 21 recite the limitations “receiving data, thus defining received data, wherein the data includes, at least in part, physiological alarm data, and wherein receiving the data includes parsing real-time alarm messages and translating information from the physiological alarm data into a structured format; deidentifying the received data to generate deidentified data; receiving a request from a requester for at least a portion of the deidentified data, thus defining requested data; reidentifying the requested data to generate reidentified data when the requester has privilege to receive a reidentified version of the requested data; and providing the reidentified data to the requester,” which are directed to the abstract idea of mental processes. Specifically, the limitations recited above consist of steps that can be performed in the human mind by observation, evaluation, and judgment. In particular, the steps of receiving data, parsing data, translating data into a structured format, deidentifying, and reidentifying require observation, evaluation, and judgment of data by a user or human to perform an activity such as protecting a patient’s health data/privacy.
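For illustration only, and not as part of the claim mapping or the record, the deidentify/request/reidentify flow recited above can be sketched in Python. All names, the token scheme, and the privilege check are hypothetical and are not drawn from the application or the cited references:

```python
# Hypothetical sketch of a deidentify -> request -> reidentify flow.
# The token-to-identifier map stands in for whatever linkage a trusted
# system would actually maintain; nothing here reflects claim scope.

ID_MAP = {}  # token -> original identifier, held only by the trusted system


def deidentify(record):
    """Strip the direct identifier and substitute a pseudonym token."""
    token = f"T{len(ID_MAP):06d}"
    ID_MAP[token] = record["patient_id"]
    deid = {k: v for k, v in record.items() if k != "patient_id"}
    deid["token"] = token
    return deid


def reidentify(requested, privileged):
    """Restore the identifier only when the requester has privilege."""
    if not privileged:
        return requested  # unprivileged requesters receive deidentified data
    reid = dict(requested)
    reid["patient_id"] = ID_MAP[reid.pop("token")]
    return reid
```

The sketch makes the mental-process characterization concrete: each step (substituting a token, checking a privilege flag, restoring an identifier) is a simple observation-and-judgment operation.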
Additionally, “integration into a practical application” under Step 2A requires one of the following:
• Improvement to the functioning of a computer, or an improvement to any other technology or technical field
• Applying or using a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition
• Applying the judicial exception with, or by use of a particular machine.
• Effecting a transformation or reduction of a particular article to a different state or thing
• Applying or using the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception
The applicant has not shown or demonstrated any of the considerations described above under "integration into a practical application" under Step 2A. Specifically, the applicant's limitations are not "integrated into a practical application" because they merely recite the words "apply it" (or an equivalent) with the judicial exception, or amount to mere instructions to implement the abstract idea using a computer as a tool (see MPEP 2106.05(f)). Additionally, an improvement to the functioning of a computer or to any other technology or technical field has not been shown or disclosed (see MPEP 2106.05(a)).
Under Step 2B, the claims do not include additional elements, individually or in combination, that are sufficient to amount to significantly more than the judicial exception. Specifically, the applicant’s limitations are not “significantly more” because they merely recite the words “apply it” with the judicial exception, use a computer as a tool to perform the abstract idea (see MPEP 2106.05(f)), and add insignificant extra-solution activity to the judicial exception (see MPEP 2106.05(g)). The individual elements of a computing device, processor, and memory add no more than implementing the idea with a generic computerized system, and the additional elements taken in combination add nothing more than what is present when the elements are considered individually. Therefore, based on the two-part Alice Corp. analysis, there are no meaningful limitations in the claims that transform the exception (i.e., the abstract idea) into a patent-eligible application.
Dependent claims 2-10, 12-20, and 22-30 are rejected as ineligible subject matter under 35 U.S.C. 101 based on a rationale similar to the claims from which they depend.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-30 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
The term “real time” in claims 1, 11, and 21 is a relative term which renders the claim indefinite. The term “real time” is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-30 are rejected under 35 U.S.C. 103 as being unpatentable over Yu (US Publication 2011/0112862 A1) in view of Tonna et al. (US Publication 2017/0229006 A1), and further in view of Chandrasekaran et al. (US Patent 11,366,927 B1).
Claims 1 and 11:
As per claims 1 and 11, Yu teaches a method and computer program product comprising:
receiving a request from a requester for at least a portion of the deidentified data, thus defining requested data (paragraph 0055 “The de-identified database system is generally accessed by a physician or other care provider logging onto the PHR website, and validating himself by inputting the registered ID and password. Using the advanced search engine 246, the physician can request the health records of specific patients or conditions. The ABID management process 244 creates all ABID indexed de-identified health accounts that match the search criteria. The ABID file is transmitted to the physician computer, which then calls a web service to get the III-XML file for the defined target population, which is then transmitted to the physician computer 204.”);
reidentifying the requested data to generate reidentified data when the requester has privilege to receive a reidentified version of the requested data (paragraphs 0031 and 0051-0053 “The re-identified patient data can be provided in response to several different use scenarios related to requests for information about a patient by a physician, care provider, lab, insurance company, and so on. Among the most common use cases is where a physician pulls up a patient's records in order to provide care requested by the patient, such as in a diagnostic appointment, emergency, or regular doctor visit. The authorized physician logs onto the PHR website and is validated as an authorized party to receive the re-identified patient data.”);
and providing the reidentified data to the requester (paragraphs 0031 and 0052-0053 “The re-identified patient data can be provided in response to several different use scenarios related to requests for information about a patient by a physician, care provider, lab, insurance company, and so on. Among the most common use cases is where a physician pulls up a patient's records in order to provide care requested by the patient, such as in a diagnostic appointment, emergency, or regular doctor visit. The authorized physician logs onto the PHR website and is validated as an authorized party to receive the re-identified patient data.”).
Yu does not explicitly teach receiving data, thus defining received data, wherein the data includes, at least in part, physiological alarm data, and wherein receiving the data includes parsing real-time alarm messages and translating information from the physiological alarm data into a structured format. However, Tonna teaches Systems and Methods for Managing Patient Devices and further teaches, “…The PD integration module 120 may be further configured to monitor for detection of alarm conditions at the patient device(s) 170. The patient device 170 may be located within a particular patient area 180 (e.g., a room, an intensive care unit (ICU), a recovery room, and/or the like), and may be used to provide healthcare-related services to a patient 182…” (paragraph 0040), “…As used herein, an “alarm notification” refers to electronic data comprising information pertaining to an alarm condition detected at a patient device 170. An alarm notification 132 may comprise electronic data embodied on a machine-readable memory, communication network, and/or non-transitory storage system, such as the storage resources 115 of the computing device 111 and/or a storage system 160. As disclosed in further detail herein, an alarm notification 132 may comprise any suitable information pertaining to an alarm condition of a patient device 170…” (paragraph 0041), “The processing module 332 may be configured to process state data 172A-N acquired from the patient devices 170A-N by the PD integration module 120. The processing module 332 may be configured to, inter alia, convert, normalize, and/or translate the acquired data into a format usable by other components of the PD management system 110. 
The processing module 332 may be further configured to parse the data acquired from the patient devices 170A-N in order to, inter alia, identify alarm conditions detected at the patient devices 170A-N, determine information pertaining to detected alarm conditions (e.g., extract and/or parse state data 172A-N of the patient devices 170A-N), and so on.” (paragraph 0067), “The evaluation module 334 may be configured to generate alarm notifications 132 in response to detection of alarm conditions at the patient devices 170A-N. Generating an alarm notification 132 may comprise generating electronic data that, inter alia, describes the alarm condition, identifies the patient device 170A-N corresponding to the alarm condition, identifies the patient area 180 and/or patient 182 corresponding to the alarm condition, assigns a criticality level to the alarm condition, assigns a response type to the alarm condition, and so on…” (paragraph 0068), and “…The evaluation module 334 may be configured to determine information identifying the condition and/or event that triggered detection of the alarm condition at the patient device 170A-N, which may be extracted and/or parsed from state data 172A-N acquired from the patient device 170A-N. Such information may include, but is not limited to: monitoring data that triggered the alarm condition (e.g., a blood pressure value below a threshold), status data that triggered the alarm condition (e.g., less than a threshold amount of remaining medication), and so on.” (paragraph 0069). 
Therefore, it would have been obvious to one of ordinary skill in the art at the time of filing to modify Yu to include receiving data, thus defining received data, wherein the data includes, at least in part, physiological alarm data, and wherein receiving the data includes parsing real-time alarm messages and translating information from the physiological alarm data into a structured format as taught by Tonna in order to standardize the data received from a machine or device.
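For illustration only, the “parsing real-time alarm messages and translating information … into a structured format” limitation can be sketched as follows. The pipe-delimited message layout is purely hypothetical; actual patient devices emit proprietary or HL7-style formats, and nothing below is drawn from the application or the cited references:

```python
def parse_alarm_message(msg):
    """Translate a raw alarm message into a structured record.

    Assumes a hypothetical pipe-delimited layout:
        device|area|parameter|value|priority
    """
    device, area, parameter, value, priority = msg.split("|")
    return {
        "device": device,       # e.g., monitor identifier
        "area": area,           # e.g., patient area such as an ICU room
        "parameter": parameter, # e.g., the monitored physiological parameter
        "value": float(value),  # numeric reading that triggered the alarm
        "priority": priority,   # assigned criticality level
    }
```

The sketch shows why the limitation maps to Tonna’s processing module: splitting fields and normalizing them into named, typed entries is a straightforward convert/normalize/translate step.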
Yu and Tonna do not explicitly teach deidentifying the received data to generate deidentified data. However, Chandrasekaran teaches Computing System for De-Identifying Patient Data and further teaches, “The server computing system 101 receives the patient data 112 from a source. In an example, the source may be an electronic health records application (EHR) that has generated the patient data 112. In another example, the source may be a data warehouse. In still another example the source may be an HIE. The server computing system 101 can then receive a request to de-identify the patient data 112. In an exemplary embodiment, the server computing system 101 receives the request to de-identify the patient data 112 from a client computing system 118 executing a client de-identifying application 126. In another exemplary embodiment, the server computing system 101 receives the request to de-identify the patient data 112 by way of user input received at the server computing system 101.” (column 7, lines 42-55) and “The server computing system 101 further includes a data store 110. The data store 110 includes patient data 112 pertaining to a patient. For example, the patient data 112 can be or include data that pertains to a patient and that has been generated by an electronic health records application (EHR). In another example, the patient data 112 can be or include data that pertains to a patient that was retrieved from a data warehouse, healthcare information exchange (HIE), or other repository of patient data. The data store 110 can further include de-identified patient data 114 (e.g., as generated by the de-identifying application 106 based upon patient data 112). The de-identifying application 106 is configured to generate the de-identified patient data 114 based upon the patient data 112.” (column 5, lines 22-35). 
Therefore, it would have been obvious to one of ordinary skill in the art at the time of filing to modify Yu to include deidentifying the received data to generate deidentified data as taught by Chandrasekaran in order to protect patient privacy.
Claim 21:
As per claim 21, Yu teaches a system comprising:
a processor and a memory configured to perform operations comprising (paragraph 0077 “Aspects of the de-identified database structure described herein may be implemented as functionality programmed into any of a variety of circuitry, including programmable logic devices ("PLDs"), such as field programmable gate arrays ("FPGAs"), programmable array logic ("PAL") devices, electrically programmable logic and memory devices and standard cell-based devices, as well as application specific integrated circuits. Some other possibilities for implementing aspects include: microcontrollers with memory (such as EEPROM), embedded microprocessors, firmware, software, etc.”):
receiving a request from a requester for at least a portion of the deidentified data, thus defining requested data (paragraph 0055 “The de-identified database system is generally accessed by a physician or other care provider logging onto the PHR website, and validating himself by inputting the registered ID and password. Using the advanced search engine 246, the physician can request the health records of specific patients or conditions. The ABID management process 244 creates all ABID indexed de-identified health accounts that match the search criteria. The ABID file is transmitted to the physician computer, which then calls a web service to get the III-XML file for the defined target population, which is then transmitted to the physician computer 204.”);
reidentifying the requested data to generate reidentified data when the requester has privilege to receive a reidentified version of the requested data (paragraphs 0031 and 0051-0053 “The re-identified patient data can be provided in response to several different use scenarios related to requests for information about a patient by a physician, care provider, lab, insurance company, and so on. Among the most common use cases is where a physician pulls up a patient's records in order to provide care requested by the patient, such as in a diagnostic appointment, emergency, or regular doctor visit. The authorized physician logs onto the PHR website and is validated as an authorized party to receive the re-identified patient data.”);
and providing the reidentified data to the requester (paragraphs 0031 and 0052-0053 “The re-identified patient data can be provided in response to several different use scenarios related to requests for information about a patient by a physician, care provider, lab, insurance company, and so on. Among the most common use cases is where a physician pulls up a patient's records in order to provide care requested by the patient, such as in a diagnostic appointment, emergency, or regular doctor visit. The authorized physician logs onto the PHR website and is validated as an authorized party to receive the re-identified patient data.”).
Yu does not explicitly teach receiving data, thus defining received data, wherein the data includes, at least in part, physiological alarm data, and wherein receiving the data includes parsing real-time alarm messages and translating information from the physiological alarm data into a structured format. However, Tonna teaches Systems and Methods for Managing Patient Devices and further teaches, “…The PD integration module 120 may be further configured to monitor for detection of alarm conditions at the patient device(s) 170. The patient device 170 may be located within a particular patient area 180 (e.g., a room, an intensive care unit (ICU), a recovery room, and/or the like), and may be used to provide healthcare-related services to a patient 182…” (paragraph 0040), “…As used herein, an “alarm notification” refers to electronic data comprising information pertaining to an alarm condition detected at a patient device 170. An alarm notification 132 may comprise electronic data embodied on a machine-readable memory, communication network, and/or non-transitory storage system, such as the storage resources 115 of the computing device 111 and/or a storage system 160. As disclosed in further detail herein, an alarm notification 132 may comprise any suitable information pertaining to an alarm condition of a patient device 170…” (paragraph 0041), “The processing module 332 may be configured to process state data 172A-N acquired from the patient devices 170A-N by the PD integration module 120. The processing module 332 may be configured to, inter alia, convert, normalize, and/or translate the acquired data into a format usable by other components of the PD management system 110. 
The processing module 332 may be further configured to parse the data acquired from the patient devices 170A-N in order to, inter alia, identify alarm conditions detected at the patient devices 170A-N, determine information pertaining to detected alarm conditions (e.g., extract and/or parse state data 172A-N of the patient devices 170A-N), and so on.” (paragraph 0067), “The evaluation module 334 may be configured to generate alarm notifications 132 in response to detection of alarm conditions at the patient devices 170A-N. Generating an alarm notification 132 may comprise generating electronic data that, inter alia, describes the alarm condition, identifies the patient device 170A-N corresponding to the alarm condition, identifies the patient area 180 and/or patient 182 corresponding to the alarm condition, assigns a criticality level to the alarm condition, assigns a response type to the alarm condition, and so on…” (paragraph 0068), and “…The evaluation module 334 may be configured to determine information identifying the condition and/or event that triggered detection of the alarm condition at the patient device 170A-N, which may be extracted and/or parsed from state data 172A-N acquired from the patient device 170A-N. Such information may include, but is not limited to: monitoring data that triggered the alarm condition (e.g., a blood pressure value below a threshold), status data that triggered the alarm condition (e.g., less than a threshold amount of remaining medication), and so on.” (paragraph 0069). 
Therefore, it would have been obvious to one of ordinary skill in the art at the time of filing to modify Yu to include receiving data, thus defining received data, wherein the data includes, at least in part, physiological alarm data, and wherein receiving the data includes parsing real-time alarm messages and translating information from the physiological alarm data into a structured format as taught by Tonna in order to standardize the data received from a machine or device.
Yu and Tonna do not explicitly teach deidentifying the received data to generate deidentified data. However, Chandrasekaran teaches Computing System for De-Identifying Patient Data and further teaches, “The server computing system 101 receives the patient data 112 from a source. In an example, the source may be an electronic health records application (EHR) that has generated the patient data 112. In another example, the source may be a data warehouse. In still another example the source may be an HIE. The server computing system 101 can then receive a request to de-identify the patient data 112. In an exemplary embodiment, the server computing system 101 receives the request to de-identify the patient data 112 from a client computing system 118 executing a client de-identifying application 126. In another exemplary embodiment, the server computing system 101 receives the request to de-identify the patient data 112 by way of user input received at the server computing system 101.” (column 7, lines 42-55) and “The server computing system 101 further includes a data store 110. The data store 110 includes patient data 112 pertaining to a patient. For example, the patient data 112 can be or include data that pertains to a patient and that has been generated by an electronic health records application (EHR). In another example, the patient data 112 can be or include data that pertains to a patient that was retrieved from a data warehouse, healthcare information exchange (HIE), or other repository of patient data. The data store 110 can further include de-identified patient data 114 (e.g., as generated by the de-identifying application 106 based upon patient data 112). The de-identifying application 106 is configured to generate the de-identified patient data 114 based upon the patient data 112.” (column 5, lines 22-35). 
Therefore, it would have been obvious to one of ordinary skill in the art at the time of filing to modify Yu to include deidentifying the received data to generate deidentified data as taught by Chandrasekaran in order to protect patient privacy.
Claims 2, 12, and 22:
As per claims 2, 12, and 22, Yu, Tonna, and Chandrasekaran teach the method, computer program product, and system of claims 1, 11, and 21 as described above, and Chandrasekaran further teaches wherein the received data includes one or more of:
patient data (column 5, line 66 to column 6, line 45). Therefore, it would have been obvious to one of ordinary skill in the art at the time of filing to modify Yu to include patient data as taught by Chandrasekaran in order to analyze data associated with a patient.
treatment data (column 5, line 66 to column 6, line 11). Therefore, it would have been obvious to one of ordinary skill in the art at the time of filing to modify Yu to include treatment data as taught by Chandrasekaran in order to identify conditions and remedies associated with a patient’s condition.
billing data (column 6, lines 29-31). Therefore, it would have been obvious to one of ordinary skill in the art at the time of filing to modify Yu to include billing data as taught by Chandrasekaran in order to determine the overall costs of a patient’s treatment or medication.
and technical alarm data.
Claims 3, 13, and 23:
As per claims 3, 13, and 23, Yu, Tonna, and Chandrasekaran teach the method, computer program product, and system of claims 1, 11, and 21 as described above, and Chandrasekaran further teaches wherein the received data includes healthcare data (column 5, line 66 to column 6, line 45). Therefore, it would have been obvious to one of ordinary skill in the art at the time of filing to modify Yu to include wherein the received data includes healthcare data as taught by Chandrasekaran in order to capture and analyze all types of information within healthcare.
Claims 4, 14, and 24:
As per claims 4, 14, and 24, Yu, Tonna, and Chandrasekaran teach the method, computer program product, and system of claims 1, 11, and 21 as described above and Yu further teaches wherein the requester is a healthcare professional (paragraphs 0031, 0039, and 0052).
Claims 5, 15, and 25:
As per claims 5, 15, and 25, Yu, Tonna, and Chandrasekaran teach the method, computer program product, and system of claims 1, 11, and 21 as described above and Chandrasekaran further teaches wherein deidentifying the received data to generate deidentified data includes:
converting the received data from a first format to a common format when generating the deidentified data (column 2, line 58 to column 3, line 7 and column 9, lines 24-56). Therefore, it would have been obvious to one of ordinary skill in the art at the time of filing to modify Yu to include converting the received data from a first format to a common format when generating the deidentified data as taught by Chandrasekaran in order to standardize the data and remove any errors or inconsistencies.
Claims 6, 16, and 26:
As per claims 6, 16, and 26, Yu, Tonna, and Chandrasekaran teach the method, computer program product, and system of claims 1, 11, and 21 as described above and Yu further teaches wherein reidentifying the requested data to generate reidentified data when the requester has privilege to receive a reidentified version of the requested data includes:
determining when the requester has privilege to receive the reidentified version of the requested data (paragraphs 0031 and 0051-0053).
Claims 7, 17, and 27:
As per claims 7, 17, and 27, Yu, Tonna, and Chandrasekaran teach the method, computer program product, and system of claims 1, 11, and 21 as described above and Chandrasekaran further teaches further comprising:
storing the deidentified data within a data repository (column 6, line 59 to column 7, line 9). Therefore, it would have been obvious to one of ordinary skill in the art at the time of filing to modify Yu to include storing the deidentified data within a data repository as taught by Chandrasekaran in order to maintain records of the deidentified data.
Claims 8, 18, and 28:
As per claims 8, 18, and 28, Yu, Tonna, and Chandrasekaran teach the method, computer program product, and system of claims 1, 11, and 21 as described above and Chandrasekaran further teaches wherein the received data is from a first source (column 3, lines 17-28 and column 7, lines 42-47). Therefore, it would have been obvious to one of ordinary skill in the art at the time of filing to modify Yu to include wherein the received data is from a first source as taught by Chandrasekaran in order to receive a particular set of data.
Claims 9, 19 and 29:
As per claims 9, 19, and 29, Yu, Tonna, and Chandrasekaran teach the method, computer program product, and system of claims 8, 18, and 28 as described above and Chandrasekaran further teaches wherein the first source includes one or more of:
a database system (column 5, lines 24-30 and column 7, lines 41-47). Therefore, it would have been obvious to one of ordinary skill in the art at the time of filing to modify Yu to include a database system as taught by Chandrasekaran in order to store data.
an asset management system;
a records system (column 5, lines 24-30 and column 7, lines 41-47). Therefore, it would have been obvious to one of ordinary skill in the art at the time of filing to modify Yu to include a records system as taught by Chandrasekaran in order to maintain historical records of information or data.
a human resources system;
an insurance system;
a monitoring system;
a middleware system that aggregates data signals;
and one or more on-network devices.
Claims 10, 20, and 30:
As per claims 10, 20, and 30, Yu, Tonna, and Chandrasekaran teach the method, computer program product, and system of claims 8, 18, and 28 as described above and Chandrasekaran further teaches wherein the received data is from at least a second source (column 3, lines 17-28 and column 7, lines 42-47). Therefore, it would have been obvious to one of ordinary skill in the art at the time of filing to modify Yu to include wherein the received data is from at least a second source as taught by Chandrasekaran in order to receive a particular set of data.
Response to Arguments
Applicant’s arguments, see pages 9-11, filed February 2, 2026, with respect to the rejection(s) of claim(s) 1-30 under 35 U.S.C. 103 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of Yu, Tonna, and Chandrasekaran under 35 U.S.C. 103.
Applicant’s argument on page 8 of the remarks states, “Claims 1-30 were rejected under 35 U.S.C. §101 for the stated reason that the claimed invention is directed to the abstract idea of mental processes because the claims include steps that can be performed in the human mind by observation, evaluation, and judgment. See, instant action, page 6. Without conceding the foregoing, herein the independent claims have been amended to include additional features. Applicant respectfully submits that as amended herein the independent claims are not directed toward an abstract idea. Further, even if the independent claims could be considered to include an abstract idea, Applicant respectfully submits that any such abstract idea is integrated into a practical application thereof. Accordingly, withdrawal of the rejections under 35 U.S.C. §101 is respectfully requested.” The examiner respectfully disagrees.
Independent claims 1, 11, and 21 recite the limitations, “receiving data, thus defining received data, wherein the data includes, at least in part, physiological alarm data, and wherein receiving the data includes parsing real-time alarm messages and translating information from the physiological alarm data into a structured format; deidentifying the received data to generate deidentified data; receiving a request from a requester for at least a portion of the deidentified data, thus defining requested data; reidentifying the requested data to generate reidentified data when the requester has privilege to receive a reidentified version of the requested data; and providing the reidentified data to the requester,” which are directed to the abstract idea of mental processes. Specifically, the limitations recited above consist of steps that can be performed in the human mind by observation, evaluation, and judgment. In particular, the steps of receiving data, parsing data, translating data into a structured format, deidentifying, and reidentifying require observation, evaluation, and judgment of data by a user or human to perform an activity such as protecting a patient’s health data and privacy. Additionally, the limitations are not “integrated into a practical application” because they amount to no more than adding the words “apply it” (or an equivalent) to the judicial exception, or mere instructions to implement the abstract idea on a computer, using a computer merely as a tool to perform the abstract idea (see MPEP 2106.05(f)). In addition, an improvement to the functioning of a computer or to any other technology or technical field has not been shown or disclosed (see MPEP 2106.05(a)). Therefore, the examiner maintains the rejection.
Objections to claims 1, 6, 11, 16, 21, and 26 have been withdrawn.
The rejection of claims 11-20 under 35 U.S.C. 101 as being directed to computer readable media covering signals per se has been withdrawn.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Hubert et al. US Publication 20170095217 A1 Unobtrusive Advisors for Patient Monitor
Hubert discloses a patient monitor system (10) that includes a signal processor (30) which analyzes signals from physiological parameter sensors (20) to derive physiological data values, plots, and alarms. A display (12) displays physiological values and plots (42). An advisor engine, routine, or processor (34) is connected with the signal processor to analyze at least alarm occurrences, also in combination with user interactions, and provide directly actionable advice for reducing alarm frequency which is displayed unobtrusively on the display. The directly actionable advice includes specific actions or adjustments which are specific to a current patient, a present situation, or a present mode or setting of the patient monitor system. The directly actionable advice is displayed unobtrusively without the user requesting it.
McNeal et al. US Patent 7671733 B2 Method and System for Medical Alarm Monitoring, Reporting and Normalization
McNeal discloses a system for monitoring and reporting medical alarms that includes an alarm messenger for receiving an alarm signal from monitored equipment. The alarm signal includes information to enable determination of the location of the monitored equipment. The alarm messenger outputs an alarm messenger signal including the information. A database includes a master association table stored in the database. A central server receives the alarm signal, utilizes the information from the alarm signal to access the master association table to determine alarm information and, in response to the alarm information, notifies the appropriate staff of an alarm condition.
Neubauer US Publication 20250014451 A1 Systems and Methods to Reduce Alarm Fatigue
Neubauer discloses a method for managing an alarm issued by a medical device (132). The method includes the steps of receiving a first alarm from the medical device (132), retrieving a dynamic attribute (Tables 1, 10) associated with the patient, assigning a sub-priority to the alarm based in part on evaluation of the dynamic attribute, and providing an alert (112-116) to a staff member that is associated with the sub-priority.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MATTHEW L HAMILTON whose telephone number is (571)270-1837. The examiner can normally be reached Monday-Thursday, 9:30 am-5:30 pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Fonya Long can be reached at (571)270-5096. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MATTHEW L HAMILTON/Primary Examiner, Art Unit 3682