DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Specification
The disclosure is objected to because of the following informalities:
Paragraph 44, Line 2: The word “a” should be deleted or the word “environments” should be made singular.
Paragraph 92, Line 4: The word “above” has been misspelled in this line.
Paragraph 132, Line 15: The patient is numbered “232” in Figure 3.
Paragraph 140, Line 7: The word “to” is needed after the word “respect”.
Appropriate correction is required.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1-11, 13-24, 26-37, and 39 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 2, 4-12, 14-22, and 24-27 of U.S. Patent No. 12,106,851. Although the claims at issue are not identical, they are not patentably distinct from each other because the subject claims 1-11, 13-24, 26-37, and 39 are a broader version of patented claims 1, 2, 4-12, 14-22, and 24-27. See comparison and explanation in the table below.
Claims 12, 25, and 38 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 2, 11, 12, 21, and 22 of U.S. Patent No. 12,106,851, and further in view of prior art reference Shi et al. (US 2023/0098165 A1).
Subject claims | Patented claims
1. A computer-implemented method, executed on a computing device, comprising: defining an incident as the occurrence of a plurality of required alarms; monitoring a plurality of devices to detect the occurrence of alarms, thus defining a plurality of detected alarms; and predicting the occurrence of the incident if a defined portion of the plurality of required alarms has occurred. | 1. A computer-implemented method, executed on a computing device, comprising: defining an incident as the occurrence of a plurality of required alarms; defining an event as the occurrence of a plurality of required incidents; monitoring a plurality of devices to detect the occurrence of alarms, thus defining a plurality of detected alarms, including: monitoring the plurality of devices to receive data signals indicative of the plurality of devices; and comparing the data signals to defined signal norms to identify one or more of the plurality of detected alarms wherein the defined signal norms include patient-specific signal norms that are automatically defined by processing data signals concerning a patient over a defined period of time; defining the event as having occurred if the plurality of detected alarms includes the plurality of required alarms for each of the plurality of required incidents; processing one or more of the plurality of detected alarms to determine an authenticity of one or more of the plurality of detected alarms; and adjusting one or more monitoring criteria of one or more of the plurality of devices producing the plurality of detected alarms when one or more of the plurality of detected alarms is non-authentic.
Claim 2 | Claim 2
Claim 3 | Claim 1: monitoring the plurality of devices to receive data signals indicative of the plurality of devices; and comparing the data signals to defined signal norms to identify one or more of the plurality of detected alarms wherein the defined signal norms include patient-specific signal norms that are automatically defined by processing data signals concerning a patient over a defined period of time
Claim 4 | Claim 4
Claim 5 | Claim 5
Claim 6 | Claim 6
Claim 7 | Claim 7
Claim 8 | Claim 8
Claim 9 | Claim 9
Claim 10 | Claim 10
Claim 11 | Claim 7
12. The computer-implemented method of claim 1 wherein predicting the occurrence of the incident if a defined portion of the plurality of required alarms has occurred includes: requiring that the defined portion of the plurality of required alarms have occurred in a defined sequence. | In view of patented claim 2 and further in view of cited prior art Shi. See ¶¶ 0021 and 0041.
Claim 13 | Claim 2
14. A computer program product residing on a computer readable medium having a plurality of instructions stored thereon which, when executed by a processor, cause the processor to perform operations comprising: defining an incident as the occurrence of a plurality of required alarms; monitoring a plurality of devices to detect the occurrence of alarms, thus defining a plurality of detected alarms; and predicting the occurrence of the incident if a defined portion of the plurality of required alarms has occurred. | 11. A computer program product residing on a non-transitory computer readable medium having a plurality of instructions stored thereon which, when executed by a processor, cause the processor to perform operations comprising: defining an incident as the occurrence of a plurality of required alarms; defining an event as the occurrence of a plurality of required incidents; monitoring a plurality of devices to detect the occurrence of alarms, thus defining a plurality of detected alarms, including: monitoring the plurality of devices to receive data signals indicative of the plurality of devices; and comparing the data signals to defined signal norms to identify one or more of the plurality of detected alarms wherein the defined signal norms include patient-specific signal norms that are automatically defined by processing data signals concerning a patient over a defined period of time; defining the event as having occurred if the plurality of detected alarms includes the plurality of required alarms for each of the plurality of required incidents; processing one or more of the plurality of detected alarms to determine an authenticity of one or more of the plurality of detected alarms; and adjusting one or more monitoring criteria of one or more of the plurality of devices producing the plurality of detected alarms when one or more of the plurality of detected alarms is non-authentic.
Claim 15 | Claim 12
Claim 16 | Claim 11: …monitoring the plurality of devices to receive data signals indicative of the plurality of devices; and comparing the data signals to defined signal norms to identify one or more of the plurality of detected alarms wherein the defined signal norms include patient-specific signal norms that are automatically defined by processing data signals concerning a patient over a defined period of time
Claim 17 | Claim 14
Claim 18 | Claim 15
Claim 19 | Claim 16
Claim 20 | Claim 17
Claim 21 | Claim 18
Claim 22 | Claim 19
Claim 23 | Claim 20
Claim 24 | Claim 17
Claim 25 | In view of patented claim 12 and further in view of cited prior art Shi. See ¶¶ 0021 and 0041.
Claim 26 | Claim 13
27. A computing system including a processor and memory configured to perform operations comprising: defining an incident as the occurrence of a plurality of required alarms; monitoring a plurality of devices to detect the occurrence of alarms, thus defining a plurality of detected alarms; and predicting the occurrence of the incident if a defined portion of the plurality of required alarms has occurred. | 21. A computing system including a processor and memory configured to perform operations comprising: defining an incident as the occurrence of a plurality of required alarms; defining an event as the occurrence of a plurality of required incidents; monitoring a plurality of devices to detect the occurrence of alarms, thus defining a plurality of detected alarms, including: monitoring the plurality of devices to receive data signals indicative of the plurality of devices; and comparing the data signals to defined signal norms to identify one or more of the plurality of detected alarms wherein the defined signal norms include patient-specific signal norms that are automatically defined by processing data signals concerning a patient over a defined period of time; defining the event as having occurred if the plurality of detected alarms includes the plurality of required alarms for each of the plurality of required incidents; processing one or more of the plurality of detected alarms to determine an authenticity of one or more of the plurality of detected alarms; and adjusting one or more monitoring criteria of one or more of the plurality of devices producing the plurality of detected alarms when one or more of the plurality of detected alarms is non-authentic.
Claim 28 | Claim 22
Claim 29 | Claim 21: … monitoring a plurality of devices to detect the occurrence of alarms, thus defining a plurality of detected alarms, including: monitoring the plurality of devices to receive data signals indicative of the plurality of devices; and comparing the data signals to defined signal norms to identify one or more of the plurality of detected alarms wherein the defined signal norms include patient-specific signal norms that are automatically defined by processing data signals concerning a patient over a defined period of time
Claim 30 | Claim 24
Claim 31 | Claim 25
Claim 32 | Claim 26
Claim 33 | Claim 27
Claim 34 | Claim 18
Claim 35 | Claim 19
Claim 36 | Claim 20
Claim 37 | Claim 27
Claim 38 | In view of patented claim 22 and further in view of cited prior art Shi. See ¶¶ 0021 and 0041.
Claim 39 | Claim 22
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 14-26 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. Claims 14-26 are directed toward a “computer program product,” i.e., computer instructions, which is a non-statutory invention because computer instructions are not a machine, process, article of manufacture, or composition of matter, or any new and useful improvement thereof. To be statutory, the claim must claim a machine, process, article of manufacture, or composition of matter, or any new and useful improvement thereof. One of ordinary skill in the art interprets a “computer readable medium” as encompassing a non-tangible modulated signal. The specification and the claims must define the patentable subject matter as something other than a mere “computer instruction” without adding any new matter. Appropriate correction is required.
A claim drawn to such a computer readable medium that covers both transitory and non-transitory embodiments may be amended to narrow the claim to cover only statutory embodiments to avoid a rejection under 35 U.S.C. § 101 by adding the limitation "non-transitory" to the claim. Such an amendment would typically not raise the issue of new matter, even when the specification is silent, because the broadest reasonable interpretation relies on the ordinary and customary meaning that includes signals per se. The limited situations in which such an amendment could raise issues of new matter occur, for example, when the specification does not support a non-transitory embodiment because a signal per se is the only viable embodiment such that the amended claim is impermissibly broadened beyond the supporting disclosure. See, e.g., Gentry Gallery, Inc. v. Berkline Corp., 134 F.3d 1473 (Fed. Cir. 1998).
Claims 1-39 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. Claims 1-13 are directed to a “process,” claims 14-26 are directed to a “computer program product,” and claims 27-39 are directed to a “machine.” The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. Claims 1-13 recite the following additional elements: “computing device” and “plurality of devices”; claims 14-26 recite the following additional elements: “computer readable medium,” “processor,” and “plurality of devices”; and claims 27-39 recite the following additional elements: “computer system,” “processor,” “memory,” and “plurality of devices.” Claims 9, 22, and 35 recite, “wherein the plurality of devices includes one or more of: … a computing device.”
Step 1: Is the claim to a process, machine, manufacture, or composition of matter? Claims 14-26: No; see the 35 U.S.C. 101 rejection above for subject matter directed to a non-statutory invention. Claims 1-13 and 27-39: Yes, because claims 1-13 are directed to a “process” and claims 27-39 are directed to a “machine.”
Can the analysis be streamlined? No, because when claims 1-13 and 27-39 are viewed as a whole, the eligibility of the claims is not self-evident.
Step 2A, Prong One: evaluate whether the claim recites a judicial exception (an abstract idea enumerated in the 2019 PEG, a law of nature, or a natural phenomenon). Yes, because claims 1-13 and 27-39 recite a judicial exception:
Claims 1-13 and 27-39 fall into the “Mental Processes” category of abstract ideas defined by the courts and the 2019 PEG. Specifically, the claims fall into the following subcategory: concepts relating to organizing or analyzing information in a way that can be performed mentally or is analogous to human mental work. For example, with respect to independent claims 1 and 27:
defining an incident as the occurrence of a plurality of required alarms, i.e., setting a rule/definition, is a human mental process (a concept performed in the human mind, e.g., observation, evaluation, judgment) per the Oct. 2019 Update;
monitoring a plurality of devices to detect the occurrence of alarms, i.e., collecting/monitoring data, is a data-gathering activity;
predicting the occurrence of the incident, i.e., making an evaluative decision based on a condition, is a human mental process; and
if a defined portion of the plurality of required alarms has occurred, i.e., portion/threshold logic, implicates mathematical concepts (mathematical relationships/thresholds).
In this case, the gathering of information/data and the mathematical concepts can be performed by a human with pen and paper, and the mental-process steps can be performed in the human mind. See MPEP 2106.05(g) and the Vanda Memo.
Analysis of dependent claims 2-13 and 28-39: the dependent claims narrow the information contained in the abstract idea and/or its human implementation; however, “defining an incident… monitoring the plurality of devices to receive data signals… predicting the occurrence of the incident” is general and does not involve actually carrying out the process in a meaningful way. Therefore, claims 2-13 and 28-39 amount to nothing more than gathering information/data, which can be performed by a human with pen and paper, and mental processes, which can be performed in the human mind.
Step 2A, Prong Two: Are there any additional elements recited in the claim beyond the judicial exception(s), and do those additional elements integrate the judicial exception into a practical application? No, because although the claims recite additional elements beyond the judicial exception, those additional elements do not integrate the judicial exception into a practical application:
It is the Examiner’s position that claims 1-13 and 27-39 comprise the following additional elements: “computing device,” “plurality of devices,” “computer system,” “processor,” and “memory.” The additional elements do not integrate the exception into a practical application; they merely generally link the use of the judicial exception to a particular technological environment or field of use (see MPEP 2106.05(h)), and they do not apply or use the judicial exception in some other meaningful way beyond that general linking, such that the claim as a whole is more than a drafting effort designed to monopolize the exception (see MPEP 2106.05(e) and the Vanda Memo).
Furthermore, the additional elements “computing device,” “plurality of devices,” “computer system,” “processor,” and “memory” provide no meaningful improvement or meaningful limitation, including (i) an improvement to a computer or (ii) an improvement to a non-computer technology in the field of alarm detection. The additional elements do not “use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the judicial exception.” The additional elements only add insignificant extra-solution activity to the judicial exception. See MPEP 2106.05(g).
Analysis of dependent claims 2-13 and 28-39: the dependent claims narrow the information contained in the abstract idea and/or its human implementation; however, limitations such as “the plurality of required alarms is defined via massive data sets that are processed by machine learning” are examples of insignificant extra-solution activity that the courts have found ineligible. See MPEP 2106.05(g) and the Vanda Memo.
Step 2B: Does the claim recite additional elements that amount to “significantly more” than the judicial exception? No, because the claims “as a whole” do not recite additional elements that amount to “significantly more” than the judicial exception:
It is the Examiner’s position that claims 1-13 and 27-39 comprise the following additional elements: “computing device,” “plurality of devices,” “computer system,” “processor,” and “memory.” However, these elements are generic components and, “as a whole,” do not amount to “significantly more” than an abstract idea.
At best, the claimed subject matter requires the use of a generic “computing device,” “plurality of devices,” “computer system,” “processor,” and “memory,” as shown in the prior art rejections of claims 1-39 below. The Examiner has cited sections of Boyer in view of Shi that teach these elements; therefore, these elements, alone and in combination, do not qualify as something “significantly more” than an abstract idea. The claimed limitations are routine and conventional in the field of alarm detection.
Limitations that are indicative of an inventive concept (i.e., “significantly more”):
Improvements to the functioning of a computer, or to any other technology or technical field - see MPEP 2106.05(a): None.
Applying the judicial exception with, or by use of, a particular machine - see MPEP 2106.05(b): None.
Effecting a transformation or reduction of a particular article to a different state or thing - see MPEP 2106.05(c): None.
Applying or using the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception - see MPEP 2106.05(e) and Vanda Memo: None.
Adding a specific limitation other than what is well-understood, routine, conventional activity in the field - see MPEP 2106.05(d): None.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 8, 9, 21, and 34 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claims 8, 21, and 34 recite, “wherein the machine-defined signal norms are compartmentalized (e.g., gender, race, age, location, device type, device class, seasonality, time of day, etc.).”
First, it is unclear to the Examiner whether Applicant is attempting to claim “gender, race, age, location, device type, device class, seasonality, time of day” as one of the Markush elements of the claimed machine-defined signal norms.
Second, the phrase “for example” or “e.g.” renders the claim indefinite because it is unclear whether the limitation(s) following the phrase are part of the claimed invention. See MPEP § 2173.05(d).
Finally, the phrase “or the like” or “etc.” renders the claim(s) indefinite because the claim(s) include(s) elements not actually disclosed (those encompassed by "or the like" or “etc.”), thereby rendering the scope of the claim(s) unascertainable. See MPEP § 2173.05(d).
Applicant may recite the limitations as follows in an effort to overcome these rejections: wherein the machine-defined signal norms are compartmentalized, the machine-defined signal norms includes at least one of [[(e. g.,]] gender, race, age, location, device type, device class, seasonality, or time of day[[, etc.)]].
Claim 9 recites, “wherein the plurality of devices includes one or more of: … a computing device”; however, a “computing device” was previously introduced in claim 1, and it is unclear to the Examiner whether claim 9 refers to the “computing device” introduced in claim 1 or to another “computing device.” Therefore, claim 9 is rejected for improper antecedent basis of the claimed “a computing device.”
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-39 are rejected under 35 U.S.C. 103 as being unpatentable over Boyer (US 2016/0093205 A1) in view of Shi et al. (US 2023/0098165 A1).
Consider claim 1, Boyer teaches a computer-implemented method, executed on a computing device (12; Boyer teaches a “processor configured to execute code… e.g., stored in a memory of the monitor 12,” see ¶ 0022); Boyer teaches, “generating an alarm in response to a determination that the physiological signal or physiological parameter value meets an alarm condition,” see ¶ 0006, comprising:
defining an incident as the occurrence of a plurality of required alarms, Boyer teaches, “[f]or example, the physiological signal or physiologic data may be received from a sensor (e.g., the sensor 14) or from one or more local or remote medical devices.” See ¶ 0043; Boyer teaches, “[a]n alarm is generated by a medical device (such as the monitor 12 or a therapeutic device, such as a ventilator) when an alarm condition or protocol is met. Alarm conditions include several types, such as physiologic alarm conditions, patient event alarm conditions, and device alarm conditions…. Further, alarm conditions may be based on a combination of different alarm conditions, such as two physiologic parameters each violating a respective limit, a combined alarm index violating a limit, or specified combinations of monitor and sensor status events.” See ¶ 0025; Boyer teaches, “[i]n another embodiment, analysis of the collected relevance event data may reveal a relationship between two or more physiologic parameters, and modifying the alarm condition may include combining alarm conditions from two or more physiologic parameters. The new combined alarm is not triggered unless both (or all) conditions are met” See ¶ 0055;
monitoring a plurality of devices to detect the occurrence of alarms, thus defining a plurality of detected alarms, Boyer teaches, “collecting relevance data for triggered alarms in accordance with an embodiment. The method 250 includes receiving a physiological signal or physiologic data (block 252). For example, the physiological signal or physiologic data may be received from a sensor (e.g., the sensor 14) or from one or more local or remote medical devices… The method 250 also includes determining whether the physiological signal or physiologic data meets an alarm condition (block 254)… In response to determining that the physiological signal or physiologic data meets the alarm condition, the method 250 includes generating an alarm (block 256). Additionally, the method 250 includes receiving a relevance indicator (e.g., the relevance indicator 220) indicating the relevance of the generated alarm (block 258) and storing the relevance indicator and the alarm condition (block 260).” See ¶ 0043; Boyer teaches, “[i]n another embodiment, analysis of the collected relevance event data may reveal a relationship between two or more physiologic parameters, and modifying the alarm condition may include combining alarm conditions from two or more physiologic parameters. The new combined alarm is not triggered unless both (or all) conditions are met” See ¶ 0055; and
[[predicting]] determining the occurrence of the incident if a defined portion of the plurality of required alarms has occurred, Boyer teaches, “[a]n alarm is generated by a medical device (such as the monitor 12 or a therapeutic device, such as a ventilator) when an alarm condition or protocol is met. Alarm conditions include several types, such as physiologic alarm conditions, patient event alarm conditions, and device alarm conditions…. Further, alarm conditions may be based on a combination of different alarm conditions, such as two physiologic parameters each violating a respective limit, a combined alarm index violating a limit, or specified combinations of monitor and sensor status events” See ¶ 0025; Boyer teaches, “In another embodiment, analysis of the collected relevance event data may reveal a relationship between two or more physiologic parameters, and modifying the alarm condition may include combining alarm conditions from two or more physiologic parameters. The new combined alarm is not triggered unless both (or all) conditions are met” See ¶ 0055.
Boyer does not explicitly state predicting the occurrence of the incident; nonetheless, in an analogous art, Shi teaches, “systems… for correlating an event with an existing event record based on machine-learning correlation models. The system may comprise one or more processors, event memory storing information related to a plurality of existing event records, and one or more memory devices storing program code to be executed by the one or more processors.” See ¶ 0003. Shi teaches, “[t]he trained supervised machine-learning model(s) 114 may be configured to predict whether a newly received alert (or received alert information) is correlated with one or more of the plurality of existing incident records 142. The one or more correlated existing incident records may be referred to as correlation candidates.” See ¶ 0039. Shi teaches, “retrieve a frequent pattern model prediction for the event, determine first patterns for the event based on the frequent pattern model prediction, perform a first search of the event memory for matching frequent patterns in the plurality of existing event records, and return a first list of possible event records correlated to the event from the plurality of existing event records in response the first search, (2) retrieve a sequential pattern model prediction for the event,” See ¶ 0126.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Boyer to have “trained supervised machine-learning model(s) 114 [that] may be configured to predict whether a newly received alert (or received alert information) is correlated with one or more of the plurality of existing incident” records, as suggested by Shi, in an effort to determine “whether a newly received alert is related to an existing issue (i.e., existing incident record) that's already being worked on, or whether the new alert pertains to a new issue that should be opened in their incident” See ¶ 0001.
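For illustration only, and not as part of the claim mapping, the threshold logic recited in claim 1 (predicting the incident once a defined portion of the required alarms has occurred) can be sketched in Python; all names and the 0.5 portion value are hypothetical assumptions, not drawn from the claims or the cited references:

```python
# Illustrative sketch of the claim 1 logic: an "incident" is defined by a set of
# required alarms, and the incident is predicted once a defined portion of those
# required alarms has been detected. All names and values are hypothetical.

def predict_incident(required_alarms, detected_alarms, defined_portion=0.5):
    """Return True when the defined portion of the required alarms has occurred."""
    required = set(required_alarms)
    detected = set(detected_alarms) & required  # only required alarms count
    return len(detected) >= defined_portion * len(required)

# Example: 2 of 3 required alarms detected satisfies a 0.5 portion threshold.
print(predict_incident({"spo2_low", "hr_high", "rr_high"}, {"spo2_low", "hr_high"}))
```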
Consider claim 2, the computer-implemented method of claim 1 wherein defining an incident as the occurrence of a plurality of required alarms includes:
defining an incident as the occurrence of a plurality of required alarms within a defined period of time, Boyer teaches, “analysis of the collected relevance event data may reveal a relationship between two or more physiologic parameters, and modifying the alarm condition may include combining alarm conditions from two or more physiologic parameters. The new combined alarm is not triggered unless both (or all) conditions are met… Accordingly, a new alarm protocol may be created that triggers an alarm when these identified conditions are all met at the same time.” See ¶ 0055.
Consider claim 3, the computer-implemented method of claim 1 wherein monitoring a plurality of devices to detect the occurrence of alarms includes:
monitoring the plurality of devices to receive data signals indicative of the plurality of devices, Boyer teaches, “collecting relevance data for triggered alarms in accordance with an embodiment. The method 250 includes receiving a physiological signal or physiologic data (block 252). For example, the physiological signal or physiologic data may be received from a sensor (e.g., the sensor 14) or from one or more local or remote medical devices… The method 250 also includes determining whether the physiological signal or physiologic data meets an alarm condition (block 254)… In response to determining that the physiological signal or physiologic data meets the alarm condition, the method 250 includes generating an alarm (block 256). Additionally, the method 250 includes receiving a relevance indicator (e.g., the relevance indicator 220) indicating the relevance of the generated alarm (block 258) and storing the relevance indicator and the alarm condition (block 260)” See ¶ 0043; and
comparing the data signals to defined signal norms to identify one or more of the plurality of detected alarms, Boyer teaches, “An alarm is generated by a medical device (such as the monitor 12 or a therapeutic device, such as a ventilator) when an alarm condition or protocol is met. Alarm conditions include several types, such as physiologic alarm conditions, patient event alarm conditions, and device alarm conditions. Physiologic alarm conditions trigger an alarm when a measured or calculated physiologic parameter satisfies an alarm condition, such as when the parameter value crosses a threshold, deviates from a specified range, matches a stored pattern, deviates from a threshold for a specified time and/or extent (e.g., exceeding a limit on a value of an integral taken between the parameter value and a threshold), or meets other conditions that indicate a clinically significant event.).” See ¶ 0025.
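For illustration only (not part of the prosecution record), the comparison of incoming data signals against defined signal norms quoted from Boyer ¶ 0025 can be sketched as a simple range check per parameter. The parameter names and ranges here are hypothetical.

```python
# Illustrative sketch: compare received data signals against defined
# signal norms (an acceptable range per physiologic parameter) and
# identify alarms for out-of-range values, in the spirit of Boyer ¶ 0025.

SIGNAL_NORMS = {
    "heart_rate": (60, 100),       # beats per minute (hypothetical norm)
    "respiration_rate": (12, 20),  # breaths per minute (hypothetical norm)
}

def detect_alarms(signals):
    """Return the names of parameters whose values fall outside their norm."""
    alarms = []
    for name, value in signals.items():
        low, high = SIGNAL_NORMS[name]
        if not (low <= value <= high):
            alarms.append(name)
    return alarms

print(detect_alarms({"heart_rate": 132, "respiration_rate": 16}))
# -> ['heart_rate']
```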
Consider claim 4, the computer-implemented method of claim 3 wherein the data signals concern one or more details of the plurality of devices and/or one or more uses of the plurality of devices, Boyer teaches, “An alarm is generated by a medical device (such as the monitor 12 or a therapeutic device, such as a ventilator) when an alarm condition or protocol is met. Alarm conditions include several types, such as physiologic alarm conditions, patient event alarm conditions, and device alarm conditions.” See ¶ 0025; Boyer teaches, “collecting relevance data for triggered alarms in accordance with an embodiment. The method 250 includes receiving a physiological signal or physiologic data (block 252). For example, the physiological signal or physiologic data may be received from a sensor (e.g., the sensor 14) or from one or more local or remote medical devices.” See ¶ 0043.
Consider claim 5, the computer-implemented method of claim 3 wherein the defined signal norms include user-defined signal norms, Boyer teaches, “the stored relevance event data 222 may include information about the patient such as patient characteristics (e.g., age, weight, height, gender, race, condition, diagnosis, or others) or the patient's overall health index. In some embodiments, the patient health index is a numerical value provided by the user. For example, a caregiver may assess the physiological parameter data of the patient and determine a patient health index…In other embodiments, the patient health index 330 may be a numeric value between −5 and 5 or between −3 and 3, where a patient health index of 0 is indicative of an acceptable or normal physiological status and a higher patient health index (positive or negative) is indicative of a worsening physiological status.” See ¶ 0035.
Consider claim 6, the computer-implemented method of claim 3 wherein the defined signal norms include machine-defined signal norms, Boyer teaches, “The physiological input 204 may include an incoming raw or processed physiologic signal, or measured or calculated physiologic data. The physiological input 204 may be received from a sensor coupled to the patient (e.g., the sensor 14) or from other medical devices” see ¶ 0030; Boyer teaches, “collecting relevance data for triggered alarms in accordance with an embodiment. The method 250 includes receiving a physiological signal or physiologic data (block 252). For example, the physiological signal or physiologic data may be received from a sensor (e.g., the sensor 14) or from one or more local or remote medical devices…Further, in some embodiments, the plurality of physiological signals may include at least two different types of physiological signals (e.g., a photoplethysmograph signal, an electrocardiography signal, a blood pressure signal, etc.) and the plurality of physiological parameter values may include at least two different types of physiological parameter values (e.g., oxygen saturation, heart rate, respiration rate, blood pressure, BISPECTRAL™ index, etc.).” See ¶ 0043.
Consider claim 7, the computer-implemented method of claim 6 wherein the machine-defined signal norms are defined via massive data sets that are processed by machine learning, Boyer teaches, “the processor 206 may include a statistical analysis engine or machine learning engine 224. The statistical analysis engine or machine learning engine 224 may analyze the collected relevance event data 222 to identify and modify nuisance alarm conditions.” See ¶ 0039; Boyer teaches, “The data associated with the plurality of generated alarms may include the alarm conditions and other relevance event data, as described in detail above.” See ¶ 0049; Boyer teaches, “Analyzing the data and the relevance indicators may include performing statistical analysis on the collected data…or any other classification or learning-based algorithms” See ¶ 0052.
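For illustration only (not part of the prosecution record), machine-defined signal norms derived from large historical data sets, as in Boyer's statistical analysis or machine learning engine 224, can be sketched as below. The percentile-band approach and all values are hypothetical stand-ins for the learning-based algorithms Boyer describes.

```python
# Illustrative sketch: derive machine-defined signal norms from a set
# of historical readings by taking a percentile band, a simple
# hypothetical stand-in for Boyer's statistical/learning engine.

def learn_norms(samples, lower_pct=5, upper_pct=95):
    """Return (low, high) bounds taken at the given percentiles of samples."""
    ordered = sorted(samples)
    def pct(p):
        # Nearest-rank style index into the ordered samples
        idx = min(len(ordered) - 1, int(round(p / 100 * (len(ordered) - 1))))
        return ordered[idx]
    return pct(lower_pct), pct(upper_pct)

# Hypothetical historical heart-rate readings
heart_rates = [62, 64, 66, 70, 72, 75, 78, 80, 85, 88, 90, 95]
low, high = learn_norms(heart_rates)
print((low, high))  # bounds learned from the data, not hand-set
```

The learned band would then serve as the "defined signal norm" against which incoming signals are compared, with no user-entered threshold required.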
Consider claim 8, the computer-implemented method of claim 6 wherein the machine-defined signal norms are compartmentalized (e.g., gender, race, age, location, device type, device class, seasonality, time of day, etc.), Boyer teaches, “Thresholds or other alarm conditions may also vary with patient characteristics such as age, weight, gender, or others. Alarm conditions that rely on multiple parameters may be enabled or disabled based on the available parameters in a particular situation with a particular patient” See ¶ 0057.
Consider claim 9, the computer-implemented method of claim 1, wherein the plurality of devices includes one or more of: a medical device, a process control device, a networking device, a computing device, a manufacturing device, an agricultural device, an energy/refining device, an aerospace device, a forestry device, and a defense device, Boyer teaches, “An alarm is generated by a medical device (such as the monitor 12 or a therapeutic device, such as a ventilator) when an alarm condition or protocol is met.” See ¶ 0025.
Consider claim 10, the computer-implemented method of claim 1 wherein the plurality of devices are geographically dispersed, Boyer teaches, “the physiological signal or physiologic data may be received from a sensor (e.g., the sensor 14) or from one or more local or remote medical devices.” See ¶ 0043.
Consider claim 11, the computer-implemented method of claim 1 wherein the defined portion of the plurality of required alarms is defined via massive data sets that are processed by machine learning, Boyer teaches, “the processor 206 may include a statistical analysis engine or machine learning engine 224. The statistical analysis engine or machine learning engine 224 may analyze the collected relevance event data 222 to identify and modify nuisance alarm conditions.” See ¶ 0039; Boyer teaches, “The data associated with the plurality of generated alarms may include the alarm conditions and other relevance event data, as described in detail above.” See ¶ 0049; Boyer teaches, “Analyzing the data and the relevance indicators may include performing statistical analysis on the collected data…or any other classification or learning-based algorithms” See ¶ 0052.
Consider claim 12, the computer-implemented method of claim 1 wherein predicting the occurrence of the incident if a defined portion of the plurality of required alarms has occurred includes:
requiring that the defined portion of the plurality of required alarms have occurred in a defined sequence, Boyer teaches, “data may include the type of alarm that triggered (such as a high pulse rate alarm, or low respiration rate alarm, or sensor disconnected alarm, or others), the date and time that it was triggered, the duration of time that the alarm sounded before it was silenced or canceled, the severity of the alarm, the frequency and types of other alarms over a specified time period before” See ¶ 0033. Boyer also teaches, “analysis of the collected relevance event data may reveal a relationship between two or more physiologic parameters, and modifying the alarm condition may include combining alarm conditions from two or more physiologic parameters. The new combined alarm is not triggered unless both (or all) conditions are met… Accordingly, a new alarm protocol may be created that triggers an alarm when these identified conditions are all met at the same time.” See ¶ 0055. Therefore, there is a suggestion that the alarm events occurred in a defined sequence; nonetheless, in an analogous art, Shi teaches, “frequent-pattern and sequential pattern mining may be performed using historical alert data to learn co-occurrence of alerts and dependency information.” See ¶ 0021. Shi further teaches, “the sequential pattern algorithm may output sequential pattern model 120. Sequential pattern model 120 may be trained based on historical alert data and may output a sequence of co-occurring alert dimensions…. Machine-learning-based correlation engine 112 may be configured to store these sequential patterns in a sequential pattern lookup table for faster searching. This sequential pattern lookup table may include confidence scores for each of the sequential patterns. At prediction time, machine-learning-based correlation engine 112 may be configured to perform a look up to this sequential pattern table to find any sequential patterns that apply to alert information 110 of the received alert. If a matching sequential pattern is found in the sequential pattern lookup table, machine-learning-based correlation engine 112 may be configured to perform a search of the plurality of existing incident records 142 to find one or more correlated existing incidents with matching sequential patterns.” See ¶ 0041.
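For illustration only (not part of the prosecution record), the sequential pattern lookup with confidence scores quoted from Shi ¶ 0041 can be sketched as below. The mined patterns, confidence values, and alert names are all hypothetical.

```python
# Illustrative sketch: a sequential pattern lookup table in the spirit
# of Shi ¶ 0041 — mined alarm sequences with confidence scores are
# matched, in order, against a newly received alert sequence.

SEQUENTIAL_PATTERNS = {
    ("link_down", "packet_loss", "service_timeout"): 0.92,
    ("disk_full", "write_error"): 0.81,
}

def matching_patterns(alert_sequence, min_confidence=0.8):
    """Return mined patterns that occur as an ordered subsequence of
    alert_sequence and meet the confidence threshold."""
    def contains_in_order(seq, pattern):
        it = iter(seq)
        # Each membership test consumes the iterator, enforcing order
        return all(step in it for step in pattern)
    return [p for p, conf in SEQUENTIAL_PATTERNS.items()
            if conf >= min_confidence and contains_in_order(alert_sequence, p)]

seq = ["link_down", "cpu_spike", "packet_loss", "service_timeout"]
print(matching_patterns(seq))  # the three-step pattern matches in order
```

A match would then prompt a search of the existing incident records for correlated incidents with the same sequential pattern, as Shi describes.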
Consider claim 13, the computer-implemented method of claim 1 wherein predicting the occurrence of the incident if a defined portion of the plurality of required alarms has occurred includes:
requiring that the defined portion of the plurality of required alarms have occurred within a defined period of time, Boyer teaches, “analysis of the collected relevance event data may reveal a relationship between two or more physiologic parameters, and modifying the alarm condition may include combining alarm conditions from two or more physiologic parameters. The new combined alarm is not triggered unless both (or all) conditions are met… Accordingly, a new alarm protocol may be created that triggers an alarm when these identified conditions are all met at the same time.” See ¶ 0055.
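For illustration only (not part of the prosecution record), the requirement that the defined portion of required alarms occur within a defined period of time can be sketched as below. The alarm names, timestamps, and window length are hypothetical.

```python
# Illustrative sketch: predict the incident only if every required
# alarm has occurred and all required occurrences fall within a
# defined time window (seconds).

def all_within_window(alarm_times, required, window_seconds):
    """True if every required alarm occurred and the span between the
    earliest and latest required occurrence fits in the window."""
    if not all(name in alarm_times for name in required):
        return False
    times = [alarm_times[name] for name in required]
    return max(times) - min(times) <= window_seconds

alarms = {"high_pulse": 100.0, "low_spo2": 130.0}  # hypothetical timestamps
print(all_within_window(alarms, ["high_pulse", "low_spo2"], 60))  # True
print(all_within_window(alarms, ["high_pulse", "low_spo2"], 10))  # False
```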
Consider claim 14, a computer program product residing on a computer readable medium having a plurality of instructions stored thereon which, Boyer teaches, “non-transitory machine-readable medium or media having instructions recorded thereon for execution by a processor” See ¶ 0041, when executed by a processor, cause the processor to perform operations, comprising:
defining an incident as the occurrence of a plurality of required alarms, See rejection of claim 1;
monitoring a plurality of devices to detect the occurrence of alarms, thus defining a plurality of detected alarms, See rejection of claim 1; and
predicting the occurrence of the incident if a defined portion of the plurality of required alarms has occurred, See rejection of claim 1.
Consider claim 15, the computer program product of claim 14 wherein defining an incident as the occurrence of a plurality of required alarms includes:
defining an incident as the occurrence of a plurality of required alarms within a defined period of time, See rejection of claim 2.
Consider claim 16, the computer program product of claim 14 wherein monitoring a plurality of devices to detect the occurrence of alarms includes:
monitoring the plurality of devices to receive data signals indicative of the plurality of devices; and
comparing the data signals to defined signal norms to identify one or more of the plurality of detected alarms, See rejection of claim 3.
Consider claim 17, the computer program product of claim 16 wherein the data signals concern one or more details of the plurality of devices and/or one or more uses of the plurality of devices, See rejection of claim 4.
Consider claim 18, the computer program product of claim 16 wherein the defined signal norms include user-defined signal norms, See rejection of claim 5.
Consider claim 19, the computer program product of claim 16 wherein the defined signal norms include machine-defined signal norms, See rejection of claim 6.
Consider claim 20, the computer program product of claim 19 wherein the machine-defined signal norms are defined via massive data sets that are processed by machine learning, See rejection of claim 7.
Consider claim 21, the computer program product of claim 19 wherein the machine-defined signal norms are compartmentalized (e.g., gender, race, age, location, device type, device class, seasonality, time of day, etc.), See rejection of claim 8.
Consider claim 22, the computer program product of claim 14 wherein the plurality of devices includes one or more of: a medical device, a process control device, a networking device, a computing device, a manufacturing device, an agricultural device, an energy / refining device, an aerospace device, a forestry device, and a defense device, See rejection of claim 9.
Consider claim 23, the computer program product of claim 14 wherein the plurality of devices are geographically dispersed, See rejection of claim 10.
Consider claim 24, the computer program product of claim 14 wherein the defined portion of the plurality of required alarms is defined via massive data sets that are processed by machine learning, See rejection of claim 11.
Consider claim 25, the computer program product of claim 14 wherein predicting the occurrence of the incident if a defined portion of the plurality of required alarms has occurred includes:
requiring that the defined portion of the plurality of required alarms have occurred in a defined sequence, See rejection of claim 12.
Consider claim 26, the computer program product of claim 14 wherein predicting the occurrence of the incident if a defined portion of the plurality of required alarms has occurred includes:
requiring that the defined portion of the plurality of required alarms have occurred within a defined period of time, See rejection of claim 13.
Consider claim 27, Boyer teaches a computing system including a processor 206 and memory 210; Boyer teaches, “processor configured to execute code (e.g., stored in a memory of the monitor 12” See ¶ 0022; the system is configured to perform operations comprising:
defining an incident as the occurrence of a plurality of required alarms, See rejection of claim 1;
monitoring a plurality of devices to detect the occurrence of alarms, thus defining a plurality of detected alarms See rejection of claim 1; and
predicting the occurrence of the incident if a defined portion of the plurality of required alarms has occurred, See rejection of claim 1.
Consider claim 28, the computing system of claim 27 wherein defining an incident as the occurrence of a plurality of required alarms includes:
defining an incident as the occurrence of a plurality of required alarms within a defined period of time, See rejection of claim 2.
Consider claim 29, the computing system of claim 27 wherein monitoring a plurality of devices to detect the occurrence of alarms includes:
monitoring the plurality of devices to receive data signals indicative of the plurality of devices; and
comparing the data signals to defined signal norms to identify one or more of the plurality of detected alarms, See rejection of claim 3.
Consider claim 30, the computing system of claim 29 wherein the data signals concern one or more details of the plurality of devices and/or one or more uses of the plurality of devices, See rejection of claim 4.
Consider claim 31, the computing system of claim 29 wherein the defined signal norms include user-defined signal norms, See rejection of claim 5.
Consider claim 32, the computing system of claim 29 wherein the defined signal norms include machine-defined signal norms, See rejection of claim 6.
Consider claim 33, the computing system of claim 32 wherein the machine-defined signal norms are defined via massive data sets that are processed by machine learning, See rejection of claim 7.
Consider claim 34, the computing system of claim 32 wherein the machine-defined signal norms are compartmentalized (e.g., gender, race, age, location, device type, device class, seasonality, time of day, etc.), See rejection of claim 8.
Consider claim 35, the computing system of claim 27 wherein the plurality of devices includes one or more of: a medical device, a process control device, a networking device, a computing device, a manufacturing device, an agricultural device, an energy / refining device, an aerospace device, a forestry device, and a defense device, See rejection of claim 9.
Consider claim 36, the computing system of claim 27 wherein the plurality of devices are geographically dispersed, See rejection of claim 10.
Consider claim 37, the computing system of claim 27 wherein the defined portion of the plurality of required alarms is defined via massive data sets that are processed by machine learning, See rejection of claim 11.
Consider claim 38, the computing system of claim 27 wherein predicting the occurrence of the incident if a defined portion of the plurality of required alarms has occurred includes:
requiring that the defined portion of the plurality of required alarms have occurred in a defined sequence, See rejection of claim 12.
Consider claim 39, the computing system of claim 27 wherein predicting the occurrence of the incident if a defined portion of the plurality of required alarms has occurred includes:
requiring that the defined portion of the plurality of required alarms have occurred within a defined period of time, See rejection of claim 13.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Hu, Xiao et al. (US 2017/0046499 A1) teaches, “Methods for predicting patient deterioration or clinical events by detecting patterns in heterogeneous temporal clinical data streams that are predictive of certain clinical end points and matching the patient state with those patterns are described. The detected patterns, referred to as SuperAlarm triggers, are a predictive combination of frequently co-occurring monitor alarms, conditions and laboratory test results that can predict patient deterioration for imminent life-threatening events. SuperAlarm triggers may also exhibit patterns in the sequence of SuperAlarms that are triggered over the monitoring time of a patient. Sequential patterns of SuperAlarm triggers may also indicate a temporal process of change in patient status.” See abstract.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Omer S. Khan whose telephone number is (571)270-5146. The examiner can normally be reached 10:00 am to 8:00 pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Brian A. Zimmerman can be reached at 571-272-3059. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Omer S Khan/Primary Examiner, Art Unit 2686
1 Mere correction of the CRM will make this claim set fall under the same category (i.e., abstract idea) as the other claim sets; applicant is advised to amend claims 14-26 to address not only the CRM issue but also to amend them in a manner similar to the other claim sets so as to address the abstract idea grounds as well.