DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
This action is in reply to Applicant’s communication filed on August 1, 2025.
Claims 1-5, 7-14 and 16-22 have been amended and are hereby entered.
Claims 1-5, 7-14 and 16-22 are currently pending and have been examined.
This action is made FINAL.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-5, 7-14 and 16-22 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1 analysis:
Claims 1 and 10 are directed to a method and a system respectively and therefore each fall into one of the four statutory categories. (Step 1: Yes, the claims fall into one of the four statutory categories).
Step 2A analysis - Prong one:
The substantially similar independent method and system claims, taking claim 1 as exemplary, recite the following limitations: A method comprising: obtaining patient consent data; detecting, based on the patient consent data, a change in a consent associated with a patient; identifying private data associated with the change in the consent; …control access to health data associated with the patient; determining that input data used…indicates the private data; determining, in response to detecting the change in the consent associated with the private data, to transform the private data; generating modified input data based at least on an impact score, wherein the modified input data includes the transformed private data; and adjusting…based on the modified input data to change the access to the health data in accordance with the change in consent.
Under the broadest reasonable interpretation, the terms of the claim are presumed to have their plain meaning consistent with the specification as it would be interpreted by one of ordinary skill in the art. See MPEP 2111.
The steps of obtaining data, detecting changes in data, determining training data, generating modified data and adjusting access to the data are all interpreted as a series of steps a person would follow to, in paraphrase, manage the use of and access to patient health data based on a patient's consent data, and are grouped as certain methods of organizing human activity.
The series of steps as recited above describes managing personal behavior or relationships or interactions between people, including following rules or instructions, and is thus grouped as certain methods of organizing human activity, which are abstract ideas. (Step 2A – Prong 1: Yes, the claims are abstract).
Step 2A analysis - Prong two:
This judicial exception is not integrated into a practical application. Claims 1 and 10 recite additional elements beyond the abstract idea: Claims 1 and 10 recite identifying and training an ML model, and Claim 10 further recites a processor.
The claims apply generic computer components to the recited abstract limitations. In particular, the claims recite a trained machine learning model and a processor at a high level of generality (i.e., as a generic processor performing generic computer functions) such that they amount to no more than mere instructions to apply the exception using generic computer components. For example, Applicant’s specification explains that the processor may be a well-known processor capable of receiving electrical inputs, outputting information, reading computer programs, interpreting data, executing control functions, etc. (see Applicant’s spec. paras 81, 84, 131, 133). Accordingly, these additional elements, when considered separately and as an ordered combination, do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Therefore, Claims 1 and 10 are directed to an abstract idea without a practical application. (Step 2A – Prong 2: No, the additional claimed elements are not integrated into a practical application).
Step 2B analysis:
The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity: i) receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc., v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); but see DDR Holdings, LLC v. Hotels.com, L.P., 773 F.3d 1245, 1258, 113 USPQ2d 1097, 1106 (Fed. Cir. 2014) ("Unlike the claims in Ultramercial, the claims at issue here specify how interactions with the Internet are manipulated to yield a desired result‐‐a result that overrides the routine and conventional sequence of events ordinarily triggered by the click of a hyperlink." (emphasis added)); iv) storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93. See MPEP §2106.05(d)(II).
This listing is not meant to imply that all computer functions are well‐understood, routine, conventional activities, or that a claim reciting a generic computer component performing a generic computer function is necessarily ineligible. Courts have held computer‐implemented processes not to be significantly more than an abstract idea (and thus ineligible) where the claim as a whole amounts to nothing more than generic computer functions merely used to implement an abstract idea, such as an idea that could be done by a human analog (i.e., by hand or by merely thinking). On the other hand, courts have held computer-implemented processes to be significantly more than an abstract idea (and thus eligible), where generic computer components are able in combination to perform functions that are not merely generic. See MPEP §2106.05(d)(II) – emphasis added.
Here, the steps are receiving or transmitting data over a network (MPEP 2106.05(d)(II)); storing and retrieving information in memory (MPEP 2106.05(d)(II)) – all of which have been recognized by the courts as well-understood, routine and conventional functions.
The claims are directed to an abstract idea with additional generic computer elements that do not add meaningful limitations to the abstract idea because they require no more than a generic computer to perform generic computer functions that are well-understood, routine, and conventional activities previously known in the industry.
For the next step of the analysis, it must be determined whether the limitations present in the claims represent a patent-eligible application of the abstract idea. A claim directed to a judicial exception must be analyzed to determine whether the elements of the claim, considered both individually and as an ordered combination are sufficient to ensure that the claim as a whole amounts to significantly more than the exception itself.
For the role of a computer in a computer implemented invention to be deemed meaningful in the context of this analysis, it must involve more than performance of well-understood, routine, and conventional activities previously known to the industry. Further, the mere recitation of a generic computer cannot transform a patent ineligible abstract idea into a patent-eligible invention. See MPEP 2106.05(d).
Applicant’s specification discloses the following:
Applicant describes embodiments of the disclosure at a very high level to include the use of a wide variety of processors, input/output devices, networks, communication connections, buses, interfaces, storage devices, memories, cloud computing systems, servers, databases, displays, medical devices, image sensors, smart devices, sensing systems, machine learning models and neural networks, etc. (see Applicant’s Spec. paras 22, 67, 73, 80-82, 84-87, 111-112, 404). The invention may use any computer via any transmission medium (a communication network or broadcast waves) capable of transmitting the program.
Generic computer components recited as performing generic computer functions that are well-understood, routine and conventional activities amount to no more than implementing the abstract idea with a computerized system.
Looking at the limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. The collective functions appear to be implemented using conventional computer systemization.
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of a trained machine learning model and a processor to perform all of the steps discussed above amount to no more than mere instructions to apply the exception using generic computer components. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept. Accordingly, these additional elements, when considered separately and as an ordered combination, do not amount to significantly more than the judicial exception because they do not impose any meaningful limits on practicing the abstract idea. (Step 2B: No, the claims do not provide significantly more).
Dependent Claims 2-5, 7-9, 11-14 and 16-22 further define the abstract idea that is presented in independent Claims 1 and 10, are likewise grouped as certain methods of organizing human activity, and are abstract for the same reasons and basis as presented above. Further, Claims 3, 5, 9, 12, 14 and 18 recite additional elements beyond the abstract idea. Claims 3 and 12 recite a patient data display interface. Claims 5 and 14 recite a control program and a medical device. Claims 9 and 18 recite an external system. These additional elements are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using a generic computer component. For example, as noted above, Applicant’s specification indicates the use of known machine learning models. Accordingly, these additional elements, when considered both individually and as an ordered combination, do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Therefore, the dependent claims are also directed to an abstract idea.
Thus, Claims 1-5, 7-14 and 16-22 are rejected under 35 U.S.C. 101 as being directed to abstract ideas without significantly more.
Claim Rejections - 35 USC § 103
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 4-5, 7-10, 13-14, 16-18, 20 and 22 are rejected under 35 U.S.C. 103 as being unpatentable over Shelton et al. (US 20220233102) (IDS reference) in view of Sarferaz (US 20200380155), further in view of Liang et al. (JP 2020030738).
Regarding Claim 1, Shelton discloses the following limitations:
A method comprising: obtaining patient consent data; (Shelton discloses securing and recording consent from a user. – paras 7-8)
detecting, based on the patient consent data, a change in a consent associated with a patient; (Shelton discloses securing and recording consent from a user (the patient consent data). Consent from the user may be received and adjustment of consent may take place (detecting…a change in a consent – para 453) if certain thresholds are met. – paras 7-8, 434-435, 453, 466)
identifying private data associated with the change in the consent; (Shelton discloses that the consent of the user may (associated with the change in the consent) be a consent to share data from the sensing system (identifying private data), such as a wearable device, with a health care provider (HCP). The consent of the user may be confirmed when the identity of the user may be confirmed and the state of mind of the user indicates that the user is able to provide consent. Data from the sensing system may be sent to the HCP. Adjustment of consent may take place if the thresholds are met. – paras 434-435, 453)
identifying a machine learning (ML) model, wherein the ML model is trained to control access to health data associated with the patient; (Shelton discloses the output of machine learning's training process may be a model (a machine learning (ML) model) for predicting outcome(s) on a new dataset. Machine learning is a branch of artificial intelligence that seeks to build computer systems that may learn from data without human intervention (wherein the ML model is trained) (para 418). Model deployment may be another stage of the machine learning lifecycle. The model may be deployed as a part of a standalone computer program or as a part of a larger computing system (para 428). The computing system 29443 (the ML model) may block the data from being shared if the user may not be able to give consent (to control access to health data associated with the patient), for example (para 514). – paras 418, 423-425, 428, 514)
determining that input data used to train the ML model indicates the private data; (Shelton discloses that the data collected for machine learning (input data used to train the ML model) may include a patient's medical conditions, biomarker measurement data, and a patient’s EMR (indicates the private data). – paras 423-424)
Shelton does not disclose the following limitations met by Sarferaz:
determining, in response to detecting the change in the consent associated with the private data, to transform the private data; (Sarferaz teaches techniques for managing data used in machine learning applications including managing updates to a use status (e.g., a retention status or a description of any authorized purposes for which the data may be used) (in response to detecting the change in the consent associated with the private data). A data subject may consent to their data being used for some purposes, but not others. Consent for a particular purpose can change over time. Consent can expire, or be withdrawn, in some cases. Thus, disclosed technologies can facilitate restricting data used for machine learning applications to data where the proposed use is consistent with the consent (or other authorization) that exists for the data. One status or action can be data deletion. Data, which may have previously been stored and potentially available for use in machine learning applications, can be marked for deletion, and deleted (transform the private data) or can be “blocked data” which is maintained but not available for use in machine learning (transform the private data). – paras 28-29, 31-35, 43)
generating modified input data…, wherein the modified input data includes the transformed private data; (Sarferaz teaches that the status of data available for use in machine learning applications may change over time (generating modified input data). Certain data may be deleted or blocked (the modified input data includes the transformed private data) to stay consistent with the subject’s consent. – paras 28-29, 31-35, 43)
and adjusting the ML model based on the modified input data to change the access to the health data in accordance with the change in consent. (Sarferaz teaches that the machine learning algorithm 220 can access data, such as for use in generating the trained model 212, or in conducting an analysis using the trained model, through a data view 224. If data is deleted from the application data 232, or removed from the application data and placed in the archive 240, it is no longer available to the data view 224 (based on the modified input data) and thus no longer available to the machine learning algorithm 220 and trained model 212 (adjusting the ML model) because the data view 224 can be used to filter data to be retrieved such that blocked data is not included in data provided to the machine learning algorithm 220 (change the access to the health data in accordance with the change in consent). – paras 33-35, 43, 45-46, 55; FIG. 2)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified Shelton’s disclosure of updating a model as more data becomes available as training data, and further training based on the additional data (see Shelton para 430), to incorporate techniques for managing data used in machine learning applications, including managing updates to a use status of data based on consent, as taught by Sarferaz, in order to determine whether particular data can be used for particular purposes (see Sarferaz para 5).
Shelton and Sarferaz do not disclose the following limitations met by Liang:
generating modified input data based at least on an impact score… (Liang teaches selecting training data to be included in the training data set (generating modified input data) to be input for constructing a predictor from a plurality of training data on the basis of the influence score (based at least on an impact score) of each of plural learning data. – abstract)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the data collection for machine learning as disclosed by Shelton to incorporate calculating an influence score for training data as taught by Liang in order to determine the strength of the effect of the learning data on the prediction accuracy of the predictor/ML model (see Liang page 3, paragraph 18 of copy provided).
Regarding Claim 4, Shelton, Sarferaz and Liang disclose all the limitations above and further disclose the following limitations:
wherein the method further comprises: receiving a request to modify the patient consent data associated with the patient; (Shelton discloses adjustment of consent (modify the patient consent data). The consent of the user may be confirmed when the identity of the user is confirmed and the state of mind of the user indicates that the user is able to provide consent. Data from the sensing system may be sent to the HCP. Examiner notes that “receiving a request to modify” is being interpreted as the confirmation step taught by Shelton because this confirmation indicates that the user is attempting or requesting to modify consent in some way, which requires confirmation of identity and state of mind. – paras 7-8, 434-435, 526)
obtaining patient capacity information associated with the patient; (Shelton discloses that a state of mind of the user may be identified or determined (e.g., a mental state and/or a cognitive state) (patient capacity information). – paras 7-8, 434-435, 526)
determining whether to allow modification of the patient consent data based on the obtained patient capacity information; (Shelton discloses that consent may require a predetermined state of mind. Lack of mental capacity or cognitive impairment (based on the obtained patient capacity information) may prevent adjustment of consent (determining whether to allow modification of the patient consent data – para 453). For example, the consent of the user may be confirmed when the identity of the user is confirmed and the state of mind of the user indicates that the user is able to provide consent (based on the obtained patient capacity information). – paras 434, 453, 466-467)
and generating a response to the request based on the determining. (Shelton discloses that the consent of the user may be confirmed (generating a response). For example, the consent of the user may be confirmed when the identity of the user is confirmed and the state of mind of the user indicates that the user is able to provide consent (based on the determining). – para 434)
Regarding Claim 5, Shelton, Sarferaz and Liang disclose all the limitations above and further disclose the following limitations:
obtaining a subset of the health data; (Shelton discloses receiving (obtaining) measurement data and physical state data (a subset of the health data) associated with a patient. – para 414)
obtaining an indication of a control program for a medical device; (Shelton discloses transmitting the measurement data and physical state data it received from the sensing systems 20336 and/or data associated with the surgical devices 20337 (a medical device) to analytics servers 20338 for processing thereon. The surgical hub 20326 may compare the measurement data from the sensing systems 20336 with one or more thresholds in real-time. The surgical hub 20326 may generate a notification for displaying (indication of a control program). – paras 415-416; FIG. 8)
determining, for the control program, a utilization score for the subset of the health data; (Shelton discloses that the surgical hub 20326 may associate the measurement data (the subset of the health data), e.g., related to a surgeon, with other relevant (a utilization score) pre-surgical data and/or data from situational awareness system to generate control signals for controlling the surgical devices 20337, for example, as illustrated in FIG. 8. – para 414, 416; FIG. 8)
and associating the utilization score with the subset of the health data. (Shelton discloses that relevant pre-surgical data and/or data from situational awareness system may be associated (associating the utilization score) with the measurement data (the subset of the health data). – paras 414, 416)
Regarding Claim 7, Shelton, Sarferaz and Liang disclose all the limitations above and further disclose the following limitations:
further comprising: obtaining an annotation associated with the health data; (Shelton discloses that patient data 29411 are given an activity classification of sleeping at 29414. Sleeping may be classified at 29414 based on patient data 29411 and any additional data that may provide a context (the health data). The patient data 29411 may include HCP notes (an annotation) that a patient was sleeping at the time indicated, for example. The patient data 29411 may include HCP notes that a patient was sedated, for example. – paras 472, 477, 481, 518)
determining, based on the annotation, utilization information associated with the health data; (Shelton discloses that patient data 29411 may be given an activity classification of sleeping at 29414. Sleeping may be classified at 29414 based on patient data 29411 and any additional data that may provide a context (the health data). The patient data 29411 may include HCP notes (the annotation) that a patient was sleeping at the time indicated, for example. The data streams may include electronic medical records, patient data, data from external devices such as wearable devices, etc. Identification of user data may be performed by receiving the data streams and using the one or more incoming data streams (utilization information associated with the health data). – paras 477, 518-519)
and generating a data utilization report comprising a portion of the utilization information. (Shelton discloses that the data stream used to train the ML model is generated/created via data from wearable sensors, measurement devices, EHRs, etc. Identification of user data (utilization information) is performed and an output of the data streams associated with a patient are generated (generating a data utilization report – para 520). – paras 427, 477, 481, 518-520; FIG. 8)
Regarding Claim 8, Shelton, Sarferaz and Liang disclose all the limitations above and further disclose the following limitations:
wherein the method further comprises: receiving an indication that the health data associated with the patient has been accessed; (Shelton discloses that directional measures (the health data associated with the patient) may be sent to a display, a computing system, and/or a device (receiving an indication that the health data…has been accessed). The directional measures are generated and indicate a contextual summary of the one or more cooperative measures. The one or more cooperative measures determined may be related to a physiologic function and/or morbidity and is determined using the biomarkers (i.e., the data streams via medical devices). – paras 502-505)
adding an annotation to the health data, based on the indication; (Shelton discloses that healthcare providers (HCPs) may include notes (adding an annotation) to the patient data (the health data). – paras 472, 477, 481, 518)
determining, based on the annotation, utilization information associated with the health data; (Shelton discloses that patient data 29411 may be given an activity classification of sleeping at 29414. Sleeping may be classified at 29414 based on patient data 29411 and any additional data that may provide a context (the health data). The patient data 29411 may include HCP notes (the annotation) that a patient was sleeping at the time indicated, for example. The data streams may include electronic medical records, patient data, data from external devices such as wearable devices, etc. Identification of user data may be performed by receiving the data streams and using the one or more incoming data streams (utilization information associated with the health data). – paras 477, 518-519)
and generating a patient data utilization report comprising a portion of the utilization information. (Shelton discloses that the data stream used to train the ML model is generated/created via data from wearable sensors, measurement devices, EHRs, etc. Identification of user data (utilization information) is performed and an output of the data streams associated with a patient are generated (generating a data utilization report – para 520). – paras 427, 477, 481, 518-520; FIG. 8)
Regarding Claim 9, Shelton, Sarferaz and Liang disclose all the limitations above and further disclose the following limitations:
wherein the method further comprises: determining that the health data associated with the patient is stored in an external system; (Shelton discloses that data streams and other patient data (the health data associated with the patient) may be stored in a database such as an electronic medical record (stored in an external system). – paras 415, 425, 456, 468)
obtaining a consent key for accessing the health data associated with the patient; (Shelton discloses that the means for ensuring the user is the authorized user (obtaining a consent) may include mechanisms that authenticate specific patients to wearables (a consent key) to reduce data falsification and/or fabrication. A wearable device may be used to authenticate and/or identify a user. For example, a wearable used as a key to other secured treatments may include authentication to access and monitor stored medical records (for accessing the health data). – paras 465-466, 523)
send a request to access the health data associated with the patient, wherein the request indicates the consent key; (Shelton discloses that the wearables may be used as a key for authentication (the request indicates the consent key) to access (send a request to access) and monitor stored medical records (the health data associated with the patient). – paras 465-466, 523)
and accessing the health data associated with the patient from the external system. (Shelton discloses that the wearables may be used as a key for authentication to access (accessing) and monitor stored medical records (the health data). The medical records may be stored in a database such as an electronic medical record (the external system). – paras 465-466, 523)
Regarding Claim 10, this claim recites substantially similar limitations to those recited in claim 1 above; thus, the same rejection applies. Shelton further discloses the following limitations:
A health data system comprising: a processor configured to: (Shelton discloses a computing system (A health data system) for securing and recording consent from a user to communicate with a health care provider. The computing system may comprise a memory and a processor. – paras 7-8)
Regarding Claim 13, this claim recites substantially similar limitations to those recited in claim 4 above; thus, the same rejection applies.
Regarding Claim 14, this claim recites substantially similar limitations to those recited in claim 5 above; thus, the same rejection applies.
Regarding Claim 16, this claim recites substantially similar limitations to those recited in claim 7 above; thus, the same rejection applies.
Regarding Claim 17, this claim recites substantially similar limitations to those recited in claim 8 above; thus, the same rejection applies.
Regarding Claim 18, this claim recites substantially similar limitations to those recited in claim 9 above; thus, the same rejection applies.
Regarding Claim 20, Shelton, Sarferaz and Liang disclose all the limitations above and further disclose the following limitations:
The method of claim 1, further comprising: predicting a change of an output of the ML model based on the change in the consent associated with the private data; (Shelton discloses that through iterative optimization of an objective function (e.g., cost function), a supervised learning algorithm may learn a function (“prediction function”) that may be used to predict the output (predicting a change of an output of the ML model) associated with one or more new inputs (based on the change in the consent associated with the private data). A suitably trained prediction function may determine the output for one or more inputs that may not have been a part of the training data. – para 419)
Shelton and Sarferaz do not disclose the following limitations met by Liang:
wherein the impact score is estimated based on the predicted change of the output. (Liang teaches calculating an influence score (the impact score is estimated) indicating the strength of the influence that the training data has on the prediction accuracy of the predictor/ML model (based on the predicted change of the output). – abstract)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the data collection for machine learning as disclosed by Shelton to incorporate calculating an influence score for training data as taught by Liang in order to determine the strength of the effect of the learning data on the prediction accuracy of the predictor/ML model (see Liang page 3, paragraph 18 of copy provided).
Regarding Claim 22, this claim recites substantially similar limitations to those recited in claim 20 above; thus, the same rejection applies.
Claims 2 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Shelton et al. (US 20220233102) in view of Sarferaz (US 20200380155), in view of Liang et al. (JP 2020030738), further in view of Livesay et al. (US 20190348158).
Regarding Claim 2, Shelton, Sarferaz and Liang disclose all the limitations above and further disclose the following limitations:
receiving a data request for a portion of the health data associated with the patient; (Shelton discloses that a health care provider may request (receiving a data request) access to patient information and/or records (for a portion of the health data associated with the patient). – para 516)
Shelton and Sarferaz do not disclose the following limitations met by Livesay:
retrieving, from the patient consent data, a consent subject to a condition; (Livesay discloses that the consent directive ledger contains information corresponding to the person (i.e., a patient), consent information, and a pointer. For example, the consent information may be a specific type of information that the patient consents to sharing, and the pointer field may indicate one or more entities who have the patient's consent to receive and/or transport the patient's health information. Further, a validity period may be applied to patient consent (a consent subject to a condition). For example, the patient may specify that the consent is to expire on a particular date in the future or after a predetermined duration of time has elapsed. – paras 42, 66-67)
determining whether to block or allow access to the portion of the health data associated with the patient based on the data request and whether the condition has been met; (Livesay discloses that it can be determined that the data sharing organization has permission to share health information related to the patient, based on the query (based on the data request) and the consent record. In some implementations, the consent management module 132 can also determine whether the patient's consent is still valid (i.e., that the consent has not been revoked by the patient or expired at the time the query is received) (whether the condition has been met). – abstract; paras 12, 66, 93-94, 101; FIG. 3 item 320; FIG. 4, item 425)
and generating a response to the data request based on the determining. (Livesay discloses generating a response to the query (generating a response to the data request) after determining whether the data sharing organization has permission to share the patient's health information, based on the query and the patient's stored consent record (based on the determining). – para 94; FIG. 3 item 325)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have further modified requesting access to patient information and/or records as disclosed by Shelton to incorporate a consent directive ledger including consent information regarding access to certain information and/or a validity period as taught by Livesay in order to maintain patient privacy (see Livesay para 1).
Regarding Claim 11, this claim recites substantially similar limitations to those recited in claim 2 above; thus, the same rejection applies.
Claims 3 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Shelton et al. (US 20220233102) in view of Sarferaz (US 20200380155), in view of Liang et al. (JP 2020030738), further in view of Yousfi et al. (US 20170372096).
Regarding Claim 3, Shelton, Sarferaz and Liang disclose all the limitations above, however, do not disclose the following limitations met by Yousfi:
wherein the method further comprises: identifying, in the health data, a dataset associated with a patient data display interface; (Yousfi teaches data collection devices that obtain patient-specific data. Exemplary data collection device(s) 110 may thus include imaging modalities, wearables, blood pressure cuffs, thermometers, fitness trackers, glucometers, heart rate monitors, etc. The patient-specific anatomical data obtained (e.g., by data collection device(s) 110) (a dataset) may be transferred over a secure communication line (e.g., via a network). For example, the data may be transferred to a server or other computer system including a user interface (a patient data display interface). In an exemplary embodiment, the data may be transferred to a server or other computer system operated by a service provider providing a web-based service, e.g., cloud platform 113. Alternatively, the data may be transferred to a computer system operated by the patient's physician or other user, e.g., for analysis, viewing, or storage within the first region. – paras 43-46, 48)
identifying a relevant controlling region associated with the dataset; (Yousfi teaches that a cloud platform in a first region (a relevant controlling region) may receive health data. The health data may include patient case/health information with or without patient privacy information. The health data (the dataset) may include data produced, generated, or received at a first region (region associated with the dataset). The first region cloud platform 113 may identify which patient is associated with the received analyzed data, and provide access to the analyzed data by consumer device(s) 130. It is understood that the first region cloud platform may reside in any region and the patient privacy information may correspond to the patient privacy regulations and standards of that respective first region. – paras 28, 47, 55; FIG. 1)
determining a region-based consent requirement associated with the dataset based on the relevant controlling region; (Yousfi teaches that it is understood that the first region cloud platform may reside in any region and the patient privacy information may correspond to the patient privacy regulations and standards (a region-based consent requirement associated with the dataset) of that respective first region (based on the relevant controlling region). – paras 28, 47, 55)
and determining whether to display the dataset via the patient data display interface based on the determined region-based consent requirement and the patient consent data. (Yousfi teaches an embodiment may include three regions, e.g., Europe, the United States (US), and Japan. Europe may serve as the first region where patient health data (including image data) is collected. All patient privacy information associated with the patient health data may remain in Europe, while collected image data is transmitted to the US for analysis. An analysis of the image data may be generated in the US and transmitted back to Europe. A physician may log into a portal (the patient data display interface) and access the analyzed data for his/her patient in Europe. However, if the physician (or a partnering health care professional) in Japan visits the portal, the portal may show that analyzed data is available, without revealing patient privacy information associated with the data. In one instance, the analyzed data may be accessible or viewable in Japan, but without any identifying information for the patient associated with the analyzed data. The patient privacy information may be exclusive to data access in Europe, since the patient privacy information may not be transferred outside the border of the first region (e.g., Europe, in this example) (the determined region-based consent requirement and the patient consent data). – para 55)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have further modified receiving consent from a user as disclosed by Shelton to incorporate receiving consent information for particular regions as taught by Yousfi in order to preserve patient privacy while still permitting data collected in one region to be transmitted and analyzed in a second region (see Yousfi para 4).
Regarding Claim 12, this claim recites substantially similar limitations to those recited in claim 3 above; thus, the same rejection applies.
Claims 19 and 21 are rejected under 35 U.S.C. 103 as being unpatentable over Shelton et al. (US 20220233102) in view of Sarferaz (US 20200380155), in view of Liang et al. (JP 2020030738), further in view of Takahash et al. (US 20180012039).
Regarding Claim 19, Shelton, Sarferaz and Liang disclose all the limitations above, however do not disclose the following limitations met by Takahash:
The method of claim 1, further comprising: based on the determination to transform the private data, generating synthetic data associated with the private data, wherein the modified input data is generated further based on the synthetic data. (Takahash teaches an anonymization processing device that anonymizes input data and outputs anonymized output data. Figures 6A-6B show example data before anonymization and Figures 10A-10B show the same example data after anonymization. The anonymization process management information includes information to have the data before anonymization associated with the data after anonymization (generating synthetic data associated with the private data), and to represent the state of progress of an anonymization process and the anonymization method and the anonymization level. After anonymization, the data is replaced by more generic data (the modified input data is generated further based on the synthetic data). - abstract; paras 53, 113; FIG. 6A-6B; FIG. 10A-10B)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have further modified removing identifying data as disclosed by Shelton to incorporate data anonymization as taught by Takahash in order to execute data analysis while protecting individual privacy (see Takahash para 2).
Regarding Claim 21, this claim recites substantially similar limitations to those recited in claim 19 above; thus, the same rejection applies.
Relevant Prior Art of Record Not Currently Being Applied
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Califano (US 20030033168) discloses systems and methods providing a dynamic process for obtaining and managing informed consent documentation. The process allows the human subject to change consent stored in the data memory in response to a request to change the consent, or, optionally, at their own volition and unprompted. The human subject can change the consent data in any manner that they choose, including expanding the granted level of access and rights to the stored data, reducing the granted level of access and rights, and eliminating altogether the ability to access or use the data.
Iyood et al. (US 20210133557) discloses computer systems, methods and program products for automating pseudonymization of personal identifying information (PII) using machine learning, metadata, and crowdsourcing patterns to identify and replace PII.
Response to Arguments
Regarding rejections under 35 USC § 101 to Claims 1-5, 7-14 and 16-22, Applicant’s arguments have been fully considered, and are not persuasive. The rejection has been updated in light of the latest amendments. Applicant argues:
(a) Applicant respectfully asserts that claim 1 does not manage a personal behavior, and thus, it cannot be said to recite certain methods or organizing human activity. Instead, the method of claim 1 recites limitations that cannot be performed between people, including for example, "adjusting the ML model based on the modified input data to change the access to the health data in accordance with the change in the consent." Because the limitations of claim 1 do not include social activities, teaching, or following rules or instructions, the claims do not recite an abstract idea under the first prong of Step 2A. (p. 12).
Regarding (a), Examiner respectfully disagrees. MPEP 2106.04(a)(2)(II) states that a claimed invention is directed to certain methods of organizing human activity if the identified claim elements contain limitations that encompass fundamental economic behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions). Examiner notes that the ML model was identified as an additional element and not part of the abstract idea.
The Examiner submits that the identified claim elements represent a series of rules or instructions that a person or persons, with or without the aid of a computer, would follow to manage the use and access of patient health data based on a patient’s consent data. The Applicant has not pointed to anything in the claims that falls outside of this characterization. Because the claim elements fall under a series of rules or instructions that a person or persons would follow to manage the use and access of patient health data based on a patient’s consent data, the claimed invention is directed to an abstract idea.
(b) A "claim that integrates a judicial exception into a practical application will apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the judicial exception." Limitations the courts have found indicative that an additional element (or combination of elements) may have integrated the exception into a practical application include an improvement to other technology or technical field and implementing the judicial exception with a particular machine. Applicant respectfully submits that the elements of amended Claim 1 reflect an integration of any supposed abstract idea into a practical application for at least the reasons explained above. (p. 12).
Regarding (b), Examiner respectfully disagrees. MPEP 2106.04(d)(2) indicates that a practical application may be present where the judicial exception is implemented using or in conjunction with a particular machine or manufacture. The instant claims do not recite a particular machine and, instead, recite that the abstract idea is implemented by a general-purpose computer (see Applicant’s Spec. paras 22, 67, 73, 80-82, 84-87, 111-112, 404). MPEP 2106.05(b)(I) indicates that applying the judicial exception “by use of conventional computer functions does not qualify as a particular machine.” Because there is no particularity with respect to the computer that implements the abstract idea, thus requiring the Examiner to conclude that the abstract idea is implemented by a general-purpose computer, a practical application is not present.
(c) Further, the elements of amended Claim 1 reflect an integration of any supposed abstract idea into a practical application because they comprise an improvement to at least the technical field of managing access to healthcare data via a neural network, for example, to provide improved accuracy and access based on patient consent. (p. 12-13).
Regarding (c), Examiner respectfully disagrees. MPEP 2106.04(d)(1) states "the word 'improvements' in the context of this consideration is limited to improvements to the functioning of a computer or any other technology/technical field, whether in Step 2A Prong Two or in Step 2B." Here there is no improvement to the computer nor is there an improvement to another technology. Because neither type of improvement is present in the claims, an improvement to technology is not present and there is no practical application. Further, Examiner notes that the stated improvement of managing access to healthcare data via a neural network is interpreted as not being rooted in technology. The problems are neither caused by nor related to computer technology, and the claims do not provide any limitations that may be interpreted as technical improvements to computer technology. Rather, the claimed invention is using a computer (i.e., the “neural network”) as a tool, and any improvement present is an improvement to the abstract idea of, to paraphrase, managing the use and access of patient health data based on a patient’s consent data.
(d) Applicant respectfully submits that amended Claim 1 relates at least to the improvement of the autonomous operation of a surgical instrument based on the monitored performance of a surgical task as described herein. (p. 13).
Regarding (d), Examiner respectfully disagrees. MPEP 2106.04(d)(1) and MPEP 2106.05(a) indicate that a practical application may be present where the claimed invention provides a technical solution to a technical problem. See, e.g., DDR Holdings, LLC v. Hotels.com, L.P., 773 F.3d 1245, 1259 (Fed. Cir. 2014) (finding that claiming a website that retained the "look and feel" of a host webpage provided a technological solution to the problem of retention of website visitors by utilizing a website descriptor that emulated the "look and feel" of the host webpage, where the problem arose out of the internet and was thus a technical problem). Here, while