DETAILED CORRESPONDENCE
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This non-final first Office action on the merits is in response to the patent application filed on 5 November 2024. Claims 1-20 are pending and are considered below.
Priority
Applicant’s claim for the benefit of a prior-filed application under 35 U.S.C. 119(e) or under 35 U.S.C. 120, 121, or 365(c) is acknowledged. Applicant’s claim of priority to the German application DE10-2023 211 015.6, filed 7 November 2023, is acknowledged. Therefore, the instant invention is afforded a priority date of 7 November 2023.
Claim Rejections - 35 USC § 101
Examiner has evaluated the instant invention under the requirements of the 2019 PEG Revised Step 2A, Prongs One and Two, and the requirements of MPEP 2106, and has determined that, although the instant invention recites a judicial exception, it is further directed to a practical application or improvement to computer functioning and is therefore eligible under the requirements of the statute.
With respect to the requirements of the 2019 PEG Revised Step 2A, Prong One, and the requirements of MPEP 2106, Examiner has determined that the instant invention recites a judicial exception: an abstract idea similar to certain methods of organizing human activity, including managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions), as well as mental processes, i.e., concepts performed in the human mind, including observation, evaluation, judgment, or opinion.
With respect to the requirements of the 2019 PEG Revised Step 2A, Prong Two, and the requirements of MPEP 2106, Examiner has determined that the instant invention integrates the recited exception into a practical application or improvement to computing technology. Examiner’s conclusion is guided by the disclosures of the written description at paragraphs [43]-[57], which specifically detail the performance of model training as well as the coordination of encrypted and decrypted data processing schemes that are unique to particular facilities and to particular vendor systems, together with the processing and coordination of the encrypted data. Performance of the claimed invention optimizes data protection and results in an improvement to data security and management.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-5, 8-16, and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Crabtree et al. (20250349399) in view of Vizitiu et al., Applying Deep Neural Networks over Homomorphic Encrypted Medical Data (hereinafter Vizitiu), and in further view of Najarian et al. (20250266983).
Claims 1, 15 and 20: Crabtree discloses a computer-implemented method, storage medium, and system of generating a protocol for a medical imaging device installed at a specific site, the computer-implemented method comprising:
providing a homomorphic encrypted large-language model trained to generate a homomorphic encrypted medical imaging protocol ([116 “data could include information from medical tests, imaging studies, and patient questionnaires), tech companies (e.g., wearable technology industry), government agencies, and consumer research firms,” 117, 124 “encryption platform 121 may implement homomorphic encryption when processing or otherwise analyzing personal health information. In this way, the system can provide processing of encrypted data without having to decrypt and potentially leak personal information,” 139 “Medical data 250 encompasses a range of health-related information….imaging studies (e.g., x-rays, ultrasound, magnetic resonance imaging, computed tomography scan, etc.), electrocardiogram, and various biometric measurements (e.g., heart rate variability, electrodermal activity, pulse oximetry, etc.),” 149, 150, 157 “Large Language Model (LLM) computing 370 refers to the utilization of advanced computational systems to train, deploy, and utilize large language models. These models are typically built using deep learning techniques and have the capability to understand, generate, and process human language at a sophisticated level,”]);
obtaining patient-specific input data for an intended medical imaging task ([116 “data could include information from medical tests, imaging studies, and patient questionnaires), tech companies (e.g., wearable technology industry), government agencies, and consumer research firms,” 117, 295 “incorporating patient-specific anatomical data, these models can predict areas of abnormal flow, potential for plaque formation in arteries, or efficiency of gas exchange in the lungs,” 336 “integrating this advanced neurosymbolic AI subsystem 3600, the PHDB system significantly enhances its capabilities in personal health management. It can provide users with more accurate, explainable, and actionable health insights, supporting both day-to-day health management and complex medical decision-making,” 339 “Patient context 3750 ensures that all analyses and recommendations are tailored to the individual user's specific circumstances. It considers factors like the patient's medical history, lifestyle, preferences, and environmental factors, allowing for highly personalized insights,”]);
performing homomorphic encryption on the patient-specific input data ([123, 124 “encryption platform 121 may implement homomorphic encryption when processing or otherwise analyzing personal health information. In this way, the system can provide processing of encrypted data without having to decrypt and potentially leak personal information,” 244]);
applying the homomorphic encrypted patient-specific input data to the homomorphic encrypted large-language model ([112, 157, 220]) to generate the homomorphic encrypted medical imaging protocol ([117 “treatment plans (e.g., information about the patient's treatment plan, including any medications, therapies, or procedures that have been prescribed), progress notes (e.g., notes on the patient's progress, including any changes in their condition, response to treatment, or other relevant information), diagnostic tests (e.g., information about any diagnostic tests that have been performed, such as blood tests, imaging studies, or biopsies, and the results of those tests),” 149 “system can engage in express parametric studies (e.g., initial seed manipulation) to look at potential impacts of various factors/constraints such as imaging, sampling, extraction, errors and uncertainty for statistical, ML/AI, or modelling simulation processes (e.g., diagnostics, treatment advisory, treatment calibration, etc.),” 213 “kinds of tasks that may be used in combination with a neural network are potentially unlimited so long as the problem is deterministic, but common applications include classification problems, labeling problems, compression or algorithm parameter tuning problems, image or audio recognition, and natural language processing. Neural networks may be used as part of a machine learning engine, as the method by which training is done and a model is generated,” 214, 216 “a neural network component processes raw data, such as images, text, or sensor inputs, to extract relevant features and learn patterns. This component is responsible for tasks like image recognition, natural language processing, or sensor data analysis. The symbolic reasoning component manipulates abstract symbols and rules to perform tasks that require logical reasoning or explicit knowledge representation. 
This component is responsible for tasks like logical inference, planning, or knowledge representation,” 244 “identifying a match using homomorphic encryption with a PHDB, according to an aspect of the invention. According to the aspect, the process begins at step 2210 as the user attempts to find a match for themselves in a specific domain, such as dating, blood or organ donor, gaming partner, etc. The PHDB data is then subjected to homomorphic encryption by an encryption engine at step 2220. In an embodiment, the homomorphic encryption scheme is partially homomorphic encryption. In an embodiment, the homomorphic encryption scheme is fully homomorphic encryption,”]);
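Several of the limitations mapped above turn on homomorphic evaluation, i.e., performing computations on ciphertexts and obtaining an encrypted result, as Crabtree describes at [112] and [124]. For context, this property can be sketched with a toy Paillier-style additively homomorphic scheme; the scheme choice and the tiny parameters below are illustrative only and are not drawn from any cited reference.

```python
import math
import random

def lcm(a, b):
    return a * b // math.gcd(a, b)

# Toy Paillier keypair. The primes here are far too small for real use
# (illustrative only); deployed systems use primes of 1024+ bits.
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1                                   # standard simple generator choice
lam = lcm(p - 1, q - 1)
# mu = L(g^lam mod n^2)^-1 mod n, where L(x) = (x - 1) // n
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m):
    """Encrypt plaintext m < n with fresh randomness r coprime to n."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Recover the plaintext using the secret values lam and mu."""
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts,
# and the sum is computed without any intermediate decryption.
c1, c2 = encrypt(17), encrypt(25)
c_sum = (c1 * c2) % n2
assert decrypt(c_sum) == 42
```

The evaluator holding only `c1` and `c2` never sees 17 or 25; only the holder of the secret values (`lam`, `mu`) can decrypt the combined result, which mirrors the "processing of encrypted data without having to decrypt" language cited from Crabtree.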
Crabtree does not explicitly disclose the following; however, Vizitiu discloses:
performing homomorphic decryption (Page 5 “ensure that the message can only be identified through proper decryption and by possessing the secret key,” Page 8 “results in a model that provides encrypted predictions, which can only be decrypted by the owner of the secret key,” Page 17 “training loss for the regression task, as resulted after decryption, is depicted in Figure 10(a). Similarly, the evolution of the training and validation accuracy of the privacy-preserving CNN model fed with encrypted X-ray coronary angiographies, obtained after decryption,”) on the generated homomorphic encrypted medical imaging protocol to obtain a plain-text medical imaging protocol (Page 2 “detailed description of privacy-preserving techniques for machine learning, homomorphic encryption,” Page 4 “additive homomorphic scheme where addition in the ciphertext space corresponds to multiplication in the plaintext space, and the ElGamal scheme [44], a multiplicative homomorphic scheme, which, through some modifications,” Page 4 “a plaintext scalar is encrypted as a n × n ciphertext matrix, and matrix algebra is employed to enable computations on ciphertext data. All operations performed on ciphertext data are therefore defined as matrix operations, e.g., the multiplication of plaintext scalars is formulated as the matrix multiplication of ciphertext matrices,” Page 4 “employing partially homomorphic encryption (PHE) instead of FHE. 
Since FHE is currently practically impossible to be used in a realworld system, a viable approach is a system based on PHE that is specialized only for certain operations,” Page 7 “deep convolutional neural network (CNN) architecture was proposed to enable feature learning directly from the input images….amount of available data, and the formulated problems, herein we focus on a FCNN for solving a regression task and on a CNN for image-based analysis,” Page 8 “access only to the encrypted version of the data (ciphertext), while the actual data (plaintext) are detached from the processing unit and remain private on the side of the data provider. Finally, with the homomorphic property underlying the MORE encryption scheme, the direct support for floating-point arithmetic,” Page 12 “main imaging modality for the diagnosis of coronary artery disease (CAD) is invasive X-ray coronary angiography (ICA) [74]. It allows for a comprehensive assessment of both the function and the structure of the heart. During the invasive procedure, a dye with radio opaque characteristics is inserted in the coronary vessels and a set of images is recorded in succession by an X-ray scanner,” Page 14 “class labels or real-valued quantities, were also encrypted, except for the binary classification problem where the target was given as plaintext. We chose to encrypt only the input data, i.e., the coronary angiography images, and leave the target, i.e., binary label 0 or 1, as plaintext to show that training can as well be performed if labels are formulated as plaintext,”); Examiner Note: Under a broadest reasonable interpretation, Examiner considers Vizitiu's disclosures of homomorphic decryption, of the processing of plain-text data derived from the images, and of the application of the disclosed processing to medical images and their associated collected data to disclose the above limitation.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Crabtree to perform homomorphic decryption on the generated homomorphic encrypted medical imaging protocol to obtain a plain-text medical imaging protocol, as disclosed by Vizitiu, in view of Vizitiu's implementation of the collection of homomorphic encrypted data for the purpose of interpreting the contents of medical images and developing treatment protocols for patients.
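Vizitiu's reliance on decryption with a secret key, and its citation of ElGamal as a multiplicatively homomorphic scheme, can likewise be sketched in miniature. The toy textbook-ElGamal parameters below are illustrative only and are not drawn from the reference; real deployments require a large prime-order group or an elliptic curve.

```python
import random

# Toy multiplicative ElGamal over a small prime field (demonstration only).
p = 30803                        # small prime modulus
g = 2                            # public base
x = random.randrange(2, p - 2)   # secret key, held by the data owner
h = pow(g, x, p)                 # public key

def encrypt(m):
    """Encrypt 0 < m < p under the public key h."""
    r = random.randrange(2, p - 2)
    return (pow(g, r, p), (m * pow(h, r, p)) % p)

def decrypt(c):
    """Only the holder of the secret key x can recover m = b / a^x mod p."""
    a, b = c
    return (b * pow(a, p - 1 - x, p)) % p

def mult(c1, c2):
    # Component-wise product of ciphertexts multiplies the plaintexts.
    return ((c1[0] * c2[0]) % p, (c1[1] * c2[1]) % p)

# An evaluator combines ciphertexts without the secret key; decryption
# of the combined result is possible only with x, matching Vizitiu's
# "can only be decrypted by the owner of the secret key."
c = mult(encrypt(6), encrypt(7))
assert decrypt(c) == 42
```

The split between homomorphic evaluation (no key needed) and final decryption (secret key required) is the division of labor the rejection attributes to the combination of Crabtree's encrypted processing and Vizitiu's decryption step.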
Crabtree does not explicitly disclose the following; however, Najarian discloses:
the homomorphic encryption and the homomorphic decryption ([34 “analyzing, by the one or more processors executing an encrypted ML model that is encrypted in accordance with a second fully homomorphic encryption technique, the encrypted data to output an encrypted classification without decrypting the encrypted data, and/or transmitting, by the one or more processors, the encrypted classification to a user computing device for decryption,”]) utilize a first encryption scheme specific to the specific site and a second encryption scheme specific to a vendor ([37 “example environment 100 may be a distributed computing environment where the encrypted data and encrypted ML model are located/stored in any suitable location(s) and may be transmitted across a network 116 to the various components, as necessary. In particular, the example environment 100 includes a central server 104, a model server 106, and a user device 108. Broadly, the user device 108 may store and transmit encrypted user data 108b1 to the central server 104 across the network 116, and the central server 104 may also receive an encrypted,” 38 “computations performed by FHE trained ML models herein may be performed, in some architectures, across decentralized devices or servers. In some such examples, each distributed device or server may include an FHE trained ML model and a central server may coordinate operations across these distributed devices,” 40 “central server 104 may include a processor 104a, a memory 104b, and a networking interface 104c. The memory 104b may include a ML module 104b1 and a training module 104b2. The model server 106 may include a training module 106a, a ML model 106b, and encryption instructions 106c. 
The user device 108 may include a processor 108a, a memory 108b, a networking interface 108c, and an input/output (I/O) interface 108d,” 44 “central server 104 may include a ML module 104b1 and a training module 104b2 that may generally be configured to collectively train, re-train, and/or update the ML model 106b. The ML module 104b1 may receive input encrypted user data 108b1 (e.g., data from a hospital) and may execute the encrypted ML model 106b to generate encrypted classification(s). The training module 104b2 may utilize new input data (e.g., new encrypted data) to re-train/update the ML model 106b after training at the model server 106 to generate outputs (e.g., classifications) based on the prior training dataset and the new input data,” 46 “example system 120 of FIG. 1B includes the central server 104 receiving data transmissions from the model server 106 and the user device 108 and generating and transmitting outputs to the model server 106 and the user device 108 based on those data transmissions,” 47 “model server 106 may transmit an encrypted ML model 106b to the central server 104, and the user device 108 may transmit encrypted data 108b1 to the central server 104. The central server 104 may analyze the encrypted data 108b1 using the encrypted ML model 106b to generate an encrypted classification, which the central server 104 may transmit to the user device 108 for decryption, in accordance with the encryption instructions,” 48, 55 “user device 108 and the model server 106 may execute the encryption instructions 108b2, 106c to encrypt/decrypt the data transmitted to the central server 104 and/or received from the central server 104. 
For example, the user device 108 may execute the encryption instructions 108b2 to encrypt the user data 108b1 using a first fully homomorphic encryption technique prior to transmitting the user data 108b1 to the central server 104 and to decrypt the encrypted classification received from the central server,” 57 “model server 106 may execute the encryption instructions 106c to encrypt the ML model 106b using a second fully homomorphic encryption technique prior to transmitting the ML model 106b to the central server 104 and to decrypt any updates to the ML model 106b or other data received from the central server 104. In certain embodiments, the first fully homomorphic encryption technique may be different from the second fully homomorphic encryption technique,” 98 “method 200 further includes receiving, from each user of a plurality of users, a training dataset encrypted using an independently generated key pair, in accordance with a multi-key, multi-hop fully homomorphic encryption technique. The example method 200 may further include training, with the training datasets, a ML model using a ML training technique and receiving encrypted test datasets from one or more users of the plurality of users. 
The example method 200 may further include generating, by executing the ML model, encrypted outputs for each encrypted test dataset, and causing user computing devices of each of the one or more users to participate in on-the-fly, multiparty computation to decrypt one or more respective encrypted outputs of the encrypted outputs by transmitting the encrypted outputs to each respective user of the one or more users,”]) Examiner Note: Under a broadest reasonable interpretation, Examiner interprets the above-cited disclosures of Najarian, which coordinate homomorphic encryption information across an environment in which the “example environment 100 includes a central server 104, a model server 106, and a user device,” as disclosing encryption schemes or methods tied to specific devices, namely a coordinating server system and associated independent user devices at a wide variety of locations. The claimed system of Najarian thus discloses delivery of information specific to particular locations and to the conditions of system users and coordinators of services.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Crabtree such that the homomorphic encryption and the homomorphic decryption utilize a first encryption scheme specific to the specific site and a second encryption scheme specific to a vendor, as disclosed by Najarian, in view of Najarian's implementation of the collection of homomorphic encrypted data for the purpose of interpreting the contents of medical images and developing treatment protocols for patients.
Claim 2: Crabtree in view of Vizitiu and Najarian discloses the computer-implemented method according to claim 1 and Crabtree further discloses presenting the plain-text medical imaging protocol to a user ([179 “homomorphic encryption, which is a type of homomorphic encryption that allows for computations on either the ciphertext or the plaintext,” 183 “processing that may be performed on genomic data by healthcare server 1022 can include, but are not limited to, whole genome sequencing, whole exome sequencing, genetic testing for specific conditions, pharmacogenomic testing, carrier screening, genomic tumor profiling, and general genomic counseling. Healthcare server 1022 can perform these types of computations and store the results on distributed ledger,” 185 “integrates homomorphic encryption, blockchain-powered token generation, and distribution ledger technologies to ensure secure, private, and transparent transactions in the context of personal health data (PHI) exchange,” 213 “combination with a neural network are potentially unlimited so long as the problem is deterministic, but common applications include classification problems, labeling problems, compression or algorithm parameter tuning problems, image or audio recognition, and natural language processing,” 327 “pattern recognition analyzer continuously scans the developing model, identifying significant patterns, trends, or anomalies. These insights are incorporated into the visual display, helping to draw the user's attention to important aspects of their health data. This method enables the PHDB system to transform diverse, complex health data into a coherent, informative, and interactive spatiotemporal model pattern,”]).
Claim 3: Crabtree in view of Vizitiu and Najarian discloses the computer-implemented method according to claim 1. Crabtree does not disclose the following; however, Najarian discloses wherein the homomorphic encrypted large-language model is provided by the vendor ([37 “example environment 100 may be a distributed computing environment where the encrypted data and encrypted ML model are located/stored in any suitable location(s) and may be transmitted across a network 116 to the various components, as necessary. In particular, the example environment 100 includes a central server 104, a model server 106, and a user device 108. Broadly, the user device 108 may store and transmit encrypted user data 108b1 to the central server 104 across the network 116, and the central server 104 may also receive an encrypted,”]). Examiner Note: Under a broadest reasonable interpretation, Examiner interprets the above-cited disclosures of Najarian, which coordinate homomorphic encryption information across an environment in which the “example environment 100 includes a central server 104, a model server 106, and a user device,” as disclosing encryption schemes or methods tied to specific devices, namely a coordinating server system and associated independent user devices at a wide variety of locations. The claimed system of Najarian thus discloses delivery of information specific to particular locations and to the conditions of system users and coordinators of services, which Examiner interprets to include vendors and associated system users.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Crabtree such that the homomorphic encrypted large-language model is provided by the vendor, as disclosed by Najarian, in view of Najarian's implementation of the collection of homomorphic encrypted data for the purpose of interpreting the contents of medical images and developing treatment protocols for patients.
Claims 4, 9, 16, and 18: Crabtree in view of Vizitiu and Najarian discloses the computer-implemented method according to claims 1 and 2, and Crabtree further discloses training a large-language model for use in the computer-implemented method, wherein the computer-implemented method of training comprises:
modifying a large-language model to include a homomorphic encrypted layer architecture ([157 “Large Language Model (LLM) computing 370 refers to the utilization of advanced computational systems to train, deploy, and utilize large language models,” 179 “Homomorphic encryption engine (HEE) 920 functions as a cryptographic system providing homomorphic encryption of personal health information or other data requested from PHDB cloud services 101. The use of a homomorphic encryption engine enables computations that can be executed on encrypted data without requiring decryption,” 209 “exemplary architecture of a machine learning engine. A machine learning engine 1510 may be a software component, standalone software library, system on a chip, application-specific integrated circuit (“ASIC”), or some other form of digital computing device or system capable of interacting with and receiving data from other digital or software systems,” 213, 215, 216]);
obtaining homomorphic encrypted training data based on imaging protocols for the medical imaging device installed at the specific site ([179 “Homomorphic encryption engine,” 283 “data streams provide continuous updates on various physiological parameters, activity levels, and environmental exposures. Additionally, the subsystem can incorporate data from labs 160 and third-party services 130, which may include detailed medical imaging, test results, and specialized health assessments,” 331 “neural network components excel at pattern recognition and handling unstructured data, such as identifying trends in a user's blood pressure readings or recognizing potential early signs of disease from medical imaging,” 337 “Structured data might include numerical health metrics, lab results, or clearly formatted electronic health records. Unstructured data could encompass free-text clinical notes, medical images, or raw sensor data from wearable devices,” 338 “process natural language in clinical notes to identify key health indicators or analyze medical images to detect potential abnormalities,” 345 “using natural language processing to identify key health indicators in clinical notes or applying computer vision algorithms to detect anomalies in medical images,” 352 “subsystem is capable of processing and integrating diverse data types, including structured data (like numerical health metrics and lab results), unstructured data (such as clinical notes and medical images), and real-time data streams from IoT devices,”]);
training the homomorphic encrypted large-language model on the homomorphic encrypted training data to generate a homomorphic encrypted medical imaging protocol ([112 “homomorphic encryption” refers to the cryptographic technique that allows computations to be performed on encrypted data without decrypting it first. It enables certain operations to be carried out on encrypted data while in ciphertext as if it were still in its original form, and the results are obtained in encrypted form,” 119 “PHDB mobile device 110a-n may comprise a plurality of sensors which may be used to monitor and capture various biometric, behavioral, and/or physiological data associated with the owner (end user) of the PHDB mobile device. Captured sensors data may be stored in PHDB 111a either in raw data form, or in a format suitable for storage after one or more data processing operations,” 124 “encryption platform 121 may implement homomorphic encryption when processing or otherwise analyzing personal health information. In this way, the system can provide processing of encrypted data without having to decrypt and potentially leak personal information,” 125, 129 “can include using strong encryption algorithms such as AES, and in some implementations, homomorphic encryption. Personal health information can be segmented into categories (e.g., medical records, lab results, prescriptions, genome data, microbiome data, phenotype data, biometric data, activity data, preference data, etc.) to facilitate access control and data management,” 157 “Large Language Model (LLM) computing 370 refers to the utilization of advanced computational systems to train, deploy, and utilize large language models,” 244 “match using homomorphic encryption with a PHDB, according to an aspect of the invention. According to the aspect, the process begins at step 2210 as the user attempts to find a match for themselves in a specific domain, such as dating, blood or organ donor, gaming partner, etc. 
The PHDB data is then subjected to homomorphic encryption by an encryption engine at step 2220. In an embodiment, the homomorphic encryption scheme is partially homomorphic encryption. In an embodiment, the homomorphic encryption scheme is fully homomorphic encryption. This encryption process ensures that sensitive health-related information remains confidential while still allowing for meaningful comparisons and searches,” 280 “feature allows for parametric studies that explore the potential impacts of various factors, including imaging techniques, sampling methods, and environmental exposures. Such simulations can aid in diagnostics, treatment advisory, and treatment calibration, offering a more nuanced and personalized approach to healthcare,” 283 “subsystem can incorporate data from labs 160 and third-party services 130, which may include detailed medical imaging, test results, and specialized health assessments,” 331 “neural network components excel at pattern recognition and handling unstructured data, such as identifying trends in a user's blood pressure readings or recognizing potential early signs of disease from medical imaging. Meanwhile, the symbolic AI components incorporate medical knowledge and logical rules, allowing for deductive reasoning based on established healthcare guidelines and protocols,” 337, 338]).
Claim 5: Crabtree in view of Vizitiu and Najarian discloses the computer-implemented method according to claim 4 above and Crabtree further discloses wherein the homomorphic encrypted training data is generated by
obtaining protocol data for the medical imaging device installed at the specific site ([235 “approach enhances the scalability and flexibility of the system, allowing for the integration of diverse IoT Devices with different communication protocols and data formats,” 240 “secure communication protocols and encryption mechanisms guarantee the confidentiality and integrity of the shared data, fostering a trustful and privacy aware environment for the user,” 243 “ensures that the central repository of health data, accessible through cloud services, is synchronized with the user's updated information while adhering to established security protocols,” 331 “symbolic AI components incorporate medical knowledge and logical rules, allowing for deductive reasoning based on established healthcare guidelines and protocols,” 345 “could involve using natural language processing to identify key health indicators in clinical notes or applying computer vision algorithms to detect anomalies in medical images,” 352 “processing and integrating diverse data types, including structured data (like numerical health metrics and lab results), unstructured data (such as clinical notes and medical images), and real-time data streams from IoT devices,” 356 “computer vision techniques to analyze medical images with natural language processing to interpret clinical notes, providing a more comprehensive understanding of a patient's condition than either data type could offer alone,” 366 “[c]ombine numerical data from lab tests, textual data from clinical notes, imaging data from diagnostic procedures, and continuous streams of sensor data from wearable devices to create a holistic view of a patient's health status,” 376]);
generating training data from the protocol data ([406 “historical data and outcomes to train and refine the model, ensuring that predictions become more accurate as more data becomes available. The training system might employ techniques like cross-validation and hyperparameter tuning to optimize the model's performance,” 409-418]);
Crabtree does not explicitly disclose; however, Najarian discloses:
performing homomorphic encryption on the training data using the first encryption scheme and the second encryption scheme ([37 “example environment 100 may be a distributed computing environment where the encrypted data and encrypted ML model are located/stored in any suitable location(s) and may be transmitted across a network 116 to the various components, as necessary. In particular, the example environment 100 includes a central server 104, a model server 106, and a user device 108. Broadly, the user device 108 may store and transmit encrypted user data 108b1 to the central server 104 across the network 116, and the central server 104 may also receive an encrypted,” 38 “computations performed by FHE trained ML models herein may be performed, in some architectures, across decentralized devices or servers. In some such examples, each distributed device or server may include an FHE trained ML model and a central server may coordinate operations across these distributed devices,” 40 “central server 104 may include a processor 104a, a memory 104b, and a networking interface 104c. The memory 104b may include a ML module 104b1 and a training module 104b2. The model server 106 may include a training module 106a, a ML model 106b, and encryption instructions 106c. The user device 108 may include a processor 108a, a memory 108b, a networking interface 108c, and an input/output (I/O) interface 108d,” 44 “central server 104 may include a ML module 104b1 and a training module 104b2 that may generally be configured to collectively train, re-train, and/or update the ML model 106b. The ML module 104b1 may receive input encrypted user data 108b1 (e.g., data from a hospital) and may execute the encrypted ML model 106b to generate encrypted classification(s). 
The training module 104b2 may utilize new input data (e.g., new encrypted data) to re-train/update the ML model 106b after training at the model server 106 to generate outputs (e.g., classifications) based on the prior training dataset and the new input data,” 46 “example system 120 of FIG. 1B includes the central server 104 receiving data transmissions from the model server 106 and the user device 108 and generating and transmitting outputs to the model server 106 and the user device 108 based on those data transmissions,” 47 “model server 106 may transmit an encrypted ML model 106b to the central server 104, and the user device 108 may transmit encrypted data 108b1 to the central server 104. The central server 104 may analyze the encrypted data 108b1 using the encrypted ML model 106b to generate an encrypted classification, which the central server 104 may transmit to the user device 108 for decryption, in accordance with the encryption instructions,” 48, 55 “user device 108 and the model server 106 may execute the encryption instructions 108b2, 106c to encrypt/decrypt the data transmitted to the central server 104 and/or received from the central server 104. For example, the user device 108 may execute the encryption instructions 108b2 to encrypt the user data 108b1 using a first fully homomorphic encryption technique prior to transmitting the user data 108b1 to the central server 104 and to decrypt the encrypted classification received from the central server,” 57 “model server 106 may execute the encryption instructions 106c to encrypt the ML model 106b using a second fully homomorphic encryption technique prior to transmitting the ML model 106b to the central server 104 and to decrypt any updates to the ML model 106b or other data received from the central server 104. 
In certain embodiments, the first fully homomorphic encryption technique may be different from the second fully homomorphic encryption technique,” 98 “method 200 further includes receiving, from each user of a plurality of users, a training dataset encrypted using an independently generated key pair, in accordance with a multi-key, multi-hop fully homomorphic encryption technique. The example method 200 may further include training, with the training datasets, a ML model using a ML training technique and receiving encrypted test datasets from one or more users of the plurality of users. The example method 200 may further include generating, by executing the ML model, encrypted outputs for each encrypted test dataset, and causing user computing devices of each of the one or more users to participate in on-the-fly, multiparty computation to decrypt one or more respective encrypted outputs of the encrypted outputs by transmitting the encrypted outputs to each respective user of the one or more users,”]). Examiner Note: Under a broadest reasonable interpretation, Examiner interprets the disclosures of Najarian cited above, which coordinate communications of homomorphic encryption information across the “example environment 100 includes a central server 104, a model server 106, and a user device,” to disclose the implementation of encryption schemes or methods tied to specific devices, including a coordinating server system and associated independent user devices at a wide variety of locations. Thus the disclosed system of Najarian provides specific delivery of information related to specific locations and conditions of the system users and coordinators of services.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Crabtree to perform homomorphic encryption on the training data using the first encryption scheme and the second encryption scheme, as disclosed by Najarian, in order to implement the collection of homomorphic encrypted data for the purpose of interpreting the contents of medical images and developing treatment protocols for patients.
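As an illustrative aside (not drawn from any cited reference): the principle underlying the rejection above, that a party can compute on encrypted training data without ever decrypting it, can be sketched with a toy Paillier-style additively homomorphic scheme. The parameters below are deliberately tiny for readability; real deployments use far larger keys, and Najarian's system uses fully homomorphic techniques rather than this additive-only sketch.

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic): multiplying two
# ciphertexts modulo n^2 yields a ciphertext of the SUM of the plaintexts,
# so a server can aggregate encrypted values without ever decrypting them.
# Toy primes for illustration only; real deployments use ~2048-bit primes.
p, q = 10007, 10009
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                      # modular inverse of lambda mod n

def encrypt(m: int) -> int:
    r = random.randrange(1, n)            # random blinding factor
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n * mu) % n

c1, c2 = encrypt(41), encrypt(1)
print(decrypt((c1 * c2) % n2))            # 42: sum computed on ciphertexts
```

The party holding only `c1` and `c2` can produce the encrypted sum, but only the key holder can read the result, which is the property the claimed two-scheme arrangement relies on.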
Claim 8: Crabtree in view of Vizitiu and Najarian discloses the computer-implemented method according to claim 4, and Crabtree does not explicitly disclose; however, Najarian discloses wherein the computer-implemented method of training is performed by the vendor ([37 “example environment 100 may be a distributed computing environment where the encrypted data and encrypted ML model are located/stored in any suitable location(s) and may be transmitted across a network 116 to the various components, as necessary. In particular, the example environment 100 includes a central server 104, a model server 106, and a user device 108. Broadly, the user device 108 may store and transmit encrypted user data 108b1 to the central server 104 across the network 116, and the central server 104 may also receive an encrypted,”]). Examiner Note: Under a broadest reasonable interpretation, Examiner interprets the disclosures of Najarian cited above, which coordinate communications of homomorphic encryption information across the “example environment 100 includes a central server 104, a model server 106, and a user device,” to disclose the implementation of encryption schemes or methods tied to specific devices, including a coordinating server system and associated independent user devices at a wide variety of locations. Thus the disclosed system of Najarian provides specific delivery of information related to specific locations and conditions of the system users and coordinators of services, which Examiner interprets to include vendors and associated system users.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Crabtree such that the computer-implemented method of training is performed by the vendor, as disclosed by Najarian, in order to implement the collection of homomorphic encrypted data for the purpose of interpreting the contents of medical images and developing treatment protocols for patients.
Claims 10 and 19: Crabtree in view of Vizitiu and Najarian discloses the computer-implemented method according to claims 1 and 2 above, and Crabtree further discloses:
an encryption module configured to perform the homomorphic encryption on the patient-specific input data for the intended medical imaging task ([117 “treatment plans (e.g., information about the patient's treatment plan, including any medications, therapies, or procedures that have been prescribed), progress notes (e.g., notes on the patient's progress, including any changes in their condition, response to treatment, or other relevant information), diagnostic tests (e.g., information about any diagnostic tests that have been performed, such as blood tests, imaging studies, or biopsies, and the results of those tests),” 149 “system can engage in express parametric studies (e.g., initial seed manipulation) to look at potential impacts of various factors/constraints such as imaging, sampling, extraction, errors and uncertainty for statistical, ML/AI, or modelling simulation processes (e.g., diagnostics, treatment advisory, treatment calibration, etc.),” 213 “kinds of tasks that may be used in combination with a neural network are potentially unlimited so long as the problem is deterministic, but common applications include classification problems, labeling problems, compression or algorithm parameter tuning problems, image or audio recognition, and natural language processing. Neural networks may be used as part of a machine learning engine, as the method by which training is done and a model is generated,” 214, 216 “a neural network component processes raw data, such as images, text, or sensor inputs, to extract relevant features and learn patterns. This component is responsible for tasks like image recognition, natural language processing, or sensor data analysis. The symbolic reasoning component manipulates abstract symbols and rules to perform tasks that require logical reasoning or explicit knowledge representation. 
This component is responsible for tasks like logical inference, planning, or knowledge representation,” 244 “identifying a match using homomorphic encryption with a PHDB, according to an aspect of the invention. According to the aspect, the process begins at step 2210 as the user attempts to find a match for themselves in a specific domain, such as dating, blood or organ donor, gaming partner, etc. The PHDB data is then subjected to homomorphic encryption by an encryption engine at step 2220. In an embodiment, the homomorphic encryption scheme is partially homomorphic encryption. In an embodiment, the homomorphic encryption scheme is fully homomorphic encryption,”]);
a protocol generation stage configured to apply the homomorphic encrypted patient-specific input data to the homomorphic encrypted large-language model ([112, 157, 220]) to obtain the homomorphic encrypted medical imaging protocol ([116 “data could include information from medical tests, imaging studies, and patient questionnaires), tech companies (e.g., wearable technology industry), government agencies, and consumer research firms,” 117, 124 “encryption platform 121 may implement homomorphic encryption when processing or otherwise analyzing personal health information. In this way, the system can provide processing of encrypted data without having to decrypt and potentially leak personal information,” 139 “Medical data 250 encompasses a range of health-related information….imaging studies (e.g., x-rays, ultrasound, magnetic resonance imaging, computed tomography scan, etc.), electrocardiogram, and various biometric measurements (e.g., heart rate variability, electrodermal activity, pulse oximetry, etc.),” 149, 150, 157 “Large Language Model (LLM) computing 370 refers to the utilization of advanced computational systems to train, deploy, and utilize large language models. These models are typically built using deep learning techniques and have the capability to understand, generate, and process human language at a sophisticated level,”]);
Crabtree does not explicitly disclose; however, Vizitiu discloses:
a decryption module configured to perform the homomorphic decryption (Page 5 “ensure that the message can only be identified through proper decryption and by possessing the secret key,” Page 8 “results in a model that provides encrypted predictions, which can only be decrypted by the owner of the secret key,” Page 17 “training loss for the regression task, as resulted after decryption, is depicted in Figure 10(a). Similarly, the evolution of the training and validation accuracy of the privacy-preserving CNN model fed with encrypted X-ray coronary angiographies, obtained after decryption,”) on the homomorphic encrypted medical imaging protocol to obtain the plain-text medical imaging protocol (Page 2 “detailed description of privacy-preserving techniques for machine learning, homomorphic encryption,” Page 4 “additive homomorphic scheme where addition in the ciphertext space corresponds to multiplication in the plaintext space, and the ElGamal scheme [44], a multiplicative homomorphic scheme, which, through some modifications,” Page 4 “a plaintext scalar is encrypted as a n × n ciphertext matrix, and matrix algebra is employed to enable computations on ciphertext data. All operations performed on ciphertext data are therefore defined as matrix operations, e.g., the multiplication of plaintext scalars is formulated as the matrix multiplication of ciphertext matrices,” Page 4 “employing partially homomorphic encryption (PHE) instead of FHE. 
Since FHE is currently practically impossible to be used in a realworld system, a viable approach is a system based on PHE that is specialized only for certain operations,” Page 7 “deep convolutional neural network (CNN) architecture was proposed to enable feature learning directly from the input images….amount of available data, and the formulated problems, herein we focus on a FCNN for solving a regression task and on a CNN for image-based analysis,” Page 8 “access only to the encrypted version of the data (ciphertext), while the actual data (plaintext) are detached from the processing unit and remain private on the side of the data provider. Finally, with the homomorphic property underlying the MORE encryption scheme, the direct support for floating-point arithmetic,” Page 12 “main imaging modality for the diagnosis of coronary artery disease (CAD) is invasive X-ray coronary angiography (ICA) [74]. It allows for a comprehensive assessment of both the function and the structure of the heart. During the invasive procedure, a dye with radio opaque characteristics is inserted in the coronary vessels and a set of images is recorded in succession by an X-ray scanner,” Page 14 “class labels or real-valued quantities, were also encrypted, except for the binary classification problem where the target was given as plaintext. We chose to encrypt only the input data, i.e., the coronary angiography images, and leave the target, i.e., binary label 0 or 1, as plaintext to show that training can as well be performed if labels are formulated as plaintext,”); Examiner Note: Examiner under a broadest reasonable interpretation considers the disclosures of Vizitiu with respect to the processing of data by means of homomorphic decryption and the processing of plain text data as derived from the images and the implementation of the disclosed processing for medical images and associated processing of collected data to disclose the above limitation.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Crabtree to implement a decryption module configured to perform the homomorphic decryption on the homomorphic encrypted medical imaging protocol to obtain the plain-text medical imaging protocol, as disclosed by Vizitiu, in order to implement the collection of homomorphic encrypted data for the purpose of interpreting the contents of medical images and developing treatment protocols for patients.
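As an illustrative aside (not drawn from any cited reference): the matrix-based MORE encryption described by Vizitiu, in which a plaintext scalar is encrypted as a ciphertext matrix and all operations become matrix algebra, can be sketched as follows. The 2x2 ciphertext size and key construction here are illustrative assumptions, not Vizitiu's actual configuration (the paper describes general n x n ciphertext matrices).

```python
import numpy as np

rng = np.random.default_rng(0)

# Secret key: a random invertible matrix S. Encryption conjugates a diagonal
# plaintext matrix by S, so sums and products of ciphertext matrices mirror
# sums and products of the hidden plaintext scalars.
S = rng.normal(size=(2, 2)) + 2 * np.eye(2)
S_inv = np.linalg.inv(S)

def encrypt(m: float) -> np.ndarray:
    """Encrypt scalar m as C = S @ diag(m, r) @ S^-1 with a random pad r."""
    return S @ np.diag([m, rng.normal()]) @ S_inv

def decrypt(C: np.ndarray) -> float:
    """Recover the plaintext from the (0, 0) entry of S^-1 @ C @ S."""
    return float((S_inv @ C @ S)[0, 0])

a, b = encrypt(3.5), encrypt(2.0)
print(decrypt(a + b))   # ~5.5: ciphertext addition mirrors plaintext addition
print(decrypt(a @ b))   # ~7.0: ciphertext matmul mirrors plaintext product
```

Because both addition and multiplication are preserved over floating-point values, a processing unit holding only ciphertext matrices can evaluate arithmetic expressions whose results only the key holder can decrypt, which is the property the decryption-module limitation relies on.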
Claim 11: Crabtree in view of Vizitiu and Najarian discloses the computer-implemented method according to claim 10 above. Crabtree does not explicitly disclose; however, Najarian discloses wherein the encryption module and the decryption module are located at the specific site ([30 “present disclosure introduces a system including a data owner providing encrypted data as inputs to an encrypted ML model that outputs an encrypted classification, wherein the encryption of all data, models, and classifications is performed and/or otherwise in accordance with one or more fully homomorphic encryption (FHE) techniques,” 44 “central server 104 may include a ML module 104b1 and a training module 104b2 that may generally be configured to collectively train, re-train, and/or update the ML model 106b. The ML module 104b1 may receive input encrypted user data 108b1 (e.g., data from a hospital) and may execute the encrypted ML model 106b to generate encrypted classification(s). The training module 104b2 may utilize new input data (e.g., new encrypted data) to re-train/update the ML model 106b after training at the model server,” 47 “model server 106 may transmit an encrypted ML model 106b to the central server 104, and the user device 108 may transmit encrypted data 108b1 to the central server 104. The central server 104 may analyze the encrypted data 108b1 using the encrypted ML model 106b to generate an encrypted classification, which the central server 104 may transmit to the user device 108 for decryption,”]). Examiner Note: Under a broadest reasonable interpretation, Examiner interprets the disclosures of Najarian with respect to implementing the encryption and decryption modules at a specific location, as denoted by a central server, to disclose the claimed limitation, which specifically denotes a particular location for the processing of data.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Crabtree such that the encryption module and the decryption module are located at the specific site, as disclosed by Najarian, in order to implement the collection of homomorphic encrypted data for the purpose of interpreting the contents of medical images and developing treatment protocols for patients.
Claim 12: Crabtree in view of Vizitiu and Najarian discloses the computer-implemented method according to claim 10 above. Crabtree does not explicitly disclose; however, Najarian discloses wherein the protocol generation stage is provided by the vendor ([37 “example environment 100 may be a distributed computing environment where the encrypted data and encrypted ML model are located/stored in any suitable location(s) and may be transmitted across a network 116 to the various components, as necessary. In particular, the example environment 100 includes a central server 104, a model server 106, and a user device 108. Broadly, the user device 108 may store and transmit encrypted user data 108b1 to the central server 104 across the network 116, and the central server 104 may also receive an encrypted,” 38 “computations performed by FHE trained ML models herein may be performed, in some architectures, across decentralized devices or servers. In some such examples, each distributed device or server may include an FHE trained ML model and a central server may coordinate operations across these distributed devices,” 40 “central server 104 may include a processor 104a, a memory 104b, and a networking interface 104c. The memory 104b may include a ML module 104b1 and a training module 104b2. The model server 106 may include a training module 106a, a ML model 106b, and encryption instructions 106c. The user device 108 may include a processor 108a, a memory 108b, a networking interface 108c, and an input/output (I/O) interface 108d,” 44 “central server 104 may include a ML module 104b1 and a training module 104b2 that may generally be configured to collectively train, re-train, and/or update the ML model 106b. The ML module 104b1 may receive input encrypted user data 108b1 (e.g., data from a hospital) and may execute the encrypted ML model 106b to generate encrypted classification(s). 
The training module 104b2 may utilize new input data (e.g., new encrypted data) to re-train/update the ML model 106b after training at the model server 106 to generate outputs (e.g., classifications) based on the prior training dataset and the new input data,” 46 “example system 120 of FIG. 1B includes the central server 104 receiving data transmissions from the model server 106 and the user device 108 and generating and transmitting outputs to the model server 106 and the user device 108 based on those data transmissions,” 47 “model server 106 may transmit an encrypted ML model 106b to the central server 104, and the user device 108 may transmit encrypted data 108b1 to the central server 104. The central server 104 may analyze the encrypted data 108b1 using the encrypted ML model 106b to generate an encrypted classification, which the central server 104 may transmit to the user device 108 for decryption, in accordance with the encryption instructions,” 48, 55 “user device 108 and the model server 106 may execute the encryption instructions 108b2, 106c to encrypt/decrypt the data transmitted to the central server 104 and/or received from the central server 104. For example, the user device 108 may execute the encryption instructions 108b2 to encrypt the user data 108b1 using a first fully homomorphic encryption technique prior to transmitting the user data 108b1 to the central server 104 and to decrypt the encrypted classification received from the central server,” 57 “model server 106 may execute the encryption instructions 106c to encrypt the ML model 106b using a second fully homomorphic encryption technique prior to transmitting the ML model 106b to the central server 104 and to decrypt any updates to the ML model 106b or other data received from the central server 104. 
In certain embodiments, the first fully homomorphic encryption technique may be different from the second fully homomorphic encryption technique,” 98 “method 200 further includes receiving, from each user of a plurality of users, a training dataset encrypted using an independently generated key pair, in accordance with a multi-key, multi-hop fully homomorphic encryption technique. The example method 200 may further include training, with the training datasets, a ML model using a ML training technique and receiving encrypted test datasets from one or more users of the plurality of users. The example method 200 may further include generating, by executing the ML model, encrypted outputs for each encrypted test dataset, and causing user computing devices of each of the one or more users to participate in on-the-fly, multiparty computation to decrypt one or more respective encrypted outputs of the encrypted outputs by transmitting the encrypted outputs to each respective user of the one or more users,”]). Examiner Note: Under a broadest reasonable interpretation, Examiner interprets the disclosures of Najarian cited above, which coordinate communications of homomorphic encryption information across the “example environment 100 includes a central server 104, a model server 106, and a user device,” to disclose the implementation of encryption schemes or methods tied to specific devices, including a coordinating server system and associated independent user devices at a wide variety of locations. Thus the disclosed system of Najarian provides specific delivery of information related to specific locations and conditions of the system users and coordinators of services.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Crabtree such that the protocol generation stage is provided by the vendor, as disclosed by Najarian, in order to implement the collection of homomorphic encrypted data for the purpose of interpreting the contents of medical images and developing treatment protocols for patients.
Claim 13: Crabtree in view of Vizitiu and Najarian discloses the computer-implemented system according to claim 10 above, and Crabtree further discloses:
a model development stage provided by the vendor ([413 “model and training database 4306 is present and configured to store training/test datasets and developed models. Database 4306 may also store previous versions of models,”]), the model development stage including preparatory stages configured to prepare homomorphic encrypted training data from collected imaging protocol data ([178 “Encryption platform possesses the capability to perform both regular and homomorphic encryption,” 179-185, 206 “machine learning engine 1410 may be configured to create, store, and maintain one or more predictive or analytical models developed using machine and/or deep learning techniques,” 218 “multiple LLMs may be developed, wherein each of the multiple LLMs may be configured to specific domain or preference. For example, a first LLM may be trained to be a resource for use cases involving genomic assistance, and a second LLM is trained to be a resource for use cases involving microbiome-based assistance. In other implementations, an LLM may be personalized to an individual user,” 405 “Machine learning prediction model 4230 is capable of handling a wide range of predictive tasks. For instance, it might forecast future values of key health metrics (like blood glucose levels for diabetic patients), predict the risk of developing certain health conditions based on genetic and lifestyle factors, or estimate the likelihood of hospital readmission for patients with chronic conditions,” 417]).
Claim 14: Crabtree in view of Vizitiu and Najarian discloses the computer-implemented system according to claim 13 above, and Crabtree further discloses:
wherein the model development stage includes further stages configured to train a homomorphic encrypted version of the large-language model with the homomorphic encrypted training data ([157 “Large Language Model (LLM) computing 370 refers to the utilization of advanced computational systems to train, deploy, and utilize large language models,” 179 “Homomorphic encryption engine (HEE) 920 functions as a cryptographic system providing homomorphic encryption of personal health information or other data requested from PHDB cloud services 101. The use of a homomorphic encryption engine enables computations that can be executed on encrypted data without requiring decryption. In contrast to regular encryption, which necessitates decryption before operations, homomorphic encryption allows computations directly on encrypted data, safeguarding its confidentiality. The homomorphic encryption engine 920 employs robust encryption algorithms to encrypt data,” 180-183, 207 “machine learning engine 1410 may train one or more machines and/or deep learning algorithms to create a model. The data used for model training purposes may comprise subsets of data stored in a plurality of PHDBs. The training data may further comprise PHDB-enabled device-specific data or metadata such as mobile device location information, a device IP address, and/or the like. Training data may further be sourced from MTSDB 1420 or third-party services 130 such as, for example, social media servers, medical services providers, data centers, and medical/behavioral/therapeutic/etc. labs,” 215 “showing an exemplary arrangement of a PHDB system utilizing Large Language Model (LLM) machine learning technology, according to an aspect of the invention. Users equipped with PHDB mobile devices 110a-n, comprising applications 113a-n, an operating system (OS) 112a, and the PHDB application 111a, possess the capability to share data directly with the PHDB-related computing platform,”]).
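As an illustrative aside (not drawn from any cited reference): training a model directly on homomorphically encrypted data, as the limitation above contemplates and as Vizitiu demonstrates for neural networks under the MORE scheme, can be sketched on a deliberately tiny scale. The linear model, dataset, learning rate, and 2x2 ciphertext size below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Secret key for a toy MORE-style scheme: each scalar becomes a 2x2 matrix,
# and + / @ on ciphertext matrices mirror + / * on the hidden plaintexts.
S = rng.normal(size=(2, 2)) + 2 * np.eye(2)
S_inv = np.linalg.inv(S)

def encrypt(m):
    return S @ np.diag([m, rng.normal()]) @ S_inv

def decrypt(C):
    return float((S_inv @ C @ S)[0, 0])

# Encrypted training set for y = 2 * x; the trainer sees only ciphertexts.
xs = [encrypt(x) for x in (1.0, 2.0, 3.0)]
ys = [encrypt(y) for y in (2.0, 4.0, 6.0)]

w = encrypt(0.0)                      # encrypted model parameter
lr = 0.05
for _ in range(60):
    # Gradient of mean squared error, computed entirely on ciphertexts.
    grad = sum((w @ x - y) @ x for x, y in zip(xs, ys)) * (2.0 / len(xs))
    w = w - lr * grad

print(round(decrypt(w), 3))           # ~2.0: learned slope, read by key owner
```

Every gradient step uses only ciphertext additions and multiplications, so the trainer never observes the plaintext data or the plaintext parameter; only the key holder can decrypt the trained weight.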
Claim(s) 6, 7, and 17 is/are rejected under 35 U.S.C. 103 as being unpatentable over the combination of Crabtree et al. (20250349399) in view of Vizitiu et al., Applying Deep Neural Networks over Homomorphic Encrypted Medical Data (hereinafter Vizitiu), in view of Najarian et al. (20250266983), and in further view of Hartkens et al. (20220122256).
Claim 6: Crabtree in view of Vizitiu and Najarian discloses the computer-implemented method according to claim 4 above. Crabtree does not explicitly disclose; however, Hartkens discloses wherein the imaging protocols include at least several hundred imaging protocols ([3 “an imaging device typically holds several hundreds or even more than a thousand different acquisition parameter protocols, which may be set depending on the specific needs of an examination. Radiological centers continuously modify those variables aiming to enhance the resulting image quality,”]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Crabtree such that the imaging protocols include at least several hundred imaging protocols, as disclosed by Hartkens, in order to implement the collection of homomorphic encrypted data for the purpose of interpreting the contents of medical images and developing treatment protocols for patients.
Claims 7 and 17: Crabtree in view of Vizitiu and Najarian discloses the computer-implemented method according to claim 4 above. Crabtree does not explicitly disclose; however, Hartkens discloses wherein protocol data for the imaging protocols includes a number of imaging protocol cookbooks for the medical imaging device ([11 “Each of the plurality of reference images may represent a sample image usable as template for future image acquisitions. Each of the plurality of reference images may previously be selected as being representative of a desired level of image quality for a particular type of medical image. In other words, the plurality of reference images may correspond to images which are selected to be of “good” quality for particular types of medical images,” 31 “Acquisition parameters may be managed based on reference images instead of conventional generic acquisition protocols stored locally on the medical imaging devices. Images of “good” quality may be exported from a PACS and imported as reference images into a qLoop repository (corresponding to the “central repository” described above). Example images selected from the PACS may become templates for future image acquisitions and the acquisition process may thus generally be denoted as “scan by example”. Configuring the medical imaging device by reference images may allow radiologists to define the desired acquisition parameters in an easy and intuitive manner, i.e., by just pointing to a representative sample image which has been rated to be “good”. Instead of configuring a generic acquisition protocol stored locally at the medical imaging device, the radiologist may thus not need to be physically at the imaging device but may rather select the reference image for configuring the medical imaging device from a remote location,”]). 
Examiner Note: Under a broadest reasonable interpretation, Examiner interprets the term “cookbook” consistent with paragraph [28] of the written description, which recites “particularly preferred embodiment of the present invention, therefore, the protocol data comprises any relevant medical imaging protocol cookbooks for a medical imaging device. A protocol cookbook in PDF format can be scanned and digitized, and converted to the same format used for the protocols,” and paragraph [29], which recites “LLM could also be trained with synthetically generated medical imaging protocols and/or protocol cookbooks in order to improve its accuracy. By training the LLM on many thousands of verified or approved protocols used for past imaging tasks, a synthesized protocol can be expected to be very precise.” These passages detail the creation of data collection protocols, which Examiner interprets to include templates understood by a person of ordinary skill in the art to relate to data included in images. Examiner interprets the local storage of acquisition protocols to be the equivalent of providing data collection protocols.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Crabtree such that protocol data for the imaging protocols includes a number of imaging protocol cookbooks for the medical imaging device, as disclosed by Hartkens, in order to implement the collection of homomorphic encrypted data for the purpose of interpreting the contents of medical images and developing treatment protocols for patients.
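As an illustrative aside (not drawn from any cited reference): the written description contemplates digitizing a protocol cookbook and converting it to the same structured format used for the protocols themselves. A conversion of that kind might look like the following sketch; the field names, section syntax, and text layout are purely hypothetical.

```python
import re

# Hypothetical digitized cookbook excerpt: protocol sections introduced by a
# bracketed heading, one "parameter: value" pair per line.
COOKBOOK_TEXT = """\
[Brain MRI T1 axial]
tr_ms: 500
te_ms: 15
slice_thickness_mm: 4.0
[Knee MRI PD fat-sat]
tr_ms: 3000
te_ms: 30
slice_thickness_mm: 3.0
"""

def parse_cookbook(text: str) -> dict[str, dict[str, float]]:
    """Convert digitized cookbook text into per-protocol parameter dicts."""
    protocols: dict[str, dict[str, float]] = {}
    current = None
    for line in text.splitlines():
        heading = re.fullmatch(r"\[(.+)\]", line.strip())
        if heading:
            current = heading.group(1)
            protocols[current] = {}
        elif current and ":" in line:
            key, value = (part.strip() for part in line.split(":", 1))
            protocols[current][key] = float(value)
    return protocols

protocols = parse_cookbook(COOKBOOK_TEXT)
print(protocols["Brain MRI T1 axial"]["tr_ms"])   # 500.0
```

Once in this structured form, the cookbook entries are in the same shape as any other protocol data and could feed the training-data preparation stages discussed above.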
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant’s disclosure. Please see the attached References Cited form (PTO-892).
See Refael Kalim et al. (20250323777) for disclosures related to the implementation of a hybrid neural network executing homomorphic encryption processes which divides the neural network between servers and client devices. See at least paras. [35]-[60].
See Galvin (20250038765) for disclosures related to a codebook-implemented homomorphic data compression system which quantizes input data, generates an optimized codebook, and performs a number of data processing techniques. See at least paras. [75]-[115].
See Bronchonski (20220385449) for disclosures related to the validation of machine learning models as related to the implementation of homomorphic encryption and determining a range of encryption parameters. See at least paras. [12]-[54].
See Parulan et al. (20220150048) for disclosures related to the implementation and maintenance of a homomorphic encryption configuration data structure and in response to receiving a data query performing an identification process to specifically identify data. See at least paras. [30]-[65].
See Hartkens et al. (20220122256) for disclosures related to the performance of a feedback based quality assessment for medical related images based upon the collection of image related data. See at least paras. [28]-[38].
See Burceanu et al. (20220012359) for disclosures related to the implementation of homomorphic encryption on video and audio related data and the distribution of the data as related to encryption and re-encryption procedures. See at least paras. [32]-[58].
See Wade (20210211269) for disclosures related to the encryption of images by the implementation of a homomorphic encryption function which produces cipher images for the processed images. See at least paras. [13]-[27].
See Hoang (20200358611) for disclosures related to the implementation of homomorphic encryption techniques for the processing of biometric and other related data to perform a wide range of processing of physical parameters. See at least paras. [21]-[40].
See Soon-Shiong et al. (20160105042) for disclosures related to the implementation of homomorphic encryption in a healthcare network environment which coordinates data processing across various data sources. See at least paras. [14]-[49].
See Hahn et al. (WO 2025/042692 A1) for disclosures related to obtaining encrypted data, wherein the encrypted data are generated by applying FHE to plaintext data, and generating an inference with a DNN-based model based on the encrypted data. See at least pages 3-5.
See Megala et al., Secure medical image encryption using homomorphic techniques, 2024 Second International Conference on Advances in Information Technology (ICAIT-2024), for disclosures related to employing homomorphic encryption, a cutting-edge cryptographic technology, to protect medical image integrity and confidentiality during transmission.
See Vizitiu et al., Applying Deep Neural Networks over Homomorphic Encrypted Medical Data, Computational and Mathematical Methods in Medicine, Volume 2020, Article ID 3910250, for disclosures related to an encryption scheme, MORE (Matrix Operation for Randomization or Encryption), which enables the computations within a neural network model to be directly performed on floating-point data with a relatively small computational overhead.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to David Stoltenberg whose telephone number is (571) 270-3472.
The examiner can normally be reached on Monday-Friday 8:30AM to 5:00PM EST. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kambiz Abdi, can be reached on (571) 272-6702. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300, or the examiner’s direct fax phone number is (571) 270-4472.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center at (866) 217-9197 (toll free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call (800) 786-9199 (IN USA OR CANADA) or (571) 272-1000.
/DAVID J STOLTENBERG/Primary Examiner, Art Unit 3685