DETAILED ACTION
This Office Action is in response to application 18/533,018 filed on September 25, 2023.
Claims 1-20 are pending and have been considered.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1, 11, and 20 are rejected under 35 U.S.C. 102(a)(1)/(a)(2) as being anticipated by Sapatwar et al. (Sapatwar), U.S. Pub. Number 2021/0344478.
Regarding claim 1, Sapatwar discloses a computer-implemented method comprising:
receiving ciphertexts encrypted from training samples provided by one or more data owners; and
training a regression tree using the ciphertexts, wherein the training is configured to encrypt the regression tree through homomorphic operations on the ciphertexts, wherein the training is performed independently by a server without interaction with the one or more data owners, wherein the training samples comprise sample values corresponding to multiple attributes and respective target values, (par. [0066] the hosting cloud server 402 allows users to run inference queries on the model 400. Typically, a user is associated with a client machine 406, and the client and server are configured to operate according to the client-server model previously described. A homomorphic encryption (HE) protocol is enforced across the client-server operating environment such that the cloud protects the model's privacy while users (clients) maintain the privacy of the scoring data points returned by the model. In a typical request-response workflow, the client 406 sends an encrypted query 408 (e.g., a data point) to the cloud server 402, the cloud server 402 applies the model and then returns a response 410. The response includes the encrypted inference results. In this manner, privacy-preserving inference problems are securely evaluated).
wherein the regression tree comprises nodes organized in multiple layers and edges connecting between some of the nodes, (par. [0075] The technique of this disclosure provides significant advantages. As has been described, the approach herein provides for a way to provide privacy-preserving inference on a pre-trained decision tree (in particular, an ensemble decision-tree based regression model), but in a computationally-efficient manner. The approach leverages the notion of training a shallow neural network that learns the decision boundary of the tree, with homomorphic evaluation of the tree then approximated by instead performing homomorphic inference on the neural network. Because HE-inferencing on the neural network is efficient)
wherein the nodes comprise one or more inner nodes and a plurality of leaf nodes, wherein an inner node has two child nodes, wherein a leaf node has no child node (par. [0070] FIG. 5 depicts the basic technique of this disclosure for building and training the shallow neural network that is used as the proxy or surrogate for the decision tree (or ensemble of trees) of interest. As depicted, trained decision tree 500 represents the model of interest. It is trained on a data set, which is sometimes referred to herein as the original data set).
wherein the inner nodes represent test conditions for corresponding attributes, the edges represent outcomes of evaluating the sample values based on the test conditions, and the leaf nodes represent predictions of the target values (par. [0073] the network trainer 608 trains the neural network N (or each such NN in an ensemble model) using the randomly-generated data set and the corresponding predicted outputs AY (which is also a decision tree). As noted above, the network trainer 608 trains the shallow neural network to learn the decision boundary of the decision tree (or each decision tree in the ensemble of decision trees). The network encryptor 610 performs homomorphic encryption for each N. During the process, the network encryptor 610 encrypts the shallow network using the public key of the client. Finally, the private evaluator 612 performs the homomorphic inference on the network N on one or more user-provided HE-data point(s) and returns the encrypted results (an encrypted prediction) to the user).
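The claimed tree structure recited above (inner nodes holding test conditions, edges carrying the two outcomes, leaf nodes holding target-value predictions) can be sketched in plaintext as follows. The attribute indices, threshold, and prediction values are illustrative assumptions, not taken from the claims or from Sapatwar.

```python
# Minimal plaintext sketch of the claimed regression-tree structure.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    # Inner node: attribute index and threshold, with exactly two children.
    # Leaf node: attribute is None; prediction holds the target value.
    attribute: Optional[int] = None
    threshold: float = 0.0
    left: Optional["Node"] = None    # edge taken when the test is satisfied
    right: Optional["Node"] = None   # edge taken otherwise
    prediction: float = 0.0

def predict(node: Node, sample: list) -> float:
    # Walk from the root: evaluate each inner node's test condition on the
    # sample values until a leaf (a node with no children) is reached.
    while node.attribute is not None:
        node = node.left if sample[node.attribute] <= node.threshold else node.right
    return node.prediction

# Two-layer example: one inner node and two leaves.
tree = Node(attribute=0, threshold=5.0,
            left=Node(prediction=1.5), right=Node(prediction=9.0))
print(predict(tree, [3.0]))   # 1.5
print(predict(tree, [8.0]))   # 9.0
```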
Regarding claim 11, claim 11 is directed to a computer system having a scope similar to that of claim 1. Therefore, claim 11 is rejected for the same reasons.
Regarding claim 20, claim 20 is directed to a computer-readable medium having a scope similar to that of claim 1. Therefore, claim 20 is rejected for the same reasons.
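The privacy-preserving request-response workflow quoted from Sapatwar at par. [0066] (client encrypts a query, server computes on ciphertexts, only the client can decrypt the result) can be illustrated with a toy Paillier-style additively homomorphic scheme. The scheme, the tiny primes, and all values below are illustrative assumptions and are insecure; they are not the scheme used by Sapatwar.

```python
# Toy Paillier-style additively homomorphic encryption (illustration only).
from math import gcd

# Key generation with small demonstration primes (insecure by design).
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)

def encrypt(m, r):
    # c = g^m * r^n mod n^2; r must be coprime to n.
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Client encrypts two query values; the server adds them homomorphically by
# multiplying ciphertexts, never seeing the plaintexts.
c1 = encrypt(7, 17)
c2 = encrypt(35, 23)
c_sum = (c1 * c2) % n2
print(decrypt(c_sum))  # 42
```

The ciphertext product decrypting to the plaintext sum is the additive homomorphism that lets a server evaluate on encrypted data without the decryption key.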
Allowable Subject Matter
Claims 2-10 and 12-19 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Related Art
The following prior art, made of record and cited on PTO-892 but not relied upon, is considered pertinent to applicant’s disclosure:
U.S. Pub. Number 2020/0244437 A1 to Ruan teaches model robustness using homomorphic encryption in combination with the operational computing environment. FIG. 2 shows an overall solution for a model security deployment assessment, which is implemented among a data demand party 110, a security assessment party 120 and a data source party 130. Firstly, the data demand party 110 sends to the data source party 130 a deployment request for deploying a homomorphically-encrypted data model (such as a trained machine learning model) on the data source party 130, wherein the deployment request comprises one or more ciphertext model parameters that are generated based on encrypting, with homomorphic encryption, one or more plaintext parameters of the data model and a public key for encryption. Before initiating the deployment request, the data demand party 110 trains a data model using data from various sources as a model feature, and performs homomorphic encryption on the model parameter of the data model using a public key to obtain an encrypted data model (a ciphertext model). Note that in various embodiments, a linear regression model …
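Ruan's deployment pattern, in which the data demand party ships homomorphically encrypted model parameters so the data source party can evaluate the model without learning it, can be sketched with the same toy Paillier-style scheme. The primes, coefficients, and integer-only linear model below are illustrative assumptions, not taken from Ruan.

```python
# Toy sketch: evaluating an encrypted linear model on plaintext features.
from math import gcd

# Key generation with small demonstration primes (insecure by design).
p, q = 293, 433
n, n2, g = p * q, (p * q) ** 2, p * q + 1
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m, r):
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

# Data demand party: encrypt model coefficients w = [3, 5] (the "ciphertext
# model" of the deployment request).
enc_w = [encrypt(3, 11), encrypt(5, 13)]

# Data source party: evaluate the dot product w . x on plaintext features
# x = [2, 4]. Plaintext-scalar multiply is ciphertext exponentiation;
# homomorphic addition is ciphertext multiplication.
x = [2, 4]
enc_score = 1  # encryption of 0 with r = 1
for cw, xi in zip(enc_w, x):
    enc_score = (enc_score * pow(cw, xi, n2)) % n2

print(decrypt(enc_score))  # 3*2 + 5*4 = 26
```

Only the data demand party, holding the private key, can decrypt the score; the data source party computes it without ever seeing the coefficients.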
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to VU V TRAN whose telephone number is (571)270-1708. The examiner can normally be reached M-F, 8 AM-4 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Rupal Dharia can be reached on 571-272-3880. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/VU V TRAN/Primary Examiner, Art Unit 2491