Prosecution Insights
Last updated: April 19, 2026
Application No. 18/330,253

PRE-PROCESSING FOR DEEP NEURAL NETWORK COMPILATION USING GRAPH NEURAL NETWORKS

Non-Final OA: §101, §103, §112, §DP

Filed: Jun 06, 2023
Examiner: WERNER, MARSHALL L
Art Unit: 2125
Tech Center: 2100 — Computer Architecture & Software
Assignee: Qualcomm Incorporated
OA Round: 1 (Non-Final)

Grant Probability: 66% (Favorable)
Estimated OA Rounds: 1-2
Estimated Time to Grant: 3y 11m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 66% (above average): 133 granted / 200 resolved, +11.5% vs TC avg
Interview Lift: +44.3% among resolved cases with interview
Avg Prosecution: 3y 11m; 60 applications currently pending
Total Applications: 260, across all art units
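As a sanity check, the headline examiner figures above can be reproduced from raw case counts. A brief sketch: the granted/resolved totals are the ones reported, but the with/without-interview split below is hypothetical, chosen only so the illustration lands near the reported lift.

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

# Reported: 133 granted out of 200 resolved cases.
career = allow_rate(133, 200)
assert round(career) == 66  # matches the 66% shown above

# Interview lift = allow rate with an interview minus allow rate without,
# among resolved cases. Hypothetical split of the same 200 resolved cases:
with_iv = allow_rate(78, 85)      # interviewed cases (assumed counts)
without_iv = allow_rate(55, 115)  # non-interviewed cases (assumed counts)
lift = with_iv - without_iv       # roughly +44 percentage points
print(f"career={career:.1f}%, lift={lift:+.1f}pp")
```

With this assumed split, the lift works out to about +44 percentage points, in the ballpark of the +44.3% shown; the dashboard's actual interviewed/non-interviewed counts are not given.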

Statute-Specific Performance

§101: 29.0% (-11.0% vs TC avg)
§103: 37.4% (-2.6% vs TC avg)
§102: 8.5% (-31.5% vs TC avg)
§112: 21.0% (-19.0% vs TC avg)

Tech Center averages are estimates; based on career data from 200 resolved cases.
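Since each delta above is reported relative to the Tech Center average, the implied TC averages can be recovered by subtraction (assuming each delta is the examiner's rate minus the TC average). A quick sketch using the figures shown:

```python
# Per-statute examiner rates and deltas vs the Tech Center average, as
# reported above. The metric itself is whatever the dashboard measures
# per statute; only the arithmetic is illustrated here.
examiner = {"101": 29.0, "103": 37.4, "102": 8.5, "112": 21.0}
delta_vs_tc = {"101": -11.0, "103": -2.6, "102": -31.5, "112": -19.0}

# TC average = examiner rate - delta.
tc_average = {s: round(examiner[s] - delta_vs_tc[s], 1) for s in examiner}
print(tc_average)  # {'101': 40.0, '103': 40.0, '102': 40.0, '112': 40.0}
```

All four statutes imply the same 40.0% Tech Center baseline, consistent with the figures being plotted against a single estimated TC average.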

Office Action

Rejections: §101, §103, §112, Double Patenting
DETAILED ACTION

This action is in response to the Applicant Response filed 06 June 2023 for application 18/330,253, filed 06 June 2023. Claims 1-28 are pending. Claims 1-28 are rejected.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C.
112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 5, 12, 19, and 26 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 5 recites that the graph embedding corresponding to the ANN model is unique, which is a relative term that renders the claim indefinite. The term “unique” is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. Correction or clarification is required. Examiner’s Note: For the purposes of examination, the limitation will be interpreted as meaning that each ANN model has a graph embedding and that no two different ANN models share the same graph embedding.

Claims 12, 19, and 26 recite the same “unique” limitation and are rejected as indefinite for the same reasons; for the purposes of examination, the same interpretation will be applied to each.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees.
A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c).
A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 1-28 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 7, 13, 19 of co-pending Application No. 18/500,014 in view of Zhou et al. (US 2023/0176840 A1 – Learned Graph Optimizations for Compilers, hereinafter referred to as “Zhou”) and/or Yu et al. (Auto Graph Encoder-Decoder for Neural Network Pruning, hereinafter referred to as “Yu”) and/or Tu et al. (AutoNE: Hyperparameter Optimization for Massive Network Embedding, hereinafter referred to as “Tu”). Although the claims at issue are not identical, they are not patentably distinct from each other because, as noted in the table below, claims 1-28 of the instant application have similar limitations as recited in co-pending Application No. 18/500,014 (claims 1, 7, 13, 19) except for additional limitations included in co-pending Application No. 18/500,014. This is a provisional nonstatutory double patenting rejection.

Application No. 18/330,253 Co-pending Appl. No.
18/500,014

Claim 1 (instant) / Claim 7 (co-pending):
Instant: A processor-implemented method of pre-processing for deep neural network compilation, comprising:
Co-pending: A processor-implemented method of pre-processing for deep neural network compilation performed by at least one processor, the processor-implemented method comprising:
Instant: receiving a representation of an artificial neural network (ANN) model;
Co-pending: receiving a representation of an artificial neural network (ANN) model, the ANN including multiple nodes coupled by edges;
Instant: generating an operator embedding to represent operators of the ANN model in an embedding space;
Co-pending: generating an operator embedding to represent operators of the ANN model in an embedding space based on the position information for each node;
Instant: processing, by a graph neural network (GNN), the operator embedding, to generate a graph embedding corresponding to the ANN model according to a learned distance metric; and
Co-pending: processing, by a graph neural network (GNN), the operator embedding to generate a graph embedding corresponding to the ANN model according to a learned distance metric and based on the position information; and
Instant: determining, by the GNN, a set of hyperparameters for the ANN model based on the graph embedding.
Co-pending: determining, by the GNN, a set of hyperparameters for the ANN model based on the graph embedding.

Claim 2 (instant): in which the GNN determines the graph embedding based on a metric learning objective.
Claim 3 (instant): in which a distance between the graph embedding corresponding to the ANN model and a second graph embedding corresponding to a second ANN model is proportional to a relative size of the ANN model and the second ANN model.
Claim 4 (instant): in which the GNN is trained based on a reconstruction loss.
Claim 5 (instant): in which the graph embedding corresponding to the ANN model is unique.
Claim 6 (instant): compiling the ANN model using the set of hyperparameters.
Claim 7 (instant): determining, by the GNN, the set of hyperparameters for the ANN model using a similarity search over a set of graph embeddings corresponding to ANN models.

Claim 8 (instant) / Claim 1 (co-pending):
Instant: An apparatus, comprising:
Co-pending: An apparatus of pre-processing for deep neural network compilation, comprising:
Instant: a memory; and
Co-pending: at least one memory; and
Instant: at least one processor coupled to the memory, the at least one processor configured:
Co-pending: at least one processor coupled to the at least one memory, the at least one processor configured to:
Instant: to receive a representation of an artificial neural network (ANN) model;
Co-pending: receive a representation of an artificial neural network (ANN) model, the ANN including multiple nodes coupled by edges;
Instant: to generate an operator embedding to represent operators of the ANN model in an embedding space;
Co-pending: generate an operator embedding to represent operators of the ANN model in an embedding space based on the position information for each node;
Instant: to process, by a graph neural network (GNN), the operator embedding, to generate a graph embedding corresponding to the ANN model according to a learned distance metric; and
Co-pending: process, by a graph neural network (GNN), the operator embedding to generate a graph embedding corresponding to the ANN model according to a learned distance metric and based on the position information; and
Instant: to determine, by the GNN, a set of hyperparameters for the ANN model based on the graph embedding.
Co-pending: determine, by the GNN, a set of hyperparameters for the ANN model based on the graph embedding.

Claim 9 (instant): in which the GNN determines the graph embedding based on a metric learning objective.
Claim 10 (instant): in which a distance between the graph embedding corresponding to the ANN model and a second graph embedding corresponding to a second ANN model is proportional to a relative size of the ANN model and the second ANN model.
Claim 11 (instant): in which the GNN is trained based on a reconstruction loss.
Claim 12 (instant): in which the graph embedding corresponding to the ANN model is unique.
Claim 13 (instant): in which the at least one processor is further configured to compile the ANN model using the set of hyperparameters.
Claim 14 (instant): in which the at least one processor is further configured to determine, by the GNN, the set of hyperparameters for the ANN model using a similarity search over a set of graph embeddings corresponding to ANN models.

Claim 15 (instant) / Claim 13 (co-pending):
Instant: A non-transitory computer-readable medium having program code recorded thereon, the program code executed by a processor and comprising:
Co-pending: A non-transitory computer-readable medium having program code recorded thereon, the program code executed by a processor and comprising:
Instant: program code to receive a representation of an artificial neural network (ANN) model;
Co-pending: program code to receive a representation of an artificial neural network (ANN) model, the ANN including multiple nodes coupled by edges;
Instant: program code to generate an operator embedding to represent operators of the ANN model in an embedding space;
Co-pending: program code to generate an operator embedding to represent operators of the ANN model in an embedding space based on the position information for each node;
Instant: program code to process, by a graph neural network (GNN), the operator embedding, to generate a graph embedding corresponding to the ANN model according to a learned distance metric; and
Co-pending: program code to process, by a graph neural network (GNN), the operator embedding to generate a graph embedding corresponding to the ANN model according to a learned distance metric and based on the position information; and
Instant: program code to determine, by the GNN, a set of hyperparameters for the ANN model based on the graph embedding.
Co-pending: program code to determine, by the GNN, a set of hyperparameters for the ANN model based on the graph embedding.

Claim 16 (instant): in which the GNN determines the graph embedding based on a metric learning objective.
Claim 17 (instant): in which a distance between the graph embedding corresponding to the ANN model and a second graph embedding corresponding to a second ANN model is proportional to a relative size of the ANN model and the second ANN model.
Claim 18 (instant): in which the GNN is trained based on a reconstruction loss.
Claim 19 (instant): in which the graph embedding corresponding to the ANN model is unique.
Claim 20 (instant): in which the program code comprises program code to compile the ANN model using the set of hyperparameters.
Claim 21 (instant): in which the program code comprises program code to determine, by the GNN, the set of hyperparameters for the ANN model using a similarity search over a set of graph embeddings corresponding to ANN models.

Claim 22 (instant) / Claim 19 (co-pending):
Instant: An apparatus, comprising:
Co-pending: An apparatus of pre-processing for deep neural network compilation, comprising:
Instant: means for receiving a representation of an artificial neural network (ANN) model;
Co-pending: means for receiving a representation of an artificial neural network (ANN) model, the ANN including multiple nodes coupled by edges;
Instant: means for generating an operator embedding to represent operators of the ANN model in an embedding space;
Co-pending: means for generating an operator embedding to represent operators of the ANN model in an embedding space based on the position information for each node;
Instant: means for processing, by a graph neural network (GNN), the operator embedding, to generate a graph embedding corresponding to the ANN model according to a learned distance metric; and
Co-pending: means for processing, by a graph neural network (GNN), the operator embedding to generate a graph embedding corresponding to the ANN model according to a learned distance metric and based on the position information; and
Instant: means for determining, by the GNN, a set of hyperparameters for the ANN model based on the graph embedding.
Co-pending: means for determining, by the GNN, a set of hyperparameters for the ANN model based on the graph embedding.

Claim 23 (instant): in which the GNN determines the graph embedding based on a metric learning objective.
Claim 24 (instant): in which a distance between the graph embedding corresponding to the ANN model and a second graph embedding corresponding to a second ANN model is proportional to a relative size of the ANN model and the second ANN model.
Claim 25 (instant): in which the GNN is trained based on a reconstruction loss.
Claim 26 (instant): in which the graph embedding corresponding to the ANN model is unique.
Claim 27 (instant): means for compiling the ANN model using the set of hyperparameters.
Claim 28 (instant): means for determining, by the GNN, the set of hyperparameters for the ANN model using a similarity search over a set of graph embeddings corresponding to ANN models.

Regarding claim 2 (similarly 9, 16, 23), Application No. 18/500,014 teaches all of the limitations of claim 1 (7, 15, 22, respectively), as stated in the chart. However, Application No. 18/500,014 does not explicitly teach in which the GNN determines the graph embedding based on a metric learning objective.

Zhou teaches in which the GNN determines the graph embedding based on a metric learning objective (Zhou, [0053]-[0054] – teaches generating graph embeddings based on neighbor nodes; see also Zhou, [0051] – proximal policy optimization). It would have been obvious to one of ordinary skill in the art before the filing date of the claimed invention to modify Application No. 18/500,014 with the teachings of Zhou in order to learn better and more sophisticated models in less time in the field of neural network compilation using GNNs (Zhou, [0025] – “Particular embodiments of the subject matter described in this specification can be implemented so as to realize one or more of the following advantages. The techniques described in this specification enable the creation of a compiler optimization network that outperforms state-of-the-art methods in terms of the resulting runtime of the input programs.
In particular, the techniques described below outperform a network having only a graph embedding network in placement performance. In addition, the techniques described below are faster than processing large sequence lengths compared to networks having LSTM policy networks instead of attention layers. Lastly, the techniques described below handle large graphs much better than graph attention networks.”).

Regarding claim 3 (similarly 10, 17, 24), Application No. 18/500,014 teaches all of the limitations of claim 1 (7, 15, 22, respectively), as stated in the chart. However, Application No. 18/500,014 does not explicitly teach in which a distance between the graph embedding corresponding to the ANN model and a second graph embedding corresponding to a second ANN model is proportional to a relative size of the ANN model and the second ANN model.

Tu teaches in which a distance between the graph embedding corresponding to the ANN model and a second graph embedding corresponding to a second ANN model is proportional to a relative size of the ANN model and the second ANN model (Tu, sections 3.3-3.4 – teaches that similarity between two networks is based on the ratio of network sizes). It would have been obvious to one of ordinary skill in the art before the filing date of the claimed invention to modify Application No. 18/500,014 with the teachings of Tu in order to automatically optimize the hyperparameters of a NE algorithm on massive networks in the field of neural network compilation using GNNs (Tu, Abstract – “Network embedding (NE) aims to embed the nodes of a network into a vector space, and serves as the bridge between machine learning and network data. Despite their widespread success, NE algorithms typically contain a large number of hyperparameters for preserving the various network properties, which must be carefully tuned in order to achieve satisfactory performance.
Though automated machine learning (AutoML) has achieved promising results when applied to many types of data such as images and texts, network data poses great challenges to AutoML and remains largely ignored by the literature of AutoML. The biggest obstacle is the massive scale of real-world networks, along with the coupled node relationships that make any straightforward sampling strategy problematic. In this paper, we propose a novel framework, named AutoNE, to automatically optimize the hyperparameters of a NE algorithm on massive networks. In detail, we employ a multi-start random walk strategy to sample several small sub-networks, perform each trial of configuration selection on the sampled sub-network, and design a meta-learner to transfer the knowledge about optimal hyperparameters from the sub-networks to the original massive network. The transferred meta-knowledge greatly reduces the number of trials required when predicting the optimal hyperparameters for the original network. Extensive experiments demonstrate that our framework can significantly outperform the existing methods, in that it needs less time and fewer trials to find the optimal hyperparameters.”).

Regarding claim 4 (similarly 11, 18, 25), Application No. 18/500,014 teaches all of the limitations of claim 1 (7, 15, 22, respectively), as stated in the chart. However, Application No. 18/500,014 does not explicitly teach in which the GNN is trained based on a reconstruction loss.

Yu teaches in which the GNN is trained based on a reconstruction loss (Yu, section 3.3 – teaches training based on a reward function defined as the reconstruction loss). It would have been obvious to one of ordinary skill in the art before the filing date of the claimed invention to modify Application No.
18/500,014 with the teachings of Yu in order to improve performance and compression ratio with fewer search steps in the field of neural network compilation using GNNs (Yu, Abstract – “Model compression aims to deploy deep neural networks (DNN) on mobile devices with limited computing and storage resources. However, most of the existing model compression methods rely on manually defined rules, which require domain expertise. DNNs are essentially computational graphs, which contain rich structural information. In this paper, we aim to find a suitable compression policy from DNNs’ structural information. We propose an automatic graph encoder-decoder model compression (AGMC) method combined with graph neural networks (GNN) and reinforcement learning (RL). We model the target DNN as a graph and use GNN to learn the DNN’s embeddings automatically. We compared our method with rule-based DNN embedding model compression methods to show the effectiveness of our method. Results show that our learning-based DNN embedding achieves better performance and a higher compression ratio with fewer search steps. We evaluated our method on over-parameterized and mobile-friendly DNNs and compared our method with handcrafted and learning-based model compression approaches. On over-parameterized DNNs, such as ResNet-56, our method outperformed handcrafted and learning-based methods with 4.36% and 2.56% higher accuracy, respectively. Furthermore, on MobileNet-v2, we achieved a higher compression ratio than state-of-the-art methods with just 0.93% accuracy loss.”).

Regarding claim 5 (similarly 12, 19, 26), Application No. 18/500,014 teaches all of the limitations of claim 1 (7, 15, 22, respectively), as stated in the chart. However, Application No. 18/500,014 does not explicitly teach in which the graph embedding corresponding to the ANN model is unique.
Zhou teaches in which the graph embedding corresponding to the ANN model is unique (Zhou, [0061] – teaches unique model structure with unique graph embeddings; see also Zhou, [0013]-[0024] – demonstrating various examples). It would have been obvious to one of ordinary skill in the art before the filing date of the claimed invention to modify Application No. 18/500,014 with the teachings of Zhou in order to learn better and more sophisticated models in less time in the field of neural network compilation using GNNs (Zhou, [0025] – “Particular embodiments of the subject matter described in this specification can be implemented so as to realize one or more of the following advantages. The techniques described in this specification enable the creation of a compiler optimization network that outperforms state-of-the-art methods in terms of the resulting runtime of the input programs. In particular, the techniques described below outperform a network having only a graph embedding network in placement performance. In addition, the techniques described below are faster than processing large sequence lengths compared to networks having LSTM policy networks instead of attention layers. Lastly, the techniques described below handle large graphs much better than graph attention networks.”).

Regarding claim 6 (similarly 13, 20, 27), Application No. 18/500,014 teaches all of the limitations of claim 1 (7, 15, 22, respectively), as stated in the chart. However, Application No. 18/500,014 does not explicitly teach compiling the ANN model using the set of hyperparameters.

Zhou teaches compiling the ANN model using the set of hyperparameters (Zhou, [0071], [0073] – teaches compiling the network according to the optimization plan). It would have been obvious to one of ordinary skill in the art before the filing date of the claimed invention to modify Application No.
18/500,014 with the teachings of Zhou in order to learn better and more sophisticated models in less time in the field of neural network compilation using GNNs (Zhou, [0025] – “Particular embodiments of the subject matter described in this specification can be implemented so as to realize one or more of the following advantages. The techniques described in this specification enable the creation of a compiler optimization network that outperforms state-of-the-art methods in terms of the resulting runtime of the input programs. In particular, the techniques described below outperform a network having only a graph embedding network in placement performance. In addition, the techniques described below are faster than processing large sequence lengths compared to networks having LSTM policy networks instead of attention layers. Lastly, the techniques described below handle large graphs much better than graph attention networks.”).

Regarding claim 7 (similarly 14, 21, 28), Application No. 18/500,014 teaches all of the limitations of claim 1 (7, 15, 22, respectively), as stated in the chart. However, Application No. 18/500,014 does not explicitly teach determining, by the GNN, the set of hyperparameters for the ANN model using a similarity search over a set of graph embeddings corresponding to ANN models.

Tu teaches determining, by the GNN, the set of hyperparameters for the ANN model using a similarity search over a set of graph embeddings corresponding to ANN models (Tu, section 3.2 – teaches sampling representative sub-networks sharing similar properties). It would have been obvious to one of ordinary skill in the art before the filing date of the claimed invention to modify Application No.
18/500,014 with the teachings of Tu in order to automatically optimize the hyperparameters of a NE algorithm on massive networks in the field of neural network compilation using GNNs (Tu, Abstract – “Network embedding (NE) aims to embed the nodes of a network into a vector space, and serves as the bridge between machine learning and network data. Despite their widespread success, NE algorithms typically contain a large number of hyperparameters for preserving the various network properties, which must be carefully tuned in order to achieve satisfactory performance. Though automated machine learning (AutoML) has achieved promising results when applied to many types of data such as images and texts, network data poses great challenges to AutoML and remains largely ignored by the literature of AutoML. The biggest obstacle is the massive scale of real-world networks, along with the coupled node relationships that make any straightforward sampling strategy problematic. In this paper, we propose a novel framework, named AutoNE, to automatically optimize the hyperparameters of a NE algorithm on massive networks. In detail, we employ a multi-start random walk strategy to sample several small sub-networks, perform each trial of configuration selection on the sampled sub-network, and design a meta-learner to transfer the knowledge about optimal hyperparameters from the sub-networks to the original massive network. The transferred meta-knowledge greatly reduces the number of trials required when predicting the optimal hyperparameters for the original network. Extensive experiments demonstrate that our framework can significantly outperform the existing methods, in that it needs less time and fewer trials to find the optimal hyperparameters.”).

Claim Rejections - 35 USC § 101

35 U.S.C.
101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-28 are rejected under 35 U.S.C. 101 because the claims are directed to an abstract idea, and because the claim elements, whether considered individually or in combination, do not amount to significantly more than the abstract idea. See Alice Corp. Pty. Ltd. v. CLS Bank Int'l, 573 U.S. 208 (2014).

Regarding claim 1, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 Analysis: Claim 1 is directed to a method, which is a process, one of the statutory categories.

Step 2A Prong One Analysis: The claim recites a processor-implemented method of pre-processing for deep neural network compilation. The limitation of generating an operator embedding to represent operators of the ANN model in an embedding space, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment, and opinion, and is a process capable of being performed by a human mentally or using pen and paper. The limitation of processing ... the operator embedding, to generate a graph embedding corresponding to the ANN model according to a learned distance metric, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment, and opinion, and is a process capable of being performed by a human mentally or using pen and paper. The limitation of determining ... a set of hyperparameters for the ANN model based on the graph embedding, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment, and opinion, and is a process capable of being performed by a human mentally or using pen and paper. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind, then it falls within the "Mental Processes" grouping. Accordingly, the claim recites an abstract idea.

Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. The claim recites an additional element – "processor-implemented" – recited at a high level of generality (i.e., as a generic computer component performing the generic computer function of executing instructions) such that it amounts to no more than mere instructions to apply the exception using generic computer components (MPEP 2106.05(b)). The claim recites additional elements – the artificial neural network (ANN) model and the graph neural network (GNN) – recited at a high level of generality such that they amount to no more than indicating a field of use or technological environment in which to apply the judicial exception (MPEP 2106.05(h)). The claim recites receiving a representation of an artificial neural network (ANN) model, which is simply acquiring data recited at a high level of generality; this is nothing more than insignificant extra-solution activity (MPEP 2106.05(g)). Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea.
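[Editorial note: for technical context on the "learned distance metric" recited in claim 1 (and the metric learning objective of its dependent claims), a metric-learning objective trains the embedding network so that distances between graph embeddings track model similarity. A minimal, hypothetical sketch of one common such objective (a triplet margin loss); the vectors and values are illustrative only and are not taken from the application:]

```python
import math

def euclidean(a, b):
    # Distance between two embedding vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def triplet_loss(anchor, positive, negative, margin=1.0):
    # Metric-learning objective: drive the anchor embedding closer to a
    # similar model's embedding (positive) than to a dissimilar model's
    # embedding (negative), by at least `margin`.
    return max(0.0, euclidean(anchor, positive) - euclidean(anchor, negative) + margin)

# Toy graph embeddings for three ANN models.
anchor, similar, dissimilar = [0.0, 0.0], [0.1, 0.0], [2.0, 0.0]
print(triplet_loss(anchor, similar, dissimilar))  # 0.0: margin constraint already satisfied
```

Minimizing this loss over many such triplets is what makes distances in the embedding space meaningful for downstream lookup.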
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application: the additional element "processor-implemented" amounts to no more than mere instructions to apply the exception using generic computer components (MPEP 2106.05(b)); the acquiring of data amounts to no more than insignificant extra-solution activity (MPEP 2106.05(g)), wherein the insignificant extra-solution activity is the well-understood, routine, and conventional activity of receiving or transmitting data over a network and/or storing and retrieving information in memory (MPEP 2106.05(d)); and the artificial neural network (ANN) model and graph neural network (GNN) amount to no more than indicating a field of use or technological environment in which to apply the judicial exception (MPEP 2106.05(h)). The additional elements do not provide an inventive concept, and, therefore, the claim is not patent eligible.

Regarding claim 2, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 Analysis: Claim 2 is directed to a method, which is a process, one of the statutory categories.

Step 2A Prong One Analysis: The claim recites a processor-implemented method of pre-processing for deep neural network compilation. The Step 2A Prong One Analysis for claim 1 is applicable here, since claim 2 carries out the method of claim 1 but for the recitation of the additional element in which the GNN determines the graph embedding based on a metric learning objective.

Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application.
In particular, the claim recites additional information regarding the GNN, and this element does not apply the exception in a meaningful way (MPEP 2106.05(e)). Accordingly, the additional element does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea.

Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the additional information regarding the GNN does not apply the exception in a meaningful way (MPEP 2106.05(e)). Not applying the exception in a meaningful way does not provide an inventive concept, and, therefore, the claim is not patent eligible.

Regarding claim 3, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 Analysis: Claim 3 is directed to a method, which is a process, one of the statutory categories.

Step 2A Prong One Analysis: The claim recites a processor-implemented method of pre-processing for deep neural network compilation. The Step 2A Prong One Analysis for claim 1 is applicable here, since claim 3 carries out the method of claim 1 but for the recitation of the additional element in which a distance between the graph embedding corresponding to the ANN model and a second graph embedding corresponding to a second ANN model is proportional to a relative size of the ANN model and the second ANN model.

Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application.
In particular, the claim recites additional information regarding the models, and these elements do not apply the exception in a meaningful way (MPEP 2106.05(e)). Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea.

Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the additional information regarding the models does not apply the exception in a meaningful way (MPEP 2106.05(e)). Not applying the exception in a meaningful way does not provide an inventive concept, and, therefore, the claim is not patent eligible.

Regarding claim 4, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 Analysis: Claim 4 is directed to a method, which is a process, one of the statutory categories.

Step 2A Prong One Analysis: The claim recites a processor-implemented method of pre-processing for deep neural network compilation. The Step 2A Prong One Analysis for claim 1 is applicable here, since claim 4 carries out the method of claim 1 but for the recitation of the additional element in which the GNN is trained based on a reconstruction loss.

Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. In particular, the claim recites additional information regarding the GNN, and this element does not apply the exception in a meaningful way (MPEP 2106.05(e)).
Accordingly, the additional element does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea.

Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the additional information regarding the GNN does not apply the exception in a meaningful way (MPEP 2106.05(e)). Not applying the exception in a meaningful way does not provide an inventive concept, and, therefore, the claim is not patent eligible.

Regarding claim 5, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 Analysis: Claim 5 is directed to a method, which is a process, one of the statutory categories.

Step 2A Prong One Analysis: The claim recites a processor-implemented method of pre-processing for deep neural network compilation. The Step 2A Prong One Analysis for claim 1 is applicable here, since claim 5 carries out the method of claim 1 but for the recitation of the additional element in which the graph embedding corresponding to the ANN model is unique.

Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. In particular, the claim recites additional information regarding the graph embeddings, and this element does not apply the exception in a meaningful way (MPEP 2106.05(e)).
Accordingly, the additional element does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea.

Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the additional information regarding the graph embeddings does not apply the exception in a meaningful way (MPEP 2106.05(e)). Not applying the exception in a meaningful way does not provide an inventive concept, and, therefore, the claim is not patent eligible.

Regarding claim 6, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 Analysis: Claim 6 is directed to a method, which is a process, one of the statutory categories.

Step 2A Prong One Analysis: The claim recites a processor-implemented method of pre-processing for deep neural network compilation. The limitation of compiling the ANN model using the set of hyperparameters, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment, and opinion, and is a process capable of being performed by a human mentally or using pen and paper. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind, then it falls within the "Mental Processes" grouping. Accordingly, the claim recites an abstract idea.

Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application.
The claim does not recite any additional elements which integrate the abstract idea into a practical application and, therefore, does not impose any meaningful limits on practicing the abstract idea. Therefore, the claim is directed to an abstract idea.

Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the claim does not recite any additional elements which provide an inventive concept, and, therefore, the claim is not patent eligible.

Regarding claim 7, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 Analysis: Claim 7 is directed to a method, which is a process, one of the statutory categories.

Step 2A Prong One Analysis: The claim recites a processor-implemented method of pre-processing for deep neural network compilation. The limitation of determining ... the set of hyperparameters for the ANN model using a similarity search over a set of graph embeddings corresponding to ANN models, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment, and opinion, and is a process capable of being performed by a human mentally or using pen and paper. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind, then it falls within the "Mental Processes" grouping. Accordingly, the claim recites an abstract idea.

Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application.
The claim does not recite any additional elements which integrate the abstract idea into a practical application and, therefore, does not impose any meaningful limits on practicing the abstract idea. Therefore, the claim is directed to an abstract idea.

Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the claim does not recite any additional elements which provide an inventive concept, and, therefore, the claim is not patent eligible.

Regarding claim 8, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 Analysis: Claim 8 is directed to an apparatus, which is a machine, one of the statutory categories.

Step 2A Prong One Analysis: The claim recites an apparatus. The limitation of generate an operator embedding to represent operators of the ANN model in an embedding space, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment, and opinion, and is a process capable of being performed by a human mentally or using pen and paper. The limitation of process ... the operator embedding, to generate a graph embedding corresponding to the ANN model according to a learned distance metric, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment, and opinion, and is a process capable of being performed by a human mentally or using pen and paper. The limitation of determine ... a set of hyperparameters for the ANN model based on the graph embedding, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process.
The limitation is directed to observation, evaluation, judgment, and opinion, and is a process capable of being performed by a human mentally or using pen and paper. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind, then it falls within the "Mental Processes" grouping. Accordingly, the claim recites an abstract idea.

Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. The claim recites additional elements – the apparatus, a memory, and at least one processor – recited at a high level of generality (i.e., as generic computer components performing the generic computer function of executing instructions) such that they amount to no more than mere instructions to apply the exception using generic computer components (MPEP 2106.05(b)). The claim recites additional elements – the artificial neural network (ANN) model and the graph neural network (GNN) – recited at a high level of generality such that they amount to no more than indicating a field of use or technological environment in which to apply the judicial exception (MPEP 2106.05(h)). The claim recites receive a representation of an artificial neural network (ANN) model, which is simply acquiring data recited at a high level of generality; this is nothing more than insignificant extra-solution activity (MPEP 2106.05(g)). Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea.

Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
As discussed above with respect to the integration of the abstract idea into a practical application: the apparatus, memory, and at least one processor amount to no more than mere instructions to apply the exception using generic computer components (MPEP 2106.05(b)); the acquiring of data amounts to no more than insignificant extra-solution activity (MPEP 2106.05(g)), wherein the insignificant extra-solution activity is the well-understood, routine, and conventional activity of receiving or transmitting data over a network and/or storing and retrieving information in memory (MPEP 2106.05(d)); and the artificial neural network (ANN) model and graph neural network (GNN) amount to no more than indicating a field of use or technological environment in which to apply the judicial exception (MPEP 2106.05(h)). The additional elements do not provide an inventive concept, and, therefore, the claim is not patent eligible.

Regarding claim 9, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 Analysis: Claim 9 is directed to an apparatus, which is a machine, one of the statutory categories.

Step 2A Prong One Analysis: The claim recites an apparatus. The Step 2A Prong One Analysis for claim 8 is applicable here, since claim 9 recites the apparatus of claim 8 but for the recitation of the additional element in which the GNN determines the graph embedding based on a metric learning objective.

Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. In particular, the claim recites additional information regarding the GNN, and this element does not apply the exception in a meaningful way (MPEP 2106.05(e)).
Accordingly, the additional element does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea.

Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the additional information regarding the GNN does not apply the exception in a meaningful way (MPEP 2106.05(e)). Not applying the exception in a meaningful way does not provide an inventive concept, and, therefore, the claim is not patent eligible.

Regarding claim 10, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 Analysis: Claim 10 is directed to an apparatus, which is a machine, one of the statutory categories.

Step 2A Prong One Analysis: The claim recites an apparatus. The Step 2A Prong One Analysis for claim 8 is applicable here, since claim 10 recites the apparatus of claim 8 but for the recitation of the additional element in which a distance between the graph embedding corresponding to the ANN model and a second graph embedding corresponding to a second ANN model is proportional to a relative size of the ANN model and the second ANN model.

Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. In particular, the claim recites additional information regarding the models, and these elements do not apply the exception in a meaningful way (MPEP 2106.05(e)).
Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea.

Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the additional information regarding the models does not apply the exception in a meaningful way (MPEP 2106.05(e)). Not applying the exception in a meaningful way does not provide an inventive concept, and, therefore, the claim is not patent eligible.

Regarding claim 11, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 Analysis: Claim 11 is directed to an apparatus, which is a machine, one of the statutory categories.

Step 2A Prong One Analysis: The claim recites an apparatus. The Step 2A Prong One Analysis for claim 8 is applicable here, since claim 11 recites the apparatus of claim 8 but for the recitation of the additional element in which the GNN is trained based on a reconstruction loss.

Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. In particular, the claim recites additional information regarding the GNN, and this element does not apply the exception in a meaningful way (MPEP 2106.05(e)). Accordingly, the additional element does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the additional information regarding the GNN does not apply the exception in a meaningful way (MPEP 2106.05(e)). Not applying the exception in a meaningful way does not provide an inventive concept, and, therefore, the claim is not patent eligible.

Regarding claim 12, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 Analysis: Claim 12 is directed to an apparatus, which is a machine, one of the statutory categories.

Step 2A Prong One Analysis: The claim recites an apparatus. The Step 2A Prong One Analysis for claim 8 is applicable here, since claim 12 recites the apparatus of claim 8 but for the recitation of the additional element in which the graph embedding corresponding to the ANN model is unique.

Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. In particular, the claim recites additional information regarding the graph embeddings, and this element does not apply the exception in a meaningful way (MPEP 2106.05(e)). Accordingly, the additional element does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea.

Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
As discussed above with respect to the integration of the abstract idea into a practical application, the additional information regarding the graph embeddings does not apply the exception in a meaningful way (MPEP 2106.05(e)). Not applying the exception in a meaningful way does not provide an inventive concept, and, therefore, the claim is not patent eligible.

Regarding claim 13, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 Analysis: Claim 13 is directed to an apparatus, which is a machine, one of the statutory categories.

Step 2A Prong One Analysis: The claim recites an apparatus. The limitation of compile the ANN model using the set of hyperparameters, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment, and opinion, and is a process capable of being performed by a human mentally or using pen and paper. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind, then it falls within the "Mental Processes" grouping. Accordingly, the claim recites an abstract idea.

Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. The claim does not recite any additional elements which integrate the abstract idea into a practical application and, therefore, does not impose any meaningful limits on practicing the abstract idea. Therefore, the claim is directed to an abstract idea.

Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
As discussed above with respect to the integration of the abstract idea into a practical application, the claim does not recite any additional elements which provide an inventive concept, and, therefore, the claim is not patent eligible.

Regarding claim 14, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 Analysis: Claim 14 is directed to an apparatus, which is a machine, one of the statutory categories.

Step 2A Prong One Analysis: The claim recites an apparatus. The limitation of determine ... the set of hyperparameters for the ANN model using a similarity search over a set of graph embeddings corresponding to ANN models, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment, and opinion, and is a process capable of being performed by a human mentally or using pen and paper. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind, then it falls within the "Mental Processes" grouping. Accordingly, the claim recites an abstract idea.

Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. The claim does not recite any additional elements which integrate the abstract idea into a practical application and, therefore, does not impose any meaningful limits on practicing the abstract idea. Therefore, the claim is directed to an abstract idea.

Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the claim does not recite any additional elements which provide an inventive concept, and, therefore, the claim is not patent eligible.
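[Editorial note: for technical context on the similarity-search limitation discussed in claims 7 and 14 above, such a search retrieves, for a new model's graph embedding, the most similar stored embedding and reuses the hyperparameters associated with it. A minimal, hypothetical sketch follows (a 1-nearest-neighbor lookup by cosine similarity); the library contents and hyperparameter names are illustrative only and are not taken from the application:]

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def lookup_hyperparameters(query_embedding, embedding_library):
    # 1-nearest-neighbor similarity search: return the hyperparameter set
    # attached to the stored graph embedding most similar to the query.
    best = max(embedding_library,
               key=lambda entry: cosine_similarity(query_embedding, entry["embedding"]))
    return best["hyperparameters"]

# Toy library: graph embeddings of previously compiled models, each paired
# with hyperparameters that worked well for that model (names hypothetical).
library = [
    {"embedding": [0.9, 0.1, 0.0], "hyperparameters": {"tile_size": 64, "unroll": 4}},
    {"embedding": [0.1, 0.8, 0.3], "hyperparameters": {"tile_size": 128, "unroll": 2}},
]

print(lookup_hyperparameters([0.85, 0.2, 0.05], library))
# → {'tile_size': 64, 'unroll': 4}
```

The search itself is generic nearest-neighbor retrieval; what the claims tie it to is the embedding space produced by the GNN.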
Regarding claim 15, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 Analysis: Claim 15 is directed to a computer-readable medium, which is an article of manufacture, one of the statutory categories.

Step 2A Prong One Analysis: The claim recites a computer-readable medium. The limitation of generate an operator embedding to represent operators of the ANN model in an embedding space, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment, and opinion, and is a process capable of being performed by a human mentally or using pen and paper. The limitation of process ... the operator embedding, to generate a graph embedding corresponding to the ANN model according to a learned distance metric, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment, and opinion, and is a process capable of being performed by a human mentally or using pen and paper. The limitation of determine ... a set of hyperparameters for the ANN model based on the graph embedding, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment, and opinion, and is a process capable of being performed by a human mentally or using pen and paper. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind, then it falls within the "Mental Processes" grouping. Accordingly, the claim recites an abstract idea.

Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. The claim recites additional elements – a computer-readable medium, program code, and a processor.
These elements are recited at a high level of generality (i.e., as generic computer components performing the generic computer function of executing instructions) such that they amount to no more than mere instructions to apply the exception using generic computer components (MPEP 2106.05(b)). The claim recites additional elements – the artificial neural network (ANN) model and the graph neural network (GNN) – recited at a high level of generality such that they amount to no more than indicating a field of use or technological environment in which to apply the judicial exception (MPEP 2106.05(h)). The claim recites receive a representation of an artificial neural network (ANN) model, which is simply acquiring data recited at a high level of generality; this is nothing more than insignificant extra-solution activity (MPEP 2106.05(g)). Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea.

Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
As discussed above with respect to the integration of the abstract idea into a practical application, the additional element(s) of: computer-readable medium, program code, processor amount(s) to no more than mere instructions to apply the exception using generic computer components (MPEP 2106.05(b)); acquiring data amount(s) to no more than insignificant extra-solution activity (MPEP 2106.05(g)), wherein the insignificant extra-solution activity is the well-understood, routine and conventional activit(y/ies) of receiving or transmitting data over a network and/or storing and retrieving information in memory (MPEP 2106.05(d)); and artificial neural network (ANN) model, graph neural network (GNN) amount(s) to no more than indicating a field of use or technological environment in which to apply the judicial exception (MPEP 2106.05(h)). The additional element(s) do(es) not provide an inventive concept, and, therefore, the claim is not patent eligible. Regarding claim 16, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Step 1 Analysis: Claim 16 is directed to a computer-readable medium, which is directed to an article of manufacture, one of the statutory categories. Step 2A Prong One Analysis: The claim recites a(n) computer-readable medium. The Step 2A Prong One Analysis for claim 15 is applicable here since claim 16 carries out the computer-readable medium of claim 15 but for the recitation of additional element(s) of in which the GNN determines the graph embedding based on a metric learning objective. Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. In particular, the claim recites additional information regarding the GNN and the element(s) do(es) not apply the exception in a meaningful way (MPEP 2106.05(e)). 
Accordingly, the additional element(s) do(es) not integrate the abstract idea into a practical application because the additional element(s) do(es) not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea. Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the additional element(s) of additional information regarding the GNN do(es) not apply the exception in a meaningful way (MPEP 2106.05(e)). Not applying the exception in a meaningful way does not provide an inventive concept, and, therefore, the claim is not patent eligible. Regarding claim 17, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Step 1 Analysis: Claim 17 is directed to a computer-readable medium, which is directed to an article of manufacture, one of the statutory categories. Step 2A Prong One Analysis: The claim recites a(n) computer-readable medium. The Step 2A Prong One Analysis for claim 15 is applicable here since claim 17 carries out the computer-readable medium of claim 15 but for the recitation of additional element(s) of in which a distance between the graph embedding corresponding to the ANN model and a second graph embedding corresponding to a second ANN model is proportional to a relative size of the ANN model and the second ANN model. Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. In particular, the claim recites additional information regarding the models and the element(s) do(es) not apply the exception in a meaningful way (MPEP 2106.05(e)). 
Accordingly, the additional element(s) do(es) not integrate the abstract idea into a practical application because the additional element(s) do(es) not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea. Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the additional element(s) of additional information regarding the model do(es) not apply the exception in a meaningful way (MPEP 2106.05(e)). Not applying the exception in a meaningful way does not provide an inventive concept, and, therefore, the claim is not patent eligible. Regarding claim 18, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Step 1 Analysis: Claim 18 is directed to a computer-readable medium, which is directed to an article of manufacture, one of the statutory categories. Step 2A Prong One Analysis: The claim recites a(n) computer-readable medium. The Step 2A Prong One Analysis for claim 15 is applicable here since claim 18 carries out the computer-readable medium of claim 15 but for the recitation of additional element(s) of in which the GNN is trained based on a reconstruction loss. Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. In particular, the claim recites additional information regarding the GNN and the element(s) do(es) not apply the exception in a meaningful way (MPEP 2106.05(e)). Accordingly, the additional element(s) do(es) not integrate the abstract idea into a practical application because the additional element(s) do(es) not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea. 
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the additional element(s) of additional information regarding the GNN do(es) not apply the exception in a meaningful way (MPEP 2106.05(e)). Not applying the exception in a meaningful way does not provide an inventive concept, and, therefore, the claim is not patent eligible. Regarding claim 19, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Step 1 Analysis: Claim 19 is directed to a computer-readable medium, which is directed to an article of manufacture, one of the statutory categories. Step 2A Prong One Analysis: The claim recites a(n) computer-readable medium. The Step 2A Prong One Analysis for claim 15 is applicable here since claim 19 carries out the computer-readable medium of claim 15 but for the recitation of additional element(s) of in which the graph embedding corresponding to the ANN model is unique. Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. In particular, the claim recites additional information regarding the graph embeddings and the element(s) do(es) not apply the exception in a meaningful way (MPEP 2106.05(e)). Accordingly, the additional element(s) do(es) not integrate the abstract idea into a practical application because the additional element(s) do(es) not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea. Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. 
As discussed above with respect to the integration of the abstract idea into a practical application, the additional element(s) of additional information regarding the graph embeddings do(es) not apply the exception in a meaningful way (MPEP 2106.05(e)). Not applying the exception in a meaningful way does not provide an inventive concept, and, therefore, the claim is not patent eligible. Regarding claim 20, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Step 1 Analysis: Claim 20 is directed to a computer-readable medium, which is directed to an article of manufacture, one of the statutory categories. Step 2A Prong One Analysis: The claim recites a(n) computer-readable medium. The limitation of compile the ANN model using the set of hyperparameters, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment and opinion and is a process capable of being performed by a human mentally or using pen and paper. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind, then it falls within the "Mental Processes" grouping. Accordingly, the claim recites an abstract idea. Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. The claim does not recite any additional elements which integrate the abstract idea into a practical application and, therefore, does not impose any meaningful limits on practicing the abstract idea. Therefore, the claim is directed to an abstract idea. Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. 
As discussed above with respect to the integration of the abstract idea into a practical application, the claim does not recite any additional elements which provide an inventive concept, and, therefore, the claim is not patent eligible. Regarding claim 21, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Step 1 Analysis: Claim 21 is directed to a computer-readable medium, which is directed to an article of manufacture, one of the statutory categories. Step 2A Prong One Analysis: The claim recites a(n) computer-readable medium. The limitation of determine ... the set of hyperparameters for the ANN model using a similarity search over a set of graph embeddings corresponding to ANN models, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment and opinion and is a process capable of being performed by a human mentally or using pen and paper. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind, then it falls within the "Mental Processes" grouping. Accordingly, the claim recites an abstract idea. Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. The claim does not recite any additional elements which integrate the abstract idea into a practical application and, therefore, does not impose any meaningful limits on practicing the abstract idea. Therefore, the claim is directed to an abstract idea. Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. 
As discussed above with respect to the integration of the abstract idea into a practical application, the claim does not recite any additional elements which provide an inventive concept, and, therefore, the claim is not patent eligible. Regarding claim 22, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Step 1 Analysis: Claim 22 is directed to an apparatus, which is directed to a machine, one of the statutory categories. Step 2A Prong One Analysis: The claim recites a(n) apparatus. The limitation of generating an operator embedding to represent operators of the ANN model in an embedding space, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment and opinion and is a process capable of being performed by a human mentally or using pen and paper. The limitation of processing ... the operator embedding, to generate a graph embedding corresponding to the ANN model according to a learned distance metric, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment and opinion and is a process capable of being performed by a human mentally or using pen and paper. The limitation of determining ... a set of hyperparameters for the ANN model based on the graph embedding, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment and opinion and is a process capable of being performed by a human mentally or using pen and paper. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind, then it falls within the "Mental Processes" grouping. Accordingly, the claim recites an abstract idea. 
Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. The claim recites additional element(s) – apparatus. The additional element(s) is/are recited at a high-level of generality (i.e., as generic computer components performing generic computer functions of executing instructions on the computers) such that it amounts to no more than mere instructions to apply the exception using generic computer components (MPEP 2106.05(b)). The claim recites additional element(s) – artificial neural network (ANN) model, graph neural network (GNN). The additional element(s) is/are recited at a high-level of generality such that it amounts to no more than indicating a field of use or technological environment in which to apply the judicial exception (MPEP 2106.05(h)). The claim recites receiving a representation of an artificial neural network (ANN) model, which is simply acquiring data recited at a high level of generality. This is nothing more than insignificant extra-solution activity (MPEP 2106.05(g)). Accordingly, the additional element(s) do(es) not integrate the abstract idea into a practical application because the additional element(s) do(es) not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea. Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. 
As discussed above with respect to the integration of the abstract idea into a practical application, the additional element(s) of: apparatus amount(s) to no more than mere instructions to apply the exception using generic computer components (MPEP 2106.05(b)); acquiring data amount(s) to no more than insignificant extra-solution activity (MPEP 2106.05(g)), wherein the insignificant extra-solution activity is the well-understood, routine and conventional activit(y/ies) of receiving or transmitting data over a network and/or storing and retrieving information in memory (MPEP 2106.05(d)); and artificial neural network (ANN) model, graph neural network (GNN) amount(s) to no more than indicating a field of use or technological environment in which to apply the judicial exception (MPEP 2106.05(h)). The additional element(s) do(es) not provide an inventive concept, and, therefore, the claim is not patent eligible. Regarding claim 23, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Step 1 Analysis: Claim 23 is directed to an apparatus, which is directed to a machine, one of the statutory categories. Step 2A Prong One Analysis: The claim recites a(n) apparatus. The Step 2A Prong One Analysis for claim 22 is applicable here since claim 23 carries out the apparatus of claim 22 but for the recitation of additional element(s) of in which the GNN determines the graph embedding based on a metric learning objective. Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. In particular, the claim recites additional information regarding the GNN and the element(s) do(es) not apply the exception in a meaningful way (MPEP 2106.05(e)). 
Accordingly, the additional element(s) do(es) not integrate the abstract idea into a practical application because the additional element(s) do(es) not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea. Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the additional element(s) of additional information regarding the GNN do(es) not apply the exception in a meaningful way (MPEP 2106.05(e)). Not applying the exception in a meaningful way does not provide an inventive concept, and, therefore, the claim is not patent eligible. Regarding claim 24, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Step 1 Analysis: Claim 24 is directed to an apparatus, which is directed to a machine, one of the statutory categories. Step 2A Prong One Analysis: The claim recites a(n) apparatus. The Step 2A Prong One Analysis for claim 22 is applicable here since claim 24 carries out the apparatus of claim 22 but for the recitation of additional element(s) of in which a distance between the graph embedding corresponding to the ANN model and a second graph embedding corresponding to a second ANN model is proportional to a relative size of the ANN model and the second ANN model. Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. In particular, the claim recites additional information regarding the models and the element(s) do(es) not apply the exception in a meaningful way (MPEP 2106.05(e)). 
Accordingly, the additional element(s) do(es) not integrate the abstract idea into a practical application because the additional element(s) do(es) not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea. Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the additional element(s) of additional information regarding the model do(es) not apply the exception in a meaningful way (MPEP 2106.05(e)). Not applying the exception in a meaningful way does not provide an inventive concept, and, therefore, the claim is not patent eligible. Regarding claim 25, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Step 1 Analysis: Claim 25 is directed to an apparatus, which is directed to a machine, one of the statutory categories. Step 2A Prong One Analysis: The claim recites a(n) apparatus. The Step 2A Prong One Analysis for claim 22 is applicable here since claim 25 carries out the apparatus of claim 22 but for the recitation of additional element(s) of in which the GNN is trained based on a reconstruction loss. Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. In particular, the claim recites additional information regarding the GNN and the element(s) do(es) not apply the exception in a meaningful way (MPEP 2106.05(e)). Accordingly, the additional element(s) do(es) not integrate the abstract idea into a practical application because the additional element(s) do(es) not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea. 
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the additional element(s) of additional information regarding the GNN do(es) not apply the exception in a meaningful way (MPEP 2106.05(e)). Not applying the exception in a meaningful way does not provide an inventive concept, and, therefore, the claim is not patent eligible. Regarding claim 26, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Step 1 Analysis: Claim 26 is directed to an apparatus, which is directed to a machine, one of the statutory categories. Step 2A Prong One Analysis: The claim recites a(n) apparatus. The Step 2A Prong One Analysis for claim 22 is applicable here since claim 26 carries out the apparatus of claim 22 but for the recitation of additional element(s) of in which the graph embedding corresponding to the ANN model is unique. Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. In particular, the claim recites additional information regarding the graph embeddings and the element(s) do(es) not apply the exception in a meaningful way (MPEP 2106.05(e)). Accordingly, the additional element(s) do(es) not integrate the abstract idea into a practical application because the additional element(s) do(es) not impose any meaningful limits on practicing the abstract idea, and, therefore, the claim is directed to an abstract idea. Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. 
As discussed above with respect to the integration of the abstract idea into a practical application, the additional element(s) of additional information regarding the graph embeddings do(es) not apply the exception in a meaningful way (MPEP 2106.05(e)). Not applying the exception in a meaningful way does not provide an inventive concept, and, therefore, the claim is not patent eligible. Regarding claim 27, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Step 1 Analysis: Claim 27 is directed to an apparatus, which is directed to a machine, one of the statutory categories. Step 2A Prong One Analysis: The claim recites a(n) apparatus. The limitation of compiling the ANN model using the set of hyperparameters, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment and opinion and is a process capable of being performed by a human mentally or using pen and paper. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind, then it falls within the "Mental Processes" grouping. Accordingly, the claim recites an abstract idea. Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. The claim does not recite any additional elements which integrate the abstract idea into a practical application and, therefore, does not impose any meaningful limits on practicing the abstract idea. Therefore, the claim is directed to an abstract idea. Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. 
As discussed above with respect to the integration of the abstract idea into a practical application, the claim does not recite any additional elements which provide an inventive concept, and, therefore, the claim is not patent eligible. Regarding claim 28, the claim is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Step 1 Analysis: Claim 28 is directed to an apparatus, which is directed to a machine, one of the statutory categories. Step 2A Prong One Analysis: The claim recites a(n) apparatus. The limitation of determining ... the set of hyperparameters for the ANN model using a similarity search over a set of graph embeddings corresponding to ANN models, as drafted, is a process that, under its broadest reasonable interpretation, covers a mental process. The limitation is directed to observation, evaluation, judgment and opinion and is a process capable of being performed by a human mentally or using pen and paper. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind, then it falls within the "Mental Processes" grouping. Accordingly, the claim recites an abstract idea. Step 2A Prong Two Analysis: With respect to the abstract idea, the judicial exception is not integrated into a practical application. The claim does not recite any additional elements which integrate the abstract idea into a practical application and, therefore, does not impose any meaningful limits on practicing the abstract idea. Therefore, the claim is directed to an abstract idea. Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to the integration of the abstract idea into a practical application, the claim does not recite any additional elements which provide an inventive concept, and, therefore, the claim is not patent eligible. 
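For orientation, the pipeline recited across the claims analyzed above (generating an operator embedding, processing it into a graph embedding, and determining hyperparameters from that embedding via a similarity search) can be sketched as ordinary code. This is a minimal, hypothetical illustration only: it is not the applicant's, Zhou's, or Tu's implementation, and every name, vocabulary entry, and value in it is an assumption introduced for illustration.

```python
# Hypothetical sketch of the claimed pipeline: operator embedding ->
# graph embedding -> hyperparameter lookup. All names/values illustrative.

OP_VOCAB = ["conv", "relu", "pool", "dense"]  # assumed operator vocabulary

def operator_embedding(op: str) -> list[float]:
    """One-hot vector standing in for a learned operator embedding."""
    return [1.0 if op == v else 0.0 for v in OP_VOCAB]

def graph_embedding(ops: list[str]) -> list[float]:
    """Mean-pool operator embeddings as a stand-in for a GNN readout."""
    embs = [operator_embedding(op) for op in ops]
    n = len(embs)
    return [sum(col) / n for col in zip(*embs)]

def nearest_hyperparameters(emb, table):
    """Similarity search (squared Euclidean distance) over a table of
    (graph embedding, tuned hyperparameters) pairs for known models."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(table, key=lambda row: dist(emb, row[0]))[1]

# Assumed lookup table of previously tuned models.
TABLE = [
    (graph_embedding(["conv", "relu", "pool"]), {"tile": 32}),
    (graph_embedding(["dense", "relu", "dense"]), {"tile": 8}),
]

query = graph_embedding(["conv", "relu", "relu", "pool"])
print(nearest_hyperparameters(query, TABLE))  # closest known model wins
```

The query model is structurally closest to the conv/relu/pool entry, so its hyperparameters are returned; a learned distance metric would replace the plain Euclidean distance used here.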
Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1-2, 5-6, 8-9, 12-13, 15-16, 19-20, 22-23, 26-27 is/are rejected under 35 U.S.C. 103 as being unpatentable over Zhou et al. (US 2023/0176840 A1 – Learned Graph Optimizations for Compilers, hereinafter referred to as “Zhou”). 
Regarding claim 1, Zhou teaches a processor-implemented (Zhou, [0067] – teaches using multiple computers) method of pre-processing for deep neural network compilation (Zhou, [0067] – teaches compiler optimization network to generate an optimization plan for an input program), comprising: receiving a representation of an artificial neural network (ANN) model (Zhou, [0068] – teaches receiving an operational graph; see also Zhou, [0007] – teaches that machine learning algorithms are represented by computational graphs); generating an operator embedding to represent operators of the ANN model in an embedding space (Zhou, [0053] – teaches representing the computational graph [ML algorithm] as an encoding of meta features for nodes and edges, including operation type); processing, by a graph neural network (GNN) (Zhou, [0052] – teaches the graph embedding network is a graph neural network), the operator embedding, to generate a graph embedding corresponding to the ANN model (Zhou, [0070] – teaches generating a graph embedding using a graph embedding network for the input program [ML algorithm]) according to a learned distance metric (Zhou, [0053]-[0054] – teaches generating graph embeddings based on neighbor nodes; see also Zhou, [0051] – proximal policy optimization); and determining, by the GNN, a set of hyperparameters for the ANN model based on the graph embedding (Zhou, [0071]-[0072] – teaches using the graph embedding to output an optimization plan for the input program). Regarding claim 2, Zhou teaches all of the limitations of the method of claim 1 as noted above. Zhou further teaches in which the GNN determines the graph embedding based on a metric learning objective (Zhou, [0053]-[0054] – teaches generating graph embeddings based on neighbor nodes; see also Zhou, [0051] – proximal policy optimization). Regarding claim 5, Zhou teaches all of the limitations of the method of claim 1 as noted above. 
Zhou further teaches in which the graph embedding corresponding to the ANN model is unique (Zhou, [0061] – teaches unique model structure with unique graph embeddings; see also Zhou, [0013]-[0024] – demonstrating various examples). Regarding claim 6, Zhou teaches all of the limitations of the method of claim 1 as noted above. Zhou further teaches compiling the ANN model using the set of hyperparameters (Zhou, [0071]-[0073] – teaches compiling the network according to the optimization plan). Regarding claim 8, it is the apparatus embodiment of claim 1 with similar limitations to claim 1 and is rejected using the same reasoning found in claim 1. Zhou further teaches an apparatus, comprising: a memory (Zhou, [0079] – teaches executing instructions stored in memory on a processor); and at least one processor coupled to the memory, the at least one processor configured (Zhou, [0079] – teaches executing instructions stored in memory on a processor) … Regarding claim 9, the rejection of claim 8 is incorporated herein. Further, the limitations in this claim are taught by Zhou for the reasons set forth in the rejection of claim 2. Regarding claim 12, the rejection of claim 8 is incorporated herein. Further, the limitations in this claim are taught by Zhou for the reasons set forth in the rejection of claim 5. Regarding claim 13, the rejection of claim 8 is incorporated herein. Further, the limitations in this claim are taught by Zhou for the reasons set forth in the rejection of claim 6. Regarding claim 15, it is the computer-readable medium embodiment of claim 1 with similar limitations to claim 1 and is rejected using the same reasoning found in claim 1. Zhou further teaches a non-transitory computer-readable medium having program code recorded thereon, the program code executed by a processor and comprising (Zhou, [0079] – teaches executing instructions stored in memory on a processor) … Regarding claim 16, the rejection of claim 15 is incorporated herein. 
Further, the limitations in this claim are taught by Zhou for the reasons set forth in the rejection of claim 2. Regarding claim 19, the rejection of claim 15 is incorporated herein. Further, the limitations in this claim are taught by Zhou for the reasons set forth in the rejection of claim 5. Regarding claim 20, the rejection of claim 15 is incorporated herein. Further, the limitations in this claim are taught by Zhou for the reasons set forth in the rejection of claim 6. Regarding claim 22, it is the apparatus embodiment of claim 1 with similar limitations to claim 1 and is rejected using the same reasoning found in claim 1. Regarding claim 23, the rejection of claim 22 is incorporated herein. Further, the limitations in this claim are taught by Zhou for the reasons set forth in the rejection of claim 2. Regarding claim 26, the rejection of claim 22 is incorporated herein. Further, the limitations in this claim are taught by Zhou for the reasons set forth in the rejection of claim 5. Regarding claim 27, the rejection of claim 22 is incorporated herein. Further, the limitations in this claim are taught by Zhou for the reasons set forth in the rejection of claim 6. Claim(s) 3, 7, 10, 14, 17, 21, 24, 28 is/are rejected under 35 U.S.C. 103 as being unpatentable over Zhou in view of Tu et al. (AutoNE: Hyperparameter Optimization for Massive Network Embedding, hereinafter referred to as “Tu”). Regarding claim 3, Zhou teaches all of the limitations of the method of claim 1 as noted above. However, Zhou does not explicitly teach in which a distance between the graph embedding corresponding to the ANN model and a second graph embedding corresponding to a second ANN model is proportional to a relative size of the ANN model and the second ANN model. 
Tu teaches in which a distance between the graph embedding corresponding to the ANN model and a second graph embedding corresponding to a second ANN model is proportional to a relative size of the ANN model and the second ANN model (Tu, sections 3.3-3.4 – teaches that similarity between two networks is based on the ratio of network sizes).

It would have been obvious to one of ordinary skill in the art before the filing date of the claimed invention to modify Zhou with the teachings of Tu in order to automatically optimize the hyperparameters of a NE algorithm on massive networks in the field of neural network compilation using GNNs (Tu, Abstract – "Network embedding (NE) aims to embed the nodes of a network into a vector space, and serves as the bridge between machine learning and network data. Despite their widespread success, NE algorithms typically contain a large number of hyperparameters for preserving the various network properties, which must be carefully tuned in order to achieve satisfactory performance. Though automated machine learning (AutoML) has achieved promising results when applied to many types of data such as images and texts, network data poses great challenges to AutoML and remains largely ignored by the literature of AutoML. The biggest obstacle is the massive scale of real-world networks, along with the coupled node relationships that make any straightforward sampling strategy problematic. In this paper, we propose a novel framework, named AutoNE, to automatically optimize the hyperparameters of a NE algorithm on massive networks. In detail, we employ a multi-start random walk strategy to sample several small sub-networks, perform each trial of configuration selection on the sampled sub-network, and design a meta-learner to transfer the knowledge about optimal hyperparameters from the sub-networks to the original massive network. The transferred meta-knowledge greatly reduces the number of trials required when predicting the optimal hyperparameters for the original network. Extensive experiments demonstrate that our framework can significantly outperform the existing methods, in that it needs less time and fewer trials to find the optimal hyperparameters.").

Regarding claim 7, Zhou teaches all of the limitations of the method of claim 1 as noted above. However, Zhou does not explicitly teach determining, by the GNN, the set of hyperparameters for the ANN model using a similarity search over a set of graph embeddings corresponding to ANN models.

Tu teaches determining, by the GNN, the set of hyperparameters for the ANN model using a similarity search over a set of graph embeddings corresponding to ANN models (Tu, section 3.2 – teaches sampling representative sub-networks sharing similar properties).

It would have been obvious to one of ordinary skill in the art before the filing date of the claimed invention to modify Zhou with the teachings of Tu in order to automatically optimize the hyperparameters of a NE algorithm on massive networks in the field of neural network compilation using GNNs (Tu, Abstract – quoted in full in the rejection of claim 3 above).

Regarding claim 10, the rejection of claim 8 is incorporated herein. Further, the limitations in this claim are taught by Zhou in view of Tu for the reasons set forth in the rejection of claim 3.

Regarding claim 14, the rejection of claim 8 is incorporated herein. Further, the limitations in this claim are taught by Zhou in view of Tu for the reasons set forth in the rejection of claim 7.

Regarding claim 17, the rejection of claim 15 is incorporated herein. Further, the limitations in this claim are taught by Zhou in view of Tu for the reasons set forth in the rejection of claim 3.

Regarding claim 21, the rejection of claim 15 is incorporated herein. Further, the limitations in this claim are taught by Zhou in view of Tu for the reasons set forth in the rejection of claim 7.

Regarding claim 24, the rejection of claim 22 is incorporated herein. Further, the limitations in this claim are taught by Zhou in view of Tu for the reasons set forth in the rejection of claim 3.

Regarding claim 28, the rejection of claim 22 is incorporated herein. Further, the limitations in this claim are taught by Zhou in view of Tu for the reasons set forth in the rejection of claim 7.
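For readers unfamiliar with the technique at issue in the claim 3 and claim 7 rejections, a similarity search over graph embeddings can be sketched as follows. This is an illustrative sketch only: the database entries, embedding values, and hyperparameter names below are hypothetical and are not drawn from Zhou, Tu, or the application; the GNN encoder that would produce the embeddings is not shown.

```python
import numpy as np

# Hypothetical database of previously seen ANN models: each entry pairs a
# graph embedding (as a GNN encoder might produce) with the hyperparameter
# set that worked well for that model. All names and values are illustrative.
embedding_db = {
    "resnet_like":    (np.array([0.9, 0.1, 0.4]), {"tile_size": 64, "unroll": 4}),
    "mobilenet_like": (np.array([0.2, 0.8, 0.3]), {"tile_size": 32, "unroll": 2}),
}

def nearest_hyperparameters(query_embedding):
    """Similarity search: return the stored hyperparameter set whose graph
    embedding lies closest (Euclidean distance) to the query embedding."""
    best_name, best_dist = None, float("inf")
    for name, (emb, _) in embedding_db.items():
        dist = float(np.linalg.norm(query_embedding - emb))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return embedding_db[best_name][1]

# A new model whose embedding lands near the "resnet_like" entry
# inherits that entry's hyperparameter set.
params = nearest_hyperparameters(np.array([0.85, 0.15, 0.35]))
```

The design point contested in the rejection is what the embedding distance encodes: under the claim 3 limitation, the distance between two models' embeddings would track their relative sizes, so that structurally similar networks of similar scale retrieve similar hyperparameters.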
Claim(s) 4, 11, 18, 25 is/are rejected under 35 U.S.C. 103 as being unpatentable over Zhou in view of Yu et al. ("Auto Graph Encoder-Decoder for Neural Network Pruning," hereinafter referred to as "Yu").

Regarding claim 4, Zhou teaches all of the limitations of the method of claim 1 as noted above. However, Zhou does not explicitly teach in which the GNN is trained based on a reconstruction loss.

Yu teaches in which the GNN is trained based on a reconstruction loss (Yu, section 3.3 – teaches training based on a reward function defined as reconstruction loss).

It would have been obvious to one of ordinary skill in the art before the filing date of the claimed invention to modify Zhou with the teachings of Yu in order to improve performance and compression ratio with fewer search steps in the field of neural network compilation using GNNs (Yu, Abstract – "Model compression aims to deploy deep neural networks (DNN) on mobile devices with limited computing and storage resources. However, most of the existing model compression methods rely on manually defined rules, which require domain expertise. DNNs are essentially computational graphs, which contain rich structural information. In this paper, we aim to find a suitable compression policy from DNNs' structural information. We propose an automatic graph encoder-decoder model compression (AGMC) method combined with graph neural networks (GNN) and reinforcement learning (RL). We model the target DNN as a graph and use GNN to learn the DNN's embeddings automatically. We compared our method with rule-based DNN embedding model compression methods to show the effectiveness of our method. Results show that our learning-based DNN embedding achieves better performance and a higher compression ratio with fewer search steps. We evaluated our method on over-parameterized and mobile-friendly DNNs and compared our method with handcrafted and learning-based model compression approaches. On over-parameterized DNNs, such as ResNet-56, our method outperformed handcrafted and learning-based methods with 4.36% and 2.56% higher accuracy, respectively. Furthermore, on MobileNet-v2, we achieved a higher compression ratio than state-of-the-art methods with just 0.93% accuracy loss.").

Regarding claim 11, the rejection of claim 8 is incorporated herein. Further, the limitations in this claim are taught by Zhou in view of Yu for the reasons set forth in the rejection of claim 4.

Regarding claim 18, the rejection of claim 15 is incorporated herein. Further, the limitations in this claim are taught by Zhou in view of Yu for the reasons set forth in the rejection of claim 4.

Regarding claim 25, the rejection of claim 22 is incorporated herein. Further, the limitations in this claim are taught by Zhou in view of Yu for the reasons set forth in the rejection of claim 4.

Conclusion

Any inquiry concerning this communication or earlier communication from the examiner should be directed to MARSHALL WERNER, whose telephone number is (469) 295-9143. The examiner can normally be reached Monday – Thursday, 7:30 AM – 4:30 PM ET.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kamran Afshar, can be reached at (571) 272-7796. The fax number for the organization where this application or proceeding is assigned is (571) 273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MARSHALL L WERNER/
Primary Examiner, Art Unit 2125

Prosecution Timeline

Jun 06, 2023
Application Filed
Feb 13, 2026
Non-Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12585968
SYSTEM AND METHOD FOR TESTING MACHINE LEARNING
2y 5m to grant Granted Mar 24, 2026
Patent 12579111
CROSS-DOMAIN STRUCTURAL MAPPING IN MACHINE LEARNING PROCESSING
2y 5m to grant Granted Mar 17, 2026
Patent 12568890
Apparatus and Method for Controlling a Growth Environment of a Plant
2y 5m to grant Granted Mar 10, 2026
Patent 12554967
USING NEGATIVE EVIDENCE TO PREDICT EVENT DATASETS
2y 5m to grant Granted Feb 17, 2026
Patent 12547918
Stochastic Control with a Quantum Computer
2y 5m to grant Granted Feb 10, 2026
Based on this examiner's 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
66%
Grant Probability
99%
With Interview (+44.3%)
3y 11m
Median Time to Grant
Low
PTA Risk
Based on 200 resolved cases by this examiner. Grant probability derived from career allow rate.
