Detailed Action
This action is in response to the amendment filed 11/25/2025 for application 17/936,099, in which:
Claims 1, 9, and 17 are the independent claims.
Claims 1, 9, and 17 have been amended.
Claims 1-20 are currently pending.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 03/09/2023 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant's arguments filed 11/25/2025 have been fully considered but they are not persuasive.
Regarding the 35 U.S.C. § 101 Rejections:
Applicant's arguments regarding the 35 U.S.C. § 101 rejections of the previous office action have been fully considered, but are unpersuasive.
Applicant disagrees (Page 10) with the 35 U.S.C. § 101 rejections of the pending claims, asserting that the claims have been amended to recite patent-eligible subject matter. Specifically, Applicant argues that the amended independent claims recite limitations that cannot be practically performed in the human mind, and that, even assuming, arguendo, that the claims are directed to an abstract idea, the amendments integrate any such abstract idea into a practical application.
Examiner respectfully disagrees. For the reasons given below and in the 35 U.S.C. § 101 rejections, the claims are directed to an abstract idea (Step 2A Prong 1) and do not integrate the abstract idea into a practical application (Step 2A Prong 2). The claims recite abstract ideas a-c, where the abstract ideas are evaluations/judgements that can be performed in the human mind (or by a human using pen and paper). The independent claim is no more detailed than using a method to extract specific features (mental process) via machine learning (additional element), generate performance metrics (mental process) via machine learning (additional element), and select a specific machine-learning model to process a dataset (mental process). The additional elements noted within Step 2A Prong 2 do not amount to significantly more than the judicial exception (whether evaluated individually or holistically), as they merely apply the abstract idea on a computer or restrict the abstract idea to a specific technological environment. Thus, the additional elements are not able to integrate the abstract ideas into a practical application, as they fall within MPEP 2106.05. The claims are directed towards the improvement of an abstract idea, and improvements to an abstract idea are still considered an abstract idea. Additionally, the claims do not reflect any improvement in the functioning of a computer or hardware processor; rather, the additional elements merely use a generic computer component to perform the abstract idea or restrict the abstract idea to a particular technological environment. Therefore, the claims do not integrate the judicial exception into a practical application nor amount to significantly more. For the reasons given above and in the rejections below, the rejections of all claims (including Claim 1, the similar independent claims, and all dependent claims) are maintained.
More specific details are discussed below within the 35 U.S.C. § 101 rejections.
Regarding the 35 U.S.C. § 102 Rejections:
Applicant's arguments regarding the 35 U.S.C. § 102 rejections of the previous office action have been fully considered, but are unpersuasive.
Applicant asserts (Pages 11-12) that Jiang fails to disclose extracting meta-graph features representing structural characteristics from a graph representation of a dataset, as recited in the amended independent claims. The applicant supports this assertion by noting Jiang’s discussion of an MGAR model, model interactions, construction of multiple meta-graphs, and the use of CNNs and attention mechanisms to learn user/event latent factors, arguing that construction of multiple meta-graphs does not suggest extraction.
Examiner respectfully disagrees. Figure 2 of Jiang shows the architecture, in which the figure denotes ‘Extract’ leading to the meta-graphs also denoted within the figure. Note also Jiang’s discussion of the LDA model used to extract event topics (interpreted as meta-graph features), which are used within the EBSNs. Table 2 is another example of extraction of meta-graph features from a graph representation (Figure 2: Model event topics (T)).
Applicant asserts (Pages 12-13) that Jiang fails to disclose the selecting a first type of machine-learning model … limitation as recited within the independent claims. The applicant notes Jiang’s discussion of CNN-based embeddings to learn user/event embeddings, arguing that Jiang does not disclose the selecting a first type of machine-learning model … limitation.
Applicant's arguments fail to comply with 37 CFR 1.111(b) because they amount to a general allegation that the claims define a patentable invention without specifically pointing out how the language of the claims patentably distinguishes them from the references.
Applicant concludes that, for at least these reasons, Jiang does not anticipate currently amended independent claims 1, 9, and 17, and that, because dependent claims 2-8, 10-16, and 18-20 depend from one of the currently amended independent claims, Jiang likewise does not anticipate these dependent claims, in addition to the patentable subject matter included therein.
Applicant’s arguments regarding the other independent and dependent claims rely upon the same assertions as those made with respect to Claim 1 and are thus likewise unpersuasive. Therefore, for the reasons given above and in the rejections below, the rejections of all claims (including Claim 1, the similar independent claims, and all dependent claims) are maintained. More specific details are discussed below within the 35 U.S.C. § 102 rejections (updated for clarity and amendments).
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Regarding Claim 1:
Subject Matter Eligibility Analysis Step 1:
Claim 1 recites a method, thus a process, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 1 further recites the method comprising:
extracting … meta-graph features representing structural characteristics from a graph representation of a dataset, the graph representation comprising a plurality of nodes and a plurality of edges indicating relationships between the plurality of nodes (a human being can mentally apply evaluation to extract features from a meta-graph representing specific graph characteristics)
generating … a plurality of estimated graph learning performance metrics for a plurality of machine-learning models according to the meta-graph features of the graph representation … (a human being can make a judgement for generating a plurality of estimated metrics for machine-learning models based on specific features)
selecting a first type of machine-learning model from the plurality of machine-learning models to process the dataset associated with the graph representation according to the plurality of estimated graph learning performance metrics … (a human being can mentally apply evaluation to select a type of machine-learning model to process a specific dataset)
Claim 1 thus recites an abstract idea (that falls into the “mental processes” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because the additional elements consist of:
… utilizing a graph feature machine-learning model … (to perform a mental process and the performance of an abstract idea on a computer is no more than instructions to “apply it” on a computer, by MPEP 2106.05(f))
… wherein the plurality of estimated graph learning performance metrics indicate predicted performances of the plurality of machine-learning models in executing a graph learning task on the graph representation (which is restricting the abstract idea to a Particular Technological Environment, by MPEP 2106.05(h))
… the plurality of machine-learning models comprising a plurality of different types of machine-learning models associated with graph learning tasks (which is restricting the abstract idea to a Particular Technological Environment, by MPEP 2106.05(h))
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because the additional elements, alone or in combination, do not provide significantly more than the abstract idea itself. Additional element a merely applies the abstract idea on a computer (MPEP 2106.05(f)), which cannot provide significantly more. Additional elements b and c merely restrict the abstract idea to a Particular Technological Environment (MPEP 2106.05(h)), which cannot provide significantly more. Thus, the claim is subject-matter ineligible.
Regarding Claim 2:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 2 recites the method of Claim 1. Claim 1 is a method, thus a process, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 2 further recites the method comprising:
extracting local structural characteristics of the plurality of nodes and the plurality of edges (a human being can mentally apply evaluation to extract local structural characteristics of nodes and edges)
extracting global structural characteristics of the plurality of nodes (a human being can mentally apply evaluation to extract global structural characteristics of nodes)
Claim 2 thus recites an abstract idea (that falls into the “mental processes” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because there are no new additional elements recited.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because there are no new additional elements recited. The judicial exception alone does not provide significantly more than the abstract idea itself. Thus, the claim is subject-matter ineligible.
Regarding Claim 3:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 3 recites the method of Claim 2. Claim 2 is a method, thus a process, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 3 further recites the method comprising:
generating a feature matrix comprising a plurality of rows corresponding to the plurality of nodes and the plurality of edges of the graph representation according to the local structural characteristics (a mathematical relationship between variables and/or numbers using a mathematical formula/equations)
generating a meta-graph feature vector comprising a fixed-dimension for the graph representation based on the feature matrix utilizing the global structural characteristics (a mathematical relationship between variables and/or numbers using a mathematical formula/equations)
Claim 3 thus recites an abstract idea (that falls into the “mathematical concepts” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because there are no new additional elements recited.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because there are no new additional elements recited. The judicial exception alone does not provide significantly more than the abstract idea itself. Thus, the claim is subject-matter ineligible.
Regarding Claim 4:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 4 recites the method of Claim 2. Claim 2 is a method, thus a process, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 4 further recites the method comprising generating one or more latent feature vectors representing a node degree, a number of wedges, a number of triangles centered at each node of the plurality of nodes, or a frequency of triangles for each edge of the plurality of edges (a mathematical relationship between variables and/or numbers using a mathematical formula/equations). Claim 4 thus recites an abstract idea (that falls into the “mathematical concepts” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because there are no new additional elements recited.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because there are no new additional elements recited. The judicial exception alone does not provide significantly more than the abstract idea itself. Thus, the claim is subject-matter ineligible.
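For illustration only, and not as a characterization of Applicant’s actual implementation, the quantities recited in Claim 4 (node degree, wedges, triangles centered at each node, and triangle frequency per edge) correspond to routine graph statistics computable directly from an adjacency structure. A minimal plain-Python sketch over a hypothetical toy graph:

```python
from itertools import combinations

# Hypothetical toy graph given as an edge list; all names are illustrative only.
edges = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

degree = {n: len(nbrs) for n, nbrs in adj.items()}
# Wedges centered at n: each pair of neighbors of n forms a length-2 path.
wedges = {n: d * (d - 1) // 2 for n, d in degree.items()}
# Triangles centered at n: neighbor pairs of n that are themselves adjacent.
triangles = {n: sum(1 for x, y in combinations(adj[n], 2) if y in adj[x])
             for n in adj}
# Triangle frequency for edge (u, v): common neighbors of u and v.
tri_per_edge = {(u, v): len(adj[u] & adj[v]) for u, v in edges}
```

Each dictionary above is a latent feature vector indexed by node (or edge), of the kind the limitation recites.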
Regarding Claim 5:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 5 recites the method of Claim 2. Claim 2 is a method, thus a process, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 5 further recites the method comprising generating one or more latent feature vectors representing an importance score of each node of the plurality of nodes, an eccentricity of each node of the plurality of nodes, or a k-core number of each node of the plurality of nodes (a mathematical relationship between variables and/or numbers using a mathematical formula/equations). Claim 5 thus recites an abstract idea (that falls into the “mathematical concepts” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because there are no new additional elements recited.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because there are no new additional elements recited. The judicial exception alone does not provide significantly more than the abstract idea itself. Thus, the claim is subject-matter ineligible.
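For illustration only, and not as a characterization of Applicant’s actual implementation, two of the per-node quantities recited in Claim 5 (eccentricity and k-core number) can be sketched as short, deterministic procedures over a hypothetical toy graph; an importance score (e.g., a centrality measure) would be computed similarly and is omitted for brevity:

```python
from collections import deque

# Hypothetical toy graph as an adjacency map; all names are illustrative only.
adj = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"}, "d": {"c"}}

def eccentricity(adj, src):
    # Longest shortest-path distance from src, found via breadth-first search.
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return max(dist.values())

def core_numbers(adj):
    # Iteratively peel the minimum-degree node; the k-core number of a node
    # is the largest minimum degree seen up to the point of its removal.
    deg = {n: len(nbrs) for n, nbrs in adj.items()}
    core, remaining, k = {}, set(adj), 0
    while remaining:
        n = min(remaining, key=lambda x: deg[x])
        k = max(k, deg[n])
        core[n] = k
        remaining.remove(n)
        for v in adj[n]:
            if v in remaining:
                deg[v] -= 1
    return core
```

On this toy graph, the triangle a-b-c forms a 2-core while the pendant node d has k-core number 1.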
Regarding Claim 6:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 6 recites the method of Claim 1. Claim 1 is a method, thus a process, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 6 further recites the method comprising determining … based on learned mappings between meta-graph features and model graph learning performance metrics of the plurality of machine-learning models (a human being can apply mental evaluation to make a determination based on learned mappings between features and metrics). Claim 6 thus recites an abstract idea (that falls into the “mental processes” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because there are no new additional elements recited.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because there are no new additional elements recited. The judicial exception alone does not provide significantly more than the abstract idea itself. Thus, the claim is subject-matter ineligible.
Regarding Claim 7:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 7 recites the method of Claim 1. Claim 1 is a method, thus a process, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 7 further recites the method comprising:
generating a meta-graph comprising a plurality of graph nodes corresponding to graph features for a graph dataset and a plurality of model nodes corresponding to model factors for the plurality of machine-learning models (a human being can make a judgement for generating a meta-graph comprising specific representations for model and graph nodes)
generating the plurality of estimated graph learning performance metrics based on the meta-graph features and relationships between the plurality of graph nodes and the plurality of model nodes in the meta-graph (a human being can make a judgement for generating estimated graph learning performance based on specific feature and relationship constraints)
Claim 7 thus recites an abstract idea (that falls into the “mental processes” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because there are no new additional elements recited.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because there are no new additional elements recited. The judicial exception alone does not provide significantly more than the abstract idea itself. Thus, the claim is subject-matter ineligible.
Regarding Claim 8:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 8 recites the method of Claim 7. Claim 7 is a method, thus a process, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 8 further recites the method comprising:
determining the graph features for the graph dataset and the model factors for the plurality of machine-learning models … (a human being can apply mental evaluation to determine the graph features for the graph data set and model factors of ML models)
… by factorizing a performance matrix comprising model graph learning performance metrics of the plurality of machine-learning models according to the graph dataset (a mathematical relationship between variables and/or numbers using a mathematical formula/equations)
Claim 8 thus recites an abstract idea (that falls into the “mathematical concepts” or “mental processes” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because there are no new additional elements recited.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because there are no new additional elements recited. The judicial exception alone does not provide significantly more than the abstract idea itself. Thus, the claim is subject-matter ineligible.
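For illustration only, and not as a characterization of Applicant’s actual implementation, “factorizing a performance matrix” into per-dataset graph factors and per-model factors can be sketched as a rank-1 factorization fit by plain gradient descent over a hypothetical performance matrix (all values below are invented for the sketch):

```python
# Hypothetical performance matrix P: rows = graph datasets, columns = models;
# each entry is an observed graph learning performance score.
P = [
    [0.9, 0.6],
    [0.6, 0.4],
]

# Fit P ~= outer(g, m) by gradient descent, so g holds latent graph factors
# and m holds latent model factors (rank 1 for simplicity).
g = [0.5, 0.5]
m = [0.5, 0.5]
lr = 0.1
for _ in range(3000):
    for i in range(len(g)):
        for j in range(len(m)):
            err = P[i][j] - g[i] * m[j]
            g[i] += lr * err * m[j]
            m[j] += lr * err * g[i]
```

After fitting, the product of each graph factor with each model factor reconstructs the corresponding performance entry, which is the mathematical relationship the limitation recites.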
Regarding Claim 9:
Subject Matter Eligibility Analysis Step 1:
Claim 9 recites a system, thus a machine, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 9 further recites the system comprising:
extracting … meta-graph features comprising structural characteristics from a graph representation of a dataset in a latent space, the graph representation comprising a plurality of nodes and a plurality of edges indicating relationships between the plurality of nodes (a human being can mentally apply evaluation to extract features from a meta-graph representing specific graph characteristics)
generating … a plurality of estimated graph learning performance metrics for the plurality of machine-learning models according to the meta-graph features of the graph representation (a human being can make a judgement for generating a plurality of estimated metrics for machine-learning models based on specific features)
selecting a first type of machine-learning model from the plurality of machine-learning models to process the dataset associated with the graph representation according to the plurality of estimated graph learning performance metrics … (a human being can mentally apply evaluation to select a type of machine-learning model to process a specific dataset)
Claim 9 thus recites an abstract idea (that falls into the “mental processes” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because the additional elements consist of:
a memory component; and a processing device coupled to the memory component, the processing device to perform operations comprising: (to perform a mental process and the performance of an abstract idea on a computer is no more than instructions to “apply it” on a computer, by MPEP 2106.05(f))
… utilizing a graph feature machine-learning model comprising parameters learned based on a graph dataset and corresponding model graph learning performances for a plurality of machine-learning models … (to perform a mental process and the performance of an abstract idea on a computer is no more than instructions to “apply it” on a computer, by MPEP 2106.05(f))
… wherein the plurality of estimated graph learning performance metrics indicate predicted performances of the plurality of machine-learning models in executing a graph learning task on the graph representation (which is restricting the abstract idea to a Particular Technological Environment, by MPEP 2106.05(h))
… the plurality of machine-learning models comprising a plurality of different types of machine-learning models associated with graph learning tasks (which is restricting the abstract idea to a Particular Technological Environment, by MPEP 2106.05(h))
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because the additional elements, alone or in combination, do not provide significantly more than the abstract idea itself. Additional elements a and b merely apply the abstract idea on a computer (MPEP 2106.05(f)), which cannot provide significantly more. Additional elements c and d merely restrict the abstract idea to a Particular Technological Environment (MPEP 2106.05(h)), which cannot provide significantly more. Thus, the claim is subject-matter ineligible.
Regarding Claim 10:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 10 recites the system of Claim 9. Claim 9 is a system, thus a machine, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 10 further recites the system comprising:
generating … a plurality of structural feature matrices comprising local structural characteristics of the graph representation (a mathematical relationship between variables and/or numbers using a mathematical formula/equations)
extracting the meta-graph features based on the plurality of structural feature matrices (a human being can mentally apply evaluation to extract meta-graph features based on structural feature matrices)
Claim 10 thus recites an abstract idea (that falls into the “mental processes” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because there are no new additional elements recited.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because there are no new additional elements recited. The judicial exception alone does not provide significantly more than the abstract idea itself. Thus, the claim is subject-matter ineligible.
Regarding Claim 11:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 11 recites the system of Claim 10. Claim 10 is a system, thus a machine, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 11 further recites the system comprising generating … a fixed-dimension meta-graph feature vector by modifying the plurality of structural feature matrices according to a set of global statistical characteristics associated with the graph representation (a mathematical relationship between variables and/or numbers using a mathematical formula/equations). Claim 11 thus recites an abstract idea (that falls into the “mathematical concepts” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because there are no new additional elements recited.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because there are no new additional elements recited. The judicial exception alone does not provide significantly more than the abstract idea itself. Thus, the claim is subject-matter ineligible.
Regarding Claim 12:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 12 recites the system of Claim 11. Claim 11 is a system, thus a machine, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 12 further recites the system comprising:
generating a plurality of feature vectors by modifying the plurality of structural feature matrices via a plurality of statistical functions (a mathematical relationship between variables and/or numbers using a mathematical formula/equations)
generating a concatenated plurality of feature vectors to generate the fixed-dimension meta-graph feature vector by concatenating the plurality of feature vectors (a mathematical relationship between variables and/or numbers using a mathematical formula/equations)
Claim 12 thus recites an abstract idea (that falls into the “mathematical concepts” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because there are no new additional elements recited.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because there are no new additional elements recited. The judicial exception alone does not provide significantly more than the abstract idea itself. Thus, the claim is subject-matter ineligible.
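For illustration only, and not as a characterization of Applicant’s actual implementation, applying a set of statistical functions to the columns of a structural feature matrix and concatenating the results yields a vector whose dimension is fixed regardless of graph size. A minimal sketch over a hypothetical feature matrix:

```python
# Hypothetical structural feature matrix: one row per node, one column per
# local structural feature (e.g., degree, triangle count); values invented.
matrix = [
    [2.0, 1.0],
    [2.0, 1.0],
    [3.0, 1.0],
    [1.0, 0.0],
]

def column_stats(matrix, funcs):
    # Apply each statistical function to every column and concatenate the
    # results; the output length is len(funcs) * num_columns, independent
    # of how many rows (nodes) the graph representation contains.
    cols = list(zip(*matrix))
    return [f(col) for f in funcs for col in cols]

funcs = [
    lambda c: sum(c) / len(c),  # mean
    min,
    max,
]
vector = column_stats(matrix, funcs)
```

Here a 4-node graph and a 400-node graph would both produce a length-6 vector, which is the fixed-dimension property the limitation recites.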
Regarding Claim 13:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 13 recites the system of Claim 12. Claim 12 is a system, thus a machine, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 13 further recites the system comprising appending one or more scalar statistical metrics determined from the graph representation to the concatenated plurality of feature vectors in the fixed-dimension meta-graph feature vector (a mathematical relationship between variables and/or numbers using a mathematical formula/equations). Claim 13 thus recites an abstract idea (that falls into the “mathematical concepts” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because there are no new additional elements recited.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because there are no new additional elements recited. The judicial exception alone does not provide significantly more than the abstract idea itself. Thus, the claim is subject-matter ineligible.
Regarding Claim 14:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 14 recites the system of Claim 9. Claim 9 is a system, thus a machine, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 14 further recites the system comprising:
generating … a first estimated performance metric according to the meta-graph features (a human being can make a judgement for generating an estimated performance metric according to observed meta-graph features)
generating … a second estimated performance metric according to the meta-graph features (a human being can make a judgement for generating an estimated performance metric according to observed meta-graph features)
Claim 14 thus recites an abstract idea (that falls into the “mental processes” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because the additional elements consist of:
… for a first machine-learning model of the plurality of machine-learning models … (which is restricting the abstract idea to a Particular Technological Environment, by MPEP 2106.05(h))
… for a second machine-learning model of the plurality of machine-learning models … (which is restricting the abstract idea to a Particular Technological Environment, by MPEP 2106.05(h))
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because the additional elements, alone or in combination, do not provide significantly more than the abstract idea itself. Additional elements a and b merely restrict the abstract idea to a Particular Technological Environment (MPEP 2106.05(h)), which cannot provide significantly more. Thus, the claim is subject-matter ineligible.
Regarding Claim 15:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 15 recites the system of Claim 14. Claim 14 is a system, thus a machine, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 15 further recites the system comprising selecting the first machine-learning model in response to determining that the first estimated performance metric is higher than the second estimated performance metric (a human being can mentally apply evaluation to select a machine-learning model based on a determination of a performance metric being higher than another performance metric). Claim 15 thus recites an abstract idea (that falls into the “mental processes” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because there are no new additional elements recited.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because there are no new additional elements recited. The judicial exception alone does not provide significantly more than the abstract idea itself. Thus, the claim is subject-matter ineligible.
Regarding Claim 16:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 16 recites the system of Claim 9. Claim 9 is a system, thus a machine, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 16 further recites the system comprising:
extracting … a plurality of sets of meta-graph features for training graph representations in the graph dataset (a human being can mentally apply evaluation to extract features from a meta-graph for training)
generating, for the plurality of machine-learning models, a plurality of sets of ground-truth graph learning performance metrics according to the plurality of sets of meta-graph features (a human being can make a judgement for generating estimated metrics for machine-learning models based on specific metrics)
learning the parameters of the graph feature machine-learning model by determining mappings between the plurality of sets of meta-graph features and the plurality of sets of ground-truth graph learning performance metrics (a human being can apply mental evaluation to learn parameters by making a determination based on mappings between features and metrics)
Claim 16 thus recites an abstract idea (that falls into the “mental processes” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because there are no new additional elements recited.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because there are no new additional elements recited. The judicial exception alone does not provide significantly more than the abstract idea itself. Thus, the claim is subject-matter ineligible.
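For context only, the type of feature-to-metric mapping recited in Claim 16 can be sketched as a simple least-squares fit; all names and numbers below are hypothetical and do not represent the applicant's or Jiang's implementation:

```python
# Hypothetical illustration of the Claim 16 mapping step: learn, for each
# machine-learning model, a regression-through-origin weight that maps a
# scalar meta-graph feature to that model's ground-truth performance metric.
features = [0.2, 0.4, 0.6, 0.8]             # meta-graph feature per training graph
metrics = {                                  # ground-truth metric per model
    "model_a": [0.11, 0.19, 0.31, 0.42],
    "model_b": [0.25, 0.49, 0.74, 1.01],
}

def learn_mapping(x, y):
    # Least-squares weight for y ~ w * x (regression through the origin).
    return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

mappings = {m: learn_mapping(features, y) for m, y in metrics.items()}
estimated = {m: w * 0.5 for m, w in mappings.items()}  # estimates for a new graph
```

The point of the sketch is that each step (extracting a feature, fitting a weight, producing an estimate) is an evaluation that could also be carried out with pen and paper, consistent with the mental-process characterization above.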
Regarding Claim 17:
Subject Matter Eligibility Analysis Step 1:
Claim 17 recites a non-transitory computer-readable medium, thus a manufacture, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 17 further recites the non-transitory computer-readable medium comprising:
extracting … meta-graph features comprising local structural characteristics and global structural characteristics from a graph representation of a dataset in a latent space, the graph representation comprising a plurality of nodes and a plurality of edges indicating relationships between the plurality of nodes (a human being can mentally apply evaluation to extract features from a meta-graph representing structural characteristics)
generating … a plurality of estimated graph learning performance metrics for a plurality of machine-learning models according to the meta-graph features of the graph representation and learned mappings between the meta-graph features and graph learning performance metrics of the plurality of machine-learning models (a human being can make a judgement for generating estimated metrics for machine-learning models based on features and mappings between specific constraints)
selecting a first type of machine-learning model from the plurality of machine-learning models to process the dataset associated with the graph representation according to the plurality of estimated graph learning performance metrics … (a human being can mentally apply evaluation to select a type of machine-learning model to process a specific dataset)
Claim 17 thus recites an abstract idea (that falls into the “mental processes” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because the additional elements consist of:
storing executable instructions, which when executed by a processing device, cause the processing device to perform operations comprising: (to perform a mental process and the performance of an abstract idea on a computer is no more than instructions to “apply it” on a computer, by MPEP 2106.05(f))
… utilizing a graph feature machine-learning model … (to perform a mental process and the performance of an abstract idea on a computer is no more than instructions to “apply it” on a computer, by MPEP 2106.05(f))
wherein the plurality of estimated graph learning performance metrics indicate predicted performances of the plurality of machine-learning models in a graph learning task for the graph representation (which is restricting the abstract idea to a Particular Technological Environment, by MPEP 2106.05(h))
… the plurality of machine-learning models comprising a plurality of different types of machine-learning models associated with graph learning tasks (which is restricting the abstract idea to a Particular Technological Environment, by MPEP 2106.05(h))
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because the additional elements, alone or in combination, do not provide significantly more than the abstract idea itself. Additional elements a and b are merely applying the abstract idea on a computer (MPEP 2106.05(f)), which cannot provide significantly more. Additional elements c and d are only restricting the abstract idea to a Particular Technological Environment (MPEP 2106.05(h)), which cannot provide significantly more. Thus, the claim is subject-matter ineligible.
Regarding Claim 18:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 18 recites the non-transitory computer-readable medium of Claim 17. Claim 17 is a non-transitory computer-readable medium, thus a manufacture, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 18 further recites the non-transitory computer-readable medium comprising:
generating … a plurality of structural feature matrices comprising local structural characteristics of the graph representation (a mathematical relationship between variables and/or numbers using a mathematical formula/equations)
generating … a fixed-dimension meta-graph feature vector by modifying the plurality of structural feature matrices according to a set of global statistical characteristics associated with the graph representation (a mathematical relationship between variables and/or numbers using a mathematical formula/equations)
Claim 18 thus recites an abstract idea (that falls into the “mathematical concepts” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because there are no new additional elements recited.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because there are no new additional elements recited. The judicial exception alone does not provide significantly more than the abstract idea itself. Thus, the claim is subject-matter ineligible.
Regarding Claim 19:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 19 recites the non-transitory computer-readable medium of Claim 17. Claim 17 is a non-transitory computer-readable medium, thus a manufacture, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 19 further recites the non-transitory computer-readable medium comprising generating the plurality of estimated graph learning performance metrics based on learned mappings between meta-graph features of training graph representations of a graph dataset and graph learning performance metrics of the plurality of machine-learning models corresponding to the training graph representations (a human being can make a judgement for generating estimated metrics for machine-learning models based on mappings between features and metrics with specific constraints). Claim 19 thus recites an abstract idea (that falls into the “mental processes” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because there are no new additional elements recited.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because there are no new additional elements recited. The judicial exception alone does not provide significantly more than the abstract idea itself. Thus, the claim is subject-matter ineligible.
Regarding Claim 20:
Subject Matter Eligibility Analysis Step 1:
Dependent Claim 20 recites the non-transitory computer-readable medium of Claim 17. Claim 17 is a non-transitory computer-readable medium, thus a manufacture, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
However, Claim 20 further recites the non-transitory computer-readable medium comprising selecting the first type of machine-learning model of the plurality of machine-learning models corresponding to a highest estimated performance metric of the plurality of estimated graph learning performance metrics (a human being can mentally apply evaluation to select a specific type of machine-learning model associated with the highest performance metric). Claim 20 thus recites an abstract idea (that falls into the “mental processes” group of abstract ideas).
Subject Matter Eligibility Analysis Step 2A Prong 2:
This judicial exception is not integrated into a practical application because there are no new additional elements recited.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception because there are no new additional elements recited. The judicial exception alone does not provide significantly more than the abstract idea itself. Thus, the claim is subject-matter ineligible.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Jiang et al., “A novel meta-graph-based attention model for event recommendation”.
Regarding Claim 1:
A method comprising: extracting, utilizing a graph feature machine-learning model, meta-graph features representing structural characteristics from a graph representation of a dataset, the graph representation comprising a plurality of nodes and a plurality of edges indicating relationships between the plurality of nodes;
(Jiang, Page 14663, Fig. 1; Page 14665, Fig. 2; Page 14666, Column 2, Paragraph 1, “… MGAR is mainly composed of three parts, as illustrated in Fig. 2. First, we construct multiple meta-graphs according to the EBSNs graph and compute multiple commuting matrices based on meta-graphs to express the user preferences information …”; Page 14667, Column 1, Paragraph 2, “… we adopt Latent Dirichlet Allocation (LDA) [44] model to extract event topics, which are applied as a new object type T of the EBSNs graph”. Fig. 2 shows utilizing the Meta-Graph-based Attention Recommendation model (interpreted by the examiner as a graph feature machine-learning model) for the recommendation method; the ‘Data Preparation’ section shows the creation of ‘Commuting matrices’ based on ‘Meta-graphs’, which represent the structural characteristics of a graph representation as they express the user preferences information. The meta-graph features are extracted as noted within Fig. 2, as the model event topics are extracted via the LDA model for meta-graph features. Thus, extracting meta-graph features representing the meta-graph structure, which contains a plurality of nodes and edges (shown in Figs. 1(b) and 1(c))).
generating, utilizing the graph feature machine-learning model, a plurality of estimated graph learning performance metrics for a plurality of machine-learning models according to the meta-graph features of the graph representation, wherein the plurality of estimated graph learning performance metrics indicate predicted performances of the plurality of machine-learning models in a graph learning task on the graph representation; and
(Jiang, Page 14665, Fig. 2; Page 14670, Algorithm 1; Page 14666, Column 2, Paragraph 1, “… Then, we propose a CNN-based embedding approach to learn the user/event embeddings taking commuting matrices as input and an attention mechanism to incorporate the learnt embeddings. Finally, we employ a factorization machine framework to predict ratings by considering the concatenated user and event embeddings as input”. The ‘Meta-graph Embedding Generation’ section of Fig. 2 shows the utilization of a plurality of CNN machine-learning models, each outputting a fused latent vector (F), which are used within the ‘Rating Prediction’ section. ^ruiej within Fig. 2 is the generation of a plurality of estimated graph learning performance metrics, as the predicted ratings are for a set of users ui and a set of events ej, where each predicted rating is the likelihood for a user to be interested in an event (thus, interpreted by the examiner as predicted performances). Each rating indicates the predicted performances for each graph learning task shown within the Fig. 2 ‘Meta-graph Embedding Generation’ section, where the CNN-based embedding approach is used to learn the embeddings, the learnt embeddings are then incorporated, and finally a Factorization Machine framework is used to predict performances based on the fused latent vectors from the plurality of machine-learning models).
selecting a first type of machine-learning model from the plurality of machine-learning models to process the dataset associated with the graph representation according to the plurality of estimated graph learning performance metrics, the plurality of machine-learning models comprising a plurality of different types of machine-learning models associated with graph learning tasks.
(Jiang, Page 14665, Fig. 2; Page 14676, Column 2, Paragraph 1, “… MGAR can select the appropriate meta-graph which contributes the most for each user through a feature fusion mechanism”; Page 14667, Column 1, Paragraph 1, “In our MGAR model, we employ an advanced CNN-based approach and attention mechanism to learn embeddings from multi-meta-graphs”. Fig. 2 shows the MGAR model, which selects a first type of machine-learning model to process data as it selects the meta-graph that is the most influential/contributes the most (thus, interpreted by the examiner as selecting according to the plurality of estimated graph learning performance metrics, as the model selects the highest rating prediction event for each user). The first type is interpreted by the examiner as the one that offers the most contribution out of the different types of machine-learning models; each machine-learning model is different based on feature fusion and different meta-graphs (which are different graph learning tasks) for different prediction events).
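For context, the selecting limitation as mapped above amounts to choosing the model type with the highest estimated performance metric. A minimal sketch (examiner's illustration only; the model-type names and metric values are hypothetical and do not represent the applicant's or Jiang's implementation):

```python
# Hypothetical sketch of the claimed selecting step: given estimated graph
# learning performance metrics for several model types, select the type
# with the highest estimate.
estimated_metrics = {"type_a": 0.71, "type_b": 0.83, "type_c": 0.64}
selected_type = max(estimated_metrics, key=estimated_metrics.get)
```

Such a selection is a comparison and choice that could equally be made mentally, which is the basis of the mental-process characterization above.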
Regarding Claim 2:
Jiang teaches the method of Claim 1:
wherein extracting the meta-graph features comprises:
extracting local structural characteristics of the plurality of nodes and the plurality of edges; and
(Jiang, Page 14663, Fig 1(c); Page 14665, Fig. 2: Commuting Matrix. Fig. 2 shows the generation of the Commuting Matrix which extracts the local structural characteristics of the nodes, edges and even path count of the meta-graphs (Fig 1(c))).
extracting global structural characteristics of the plurality of nodes.
(Jiang, Page 14665, Fig. 2: Commuting Matrix -> [Fui and Fej]. To extract the global structural characteristics, Fig. 2 shows the ‘Meta-graph Embedding Generation’, which creates the fused latent vectors Fui and Fej; thus, extracting global structural characteristics of the plurality of nodes, as the fused latent vectors contain the information of the Commuting Matrices across all meta-graphs while applying the attention mechanism to reflect the global view of how the different models/structures/graphs affect node performance and patterns).
Regarding Claim 3:
Jiang teaches the method of Claim 2:
wherein extracting the meta-graph features comprises:
generating a feature matrix comprising a plurality of rows corresponding to the plurality of nodes and the plurality of edges of the graph representation according to the local structural characteristics; and
(Jiang, Page 14663, Fig 1(c); Page 14665, Fig. 2: Commuting Matrix; Page 14667, Column 2, Paragraph 2, “… PathCount [11] can be used to describe the preference of source node for target node in a metapath … commuting matrix CP for meta-path P is calculated as the latent interaction frequency between objects from A1 and Al along with the semantic of meta-path P”. The Commuting Matrix has rows ui and columns ej; u = user and e = event which are the nodes within the meta-graphs (ex: Fig. 1(c)); thus, the Commuting Matrix is a generated feature matrix with rows corresponding to the plurality of nodes and the plurality of edges of the meta-graph representation (which are based on the local structural characteristics)).
generating a meta-graph feature vector comprising a fixed-dimension for the graph representation based on the feature matrix utilizing the global structural characteristics.
(Jiang, Page 14665, Fig. 2: Commuting Matrix -> Fui and Fej; Page 14671, Equations 14 & 15 [reproduced as images media_image1.png and media_image2.png]. d within Equation 14 denotes the fixed dimension of the input vector; Equation 15 is used to get the rating predictor, where xui,ej is the xth element of the concatenated vector (of the two fused latent vectors which describe the global structural characteristics) of the user/event interaction embedding; thus, generating a meta-graph feature vector comprising a fixed dimension for the graph representation based on the feature matrix utilizing the global structural characteristics).
Regarding Claim 4:
Jiang teaches the method of Claim 2:
wherein extracting the local structural characteristics comprises generating one or more latent feature vectors representing a node degree …
(Jiang, Page 14667, Column 2, Paragraph 2, “The latent interactions between users and events … We tend to leverage the frequency of latent interactions to encode users and events. PathCount [11] can be used to describe the preference of source node for target node in a metapath … commuting matrix CP for meta-path P is calculated as the latent interaction frequency between objects from A1 and Al along with the semantic of meta-path P”. The examiner is interpreting node degree as simply the number of nodes connected to the source node; thus, PathCount teaches the node degree of each and every node, as the commuting matrix CP for meta-path P is calculated as the latent interaction frequency between objects from A1 and Al along meta-path P).
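Under the examiner's interpretation above, node degree is the count of nodes connected to a source node, which for an adjacency-matrix representation is a simple row sum. An illustrative sketch (the matrix below is hypothetical, not taken from Jiang):

```python
# Hypothetical sketch of the examiner's interpretation of node degree:
# the number of nodes connected to a source node, computed here as a
# row sum over an illustrative undirected adjacency matrix.
adjacency = [
    [0, 1, 1],
    [1, 0, 0],
    [1, 0, 0],
]
degrees = [sum(row) for row in adjacency]  # degree of each node
```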
Regarding Claim 5:
Jiang teaches the method of Claim 2:
wherein extracting the global structural characteristics comprises generating one or more latent feature vectors representing an importance score of each node of the plurality of nodes …
(Jiang, Page 14665, Fig. 2: Commuting Matrix -> Fui and Fej; Page 14670, Column 2, Equations 12 & 13 [reproduced as images media_image3.png and media_image4.png]. Fui and Fej are the fused latent vectors, generated based on the Commuting Matrix, which represent an importance score of the features, as each fused latent vector represents the weighted user/event for each node across the meta-graphs; thus, the generated fused latent feature vectors represent the importance-weighted structures within all meta-graphs).
Regarding Claim 6:
Jiang teaches the method of Claim 1:
wherein generating the plurality of estimated graph learning performance metrics comprises
determining, utilizing the graph feature machine-learning model, based on learned mappings between meta-graph features and model graph learning performance metrics of the plurality of machine-learning models.
(Jiang, Page 14665, Fig. 2: Commuting Matrix -> Fui and Fej. ^ruiej within Fig. 2 is the graph learning performance metrics, which are determined based on the commuting matrices that serve as the mappings between the meta-graph features and the model graph learning performance metrics (shown within the ‘Meta-graph Embedding Generation’ section, where the CNNs are used to help determine, based on the learned mappings between the features and the different model structures, the embeddings that are finally analyzed as performance metrics within the Rating Prediction section)).
Regarding Claim 7:
Jiang teaches the method of Claim 1:
wherein generating the plurality of estimated graph learning performance metrics comprises:
generating a meta-graph comprising a plurality of graph nodes corresponding to graph features for a graph dataset and a plurality of model nodes corresponding to model factors for the plurality of machine-learning models; and
(Jiang, Page 14665, Fig. 2. Fig. 2 shows the generation of multiple meta-graph embeddings which represent different meta-graph structures corresponding to the graph features for a graph dataset (as the Commuting Matrix is generated from the input EBSN/meta-graphs)).
generating the plurality of estimated graph learning performance metrics based on the meta-graph features and relationships between the plurality of graph nodes and the plurality of model nodes in the meta-graph.
(Jiang, Page 14665, Fig. 2; Page 14676, Column 2, Paragraph 1, “… MGAR can select the appropriate meta-graph which contributes the most for each user through a feature fusion mechanism”; Page 14667, Column 1, Paragraph 1, “In our MGAR model, we employ an advanced CNN-based approach and attention mechanism to learn embeddings from multi-meta-graphs”. Fig. 2 shows the MGAR model, which selects the most influential machine-learning model to process data as it selects the meta-graph (Fig. 2 shows the plurality of metrics being generated via embeddings); thus, implying the outputs that are generated from the CNN-based approach and rated within the Rating Predictor are based on the meta-graph features and relationships between the model structure nodes and meta-graph nodes).
Regarding Claim 8:
Jiang teaches the method of Claim 7:
wherein generating the meta-graph comprises
determining the graph features for the graph dataset and the model factors for the plurality of machine-learning models by factorizing a performance matrix comprising model graph learning performance metrics of the plurality of machine-learning models according to the graph dataset.
(Jiang, Page 14671, Column 1, Paragraph 2, “Our rating predictor is built based … Factorization Machine (FM) model [24], which can effectively model the second-order feature interactions. In FM, the rating of user ui on item ej is defined as follows [Equation 14, reproduced as image media_image5.png]”; Page 14676, Column 2, Paragraph 1, “… MGAR can select the appropriate meta-graph which contributes the most for each user through a feature fusion mechanism”. The factorizing by the FM model of Jiang teaches determining the highest-rated value for prediction; thus, Equation 14 shows determining the graph features (already encoded within the embeddings) for the most relevant (performance-metric-based) event for each user).
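For reference, Rendle's second-order Factorization Machine [24], which Jiang's Equation 14 appears to instantiate, is conventionally written as follows (standard FM notation, not necessarily Jiang's exact symbols):

```latex
\hat{r}(\mathbf{x}) = w_0 + \sum_{i=1}^{d} w_i x_i
  + \sum_{i=1}^{d} \sum_{j=i+1}^{d} \langle \mathbf{v}_i, \mathbf{v}_j \rangle \, x_i x_j
```

where x is the input vector of dimension d (in Jiang, the concatenated user/event interaction embedding), w0 is the global bias, wi are the per-feature weights, and the inner products ⟨vi, vj⟩ model the second-order feature interactions.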
Regarding Claim 9:
A system comprising: a memory component; and a processing device coupled to the memory component, the processing device to perform operations comprising:
(Jiang, Page 14673, Column 1, Paragraph 1, “We implement the proposed MGAR model based on Pytorch … experiments run in a Linux server with one GPU (NVIDIA Geforce RTX-2080 Ti) and 20 CPU cores (Intel(R) Core(TM) i9-9820X*20)”. The MGAR model is implied to be implemented on a computing system, as the model is based on Pytorch, which implies a processor, memory component, and non-transitory computer program, as these are inherent within a system that utilizes source code for processing attention-based recommendation).
extracting, utilizing a graph feature machine-learning model comprising parameters learned based on a graph dataset and corresponding model graph learning performances for a plurality of machine-learning models, meta-graph features comprising structural characteristics from a graph representation of a dataset in a latent space, the graph representation comprising a plurality of nodes and a plurality of edges indicating relationships between the plurality of nodes;
(Jiang, Page 14663, Fig. 1; Page 14665, Fig. 2; Page 14666, Column 2, Paragraph 1, “… MGAR is mainly composed of three parts, as illustrated in Fig. 2. First, we construct multiple meta-graphs according to the EBSNs graph and compute multiple commuting matrices based on meta-graphs to express the user preferences information …”; Page 14671, Equation 14 [reproduced as image media_image1.png]; Page 14667, Column 1, Paragraph 2, “… we adopt Latent Dirichlet Allocation (LDA) [44] model to extract event topics, which are applied as a new object type T of the EBSNs graph”. Fig. 2 shows utilizing the Meta-Graph-based Attention Recommendation model (interpreted by the examiner as a graph feature machine-learning model) for the recommendation method, which uses parameters learned based on the graph dataset (which can be seen within Equation 14); the ‘Data Preparation’ section shows the creation of ‘Commuting matrices’ based on ‘Meta-graphs’, which represent the structural characteristics of a graph representation as they express the user preferences information. The meta-graph features are extracted as noted within Fig. 2, as the model event topics are extracted via the LDA model for meta-graph features. Thus, extracting meta-graph features representing the meta-graph structure, which contains a plurality of nodes and edges (shown in Figs. 1(b) and 1(c))).
generating, utilizing the graph feature machine-learning model, a plurality of estimated graph learning performance metrics for the plurality of machine-learning models according to the meta-graph features of the graph representation, wherein the plurality of estimated graph learning performance metrics indicate predicted performances of the plurality of machine-learning models in executing a graph learning task on the graph representation; and
(Jiang, Page 14665, Fig. 2; Page 14670, Algorithm 1; Page 14666, Column 2, Paragraph 1, “… Then, we propose a CNN-based embedding approach to learn the user/event embeddings taking commuting matrices as input and an attention mechanism to incorporate the learnt embeddings. Finally, we employ a factorization machine framework to predict ratings by considering the concatenated user and event embeddings as input”. The ‘Meta-graph Embedding Generation’ section of Fig. 2 shows the utilization of a plurality of CNN machine-learning models, each outputting a fused latent vector (F), which are used within the ‘Rating Prediction’ section. ^ruiej within Fig. 2 is the generation of a plurality of estimated graph learning performance metrics, as the predicted ratings are for a set of users ui and a set of events ej, where each predicted rating is the likelihood for a user to be interested in an event (thus, interpreted by the examiner as predicted performances). Each rating indicates the predicted performances for each graph learning task shown within the Fig. 2 ‘Meta-graph Embedding Generation’ section, where the CNN-based embedding approach is used to learn the embeddings, the learnt embeddings are then incorporated, and finally a Factorization Machine framework is used to predict performances based on the fused latent vectors from the plurality of machine-learning models).
selecting a first type of machine-learning model from the plurality of machine-learning models to process the dataset associated with the graph representation according to the plurality of estimated graph learning performance metrics, the plurality of machine-learning models comprising a plurality of different types of machine-learning models associated with graph learning tasks.
(Jiang, Page 14665, Fig. 2; Page 14676, Column 2, Paragraph 1, “… MGAR can select the appropriate meta-graph which contributes the most for each user through a feature fusion mechanism”; Page 14667, Column 1, Paragraph 1, “In our MGAR model, we employ an advanced CNN-based approach and attention mechanism to learn embeddings from multi-meta-graphs.”. Fig. 2 shows the MGAR model, which selects a first type of machine-learning model to process data as it selects the meta-graph that is the most influential/contributes the most (thus, interpreted by the examiner as selecting according to the plurality of estimated graph learning performance metrics, as the model selects the highest rating prediction event for each user). The first type is interpreted by the examiner as the one that offers the most contribution out of the different types of machine-learning models; each machine-learning model is different based on feature fusion and different meta-graphs (which are different graph learning tasks) for different prediction events).
Regarding Claim 10:
Jiang teaches the system of Claim 9:
wherein extracting the meta-graph features comprises:
generating, utilizing the graph feature machine-learning model, a plurality of structural feature matrices comprising local structural characteristics of the graph representation; and
(Jiang, Page 14663, Fig 1(c); Page 14665, Fig. 2: Commuting Matrix. Fig. 2 shows the generation of the Commuting Matrix which extracts the local structural characteristics of the nodes, edges and even path count of the meta-graphs (Fig 1(c))).
extracting the meta-graph features based on the plurality of structural feature matrices.
(Jiang, Page 14665, Fig. 2: Commuting Matrix -> [Fui and Fej]. Fig. 2 shows the ‘Meta-graph Embedding Generation’, which creates the fused latent vectors Fui and Fej; thus, extracting the meta-graph features based on the structural feature matrices, as the fused latent vectors contain the information of the Commuting Matrices across all meta-graphs with the attention mechanism applied).
Regarding Claim 11:
Jiang teaches the system of Claim 10:
wherein extracting the meta-graph features comprises
generating, utilizing the graph feature machine-learning model, a fixed-dimension meta-graph feature vector by modifying the plurality of structural feature matrices according to a set of global statistical characteristics associated with the graph representation.
(Jiang, Page 14665, Fig. 2: Commuting Matrix -> Fui and Fej; Page 14671, Equations 14 & 15 [reproduced as images media_image1.png and media_image2.png]. d within Equation 14 denotes the fixed dimension of the input vector and w0 denotes a global bias; Equation 15 is used to get the rating predictor, where xui,ej is the xth element of the concatenated vector (of the two fused latent vectors which describe the global statistical characteristics) of the user/event interaction embedding; thus, generating a fixed-dimension meta-graph feature vector by modifying (via concatenation)).
Regarding Claim 12:
Jiang teaches the system of Claim 11:
wherein generating the fixed-dimension meta-graph feature vector comprises:
generating a plurality of feature vectors by modifying the plurality of structural feature matrices via a plurality of statistical functions; and
(Jiang, Page 14665, Fig. 2. Fig. 2 shows the generating of a plurality of feature vectors by modifying the structural feature matrices through a CNN-attention-based approach that statistically transforms the commuting matrix (as the examiner interprets the CNNs as applying learned convolutional filters, which are statistical functions)).
generating a concatenated plurality of feature vectors to generate the fixed-dimension meta-graph feature vector by concatenating the plurality of feature vectors.
(Jiang, Page 14671, Equation 15 [equation image omitted: media_image2.png]; Page 14666, Column 2, Paragraph 1, “… Then, we propose a CNN-based embedding approach to learn the user/event embeddings taking commuting matrices as input and an attention mechanism to incorporate the learnt embeddings. Finally, we employ a factorization machine framework to predict ratings by considering the concatenated user and event embeddings as input”. The two fused latent feature vectors are concatenated to generate the fixed-dimension meta-graph feature vector).
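For illustration only (an examiner's sketch, not Jiang's actual CNN-based method and not the claimed invention), reducing variable-size structural feature matrices to one fixed-dimension vector by applying statistical functions and concatenating the results can be expressed as follows; the function names and statistics chosen are assumptions.

```python
# Hypothetical sketch: apply a fixed set of statistical functions to each
# structural feature matrix, then concatenate the per-matrix results into
# one fixed-dimension feature vector (length = 4 * number of matrices).
import numpy as np

def fixed_dim_features(matrices):
    stats = (np.mean, np.std, np.max, np.min)  # assumed statistical functions
    parts = [np.array([f(m) for f in stats]) for m in matrices]
    return np.concatenate(parts)
```

The output length depends only on the number of matrices and statistics, not on the matrices' shapes, which is what makes the resulting vector fixed-dimension.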
Regarding Claim 13:
Jiang teaches the system of Claim 12:
wherein extracting the meta-graph features further comprises
appending one or more scalar statistical metrics determined from the graph representation to the concatenated plurality of feature vectors in the fixed-dimension meta-graph feature vector.
(Jiang, Page 14665, Fig. 2: Commuting Matrix -> Fui and Fej; Page 14671, Equations 14 & 15 [equation images omitted: media_image1.png and media_image2.png]. d within Equation 14 denotes the fixed dimension of the input vector and w0 denotes a global bias; Equation 15 is used to get the rating predictor; xui,ej denotes the xth element of the concatenated vector (of the two fused latent vectors) of the user/event interaction embedding. The adjustable weights within Equation 14 are interpreted as the scalar statistical metrics determined from the graph representation, as they are scalar values that append/adjust the concatenated feature vector).
Regarding Claim 14:
Jiang teaches the system of Claim 9:
wherein generating the plurality of estimated graph learning performance metrics comprises:
generating, for a first machine-learning model of the plurality of machine-learning models, a first estimated performance metric according to the meta-graph features; and generating, for a second machine-learning model of the plurality of machine-learning models, a second estimated performance metric according to the meta-graph features.
(Jiang, Page 14665, Fig. 2: Fig. 2 shows the generation of estimated performance metrics for multiple machine-learning models according to the commuting matrix (meta-graph features), as the Rating Predictor compares the performance metrics to find the most influential).
Regarding Claim 15:
Jiang teaches the system of Claim 14:
wherein selecting the machine-learning model to process the dataset associated with the graph representation comprises
selecting the first machine-learning model in response to determining that the first estimated performance metric is higher than the second estimated performance metric.
(Jiang, Page 14665, Fig. 2; Page 14676, Column 2, Paragraph 1, “… MGAR can select the appropriate meta-graph which contributes the most for each user through a feature fusion mechanism”; Page 14667, Column 1, Paragraph 1, “In our MGAR model, we employ an advanced CNN-based approach and attention mechanism to learn embeddings from multi-meta-graphs.”. Fig. 2 shows the MGAR model, which selects a machine-learning model to process the dataset as it selects the meta-graph that is the most influential/contributes the most (the model selects the highest rating prediction event for each user); thus, the system will select the first machine-learning model if the system determines that the first estimated performance metric is higher than the second estimated performance metric).
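For illustration only (an examiner's sketch of the selection logic discussed above, not code from Jiang or the application), selecting whichever candidate model carries the highest estimated performance metric reduces to an argmax over the metrics; the names below are hypothetical.

```python
# Hypothetical sketch: pick the model whose estimated performance
# metric is highest among the candidates.
def select_model(models, estimated_metrics):
    best = max(range(len(models)), key=lambda i: estimated_metrics[i])
    return models[best]
```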
Regarding Claim 16:
Jiang teaches the system of Claim 9:
wherein the processing device further performs operations comprising:
extracting, utilizing the graph feature machine-learning model, a plurality of sets of meta-graph features for training graph representations in the graph dataset;
(Jiang, Page 14672, Table 5; Page 14673, Column 2, Paragraph 3, “… MGAR model … learns multiple aspects of latent factors of users and events guided by meta-graphs …”. Table 5 shows meta-graph datasets and their meta-graph relations; thus, implying extracting meta-graph features for training graph representations in the graph dataset).
generating, for the plurality of machine-learning models, a plurality of sets of ground-truth graph learning performance metrics according to the plurality of sets of meta-graph features; and
(Jiang, Page 14665, Fig. 2; Page 14670, Algorithm 1; Page 14666, Column 2, Paragraph 1, “… Then, we propose a CNN-based embedding approach to learn the user/event embeddings taking commuting matrices as input and an attention mechanism to incorporate the learnt embeddings. Finally, we employ a factorization machine framework to predict ratings by considering the concatenated user and event embeddings as input”. The ‘Meta-graph Embedding Generation’ section of Fig. 2 shows the utilization of a plurality of CNN machine-learning models, each outputting a fused latent vector (F), which are used within the ‘Rating Prediction’ section. ^ruiej within Fig. 2 is the generation of a plurality of graph learning performance metrics for the training set (thus, ground truth, as the prior data is the baseline for the trained CNNs within Fig. 2), as the predicted ratings are for the set of users ui and the set of events ej, where each predicted rating is the likelihood for a user to be interested in an event (predicted performances). Each rating indicates the predicted performance for each graph learning task shown within the ‘Meta-graph Embedding Generation’ section of Fig. 2; the CNN-based embedding approach is used to learn the embeddings, an attention mechanism incorporates the learnt embeddings, and a factorization machine framework then predicts performances based on the fused latent vectors from the plurality of machine-learning models).
learning the parameters of the graph feature machine-learning model by determining mappings between the plurality of sets of meta-graph features and the plurality of sets of ground-truth graph learning performance metrics.
(Jiang, Page 14665, Fig. 2: Commuting Matrix -> Fui and Fej. ^ruiej within Fig. 2 is the graph learning performance metrics, which are determined based on the commuting matrix; the commuting matrix provides the mappings between the meta-graph features and the graph learning performance metrics (shown within the ‘Meta-graph Embedding Generation’ section). The CNNs are used to learn the parameters, as CNNs innately use convolutional filters, which are trainable parameters; the learned mappings between the features and the different model structures are used to embed and, finally, to analyze the performance metrics within the ‘Rating Prediction’ section).
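For illustration only (an examiner's simplified stand-in for the trainable parameters discussed above, not Jiang's actual CNN/attention training procedure), learning a mapping from meta-graph feature vectors to ground-truth performance metrics can be sketched as a least-squares fit, with one output column per candidate model; all names are hypothetical.

```python
# Hypothetical sketch: learn a linear mapping W from meta-graph features
# to ground-truth performance metrics, then estimate metrics for a new graph.
import numpy as np

def fit_mappings(features, metrics):
    """features: (n, d) meta-graph features for n training graphs;
    metrics: (n, m) ground-truth metrics for m candidate models.
    Returns (d, m) weights W minimizing ||features @ W - metrics||."""
    W, *_ = np.linalg.lstsq(features, metrics, rcond=None)
    return W

def estimate_metrics(W, new_features):
    return new_features @ W  # (m,) estimated metrics, one per model
```

A usage pattern consistent with the claim language would be to call fit_mappings on the training graph representations, then estimate_metrics on a new graph's features before selecting the best-scoring model.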
Regarding Claim 17:
A non-transitory computer-readable medium storing executable instructions, which when executed by a processing device, cause the processing device to perform operations comprising:
(Jiang, Page 14673, Column 1, Paragraph 1, “We implement the proposed MGAR model based on Pytorch … experiments run in a Linux server with one GPU (NVIDIA Geforce RTX-2080 Ti) and 20 CPU cores (Intel(R) Core(TM) i9-9820X*20)”. The MGAR model is implied to be implemented on a computing system, as the model is based on PyTorch, which implies a processor, a memory component, and a non-transitory computer program, as they are inherent within a system that utilizes source code for processing attention-based recommendation).
extracting, utilizing a graph feature machine-learning model, meta-graph features comprising local structural characteristics and global structural characteristics from a graph representation of a dataset in a latent space, the graph representation comprising a plurality of nodes and a plurality of edges indicating relationships between the plurality of nodes;
(Jiang, Page 14663, Fig. 1; Page 14665, Fig. 2: Commuting Matrix -> [Fui and Fej]; Page 14666, Column 2, Paragraph 1, “… MGAR is mainly composed of three parts, as illustrated in Fig. 2. First, we construct multiple meta-graphs according to the EBSNs graph and compute multiple commuting matrices based on meta-graphs to express the user preferences information …”; Page 14667, Column 1, Paragraph 2, “… we adopt Latent Dirichlet Allocation (LDA) [44] model to extract event topics, which are applied as a new object type T of the EBSNs graph”. Fig. 2 shows utilizing the Meta-Graph-based Attention Recommendation model (interpreted by the examiner as a graph feature machine-learning model) for the recommendation method; the ‘Data Preparation’ section shows the creation of commuting matrices based on meta-graphs, which represent the structural characteristics of a graph representation as they express the user preferences information; thus, extracting meta-graph features representing the meta-graph structure, which contains a plurality of nodes and edges (shown in Fig. 1(b) & (c)). Fig. 2 shows the generation of the Commuting Matrix, which extracts the local structural characteristics of the nodes, edges, and even the path count of the meta-graphs (Fig. 1(c)). The meta-graphs are extracted as noted within Fig. 2, as the event topics are extracted via the LDA model for meta-graph features. To extract the global structural characteristics, Fig. 2 shows the ‘Meta-graph Embedding Generation’, which creates the fused latent vectors Fui and Fej; thus, extracting global structural characteristics of the plurality of nodes, as the fused latent vectors contain the information of the commuting matrices across all meta-graphs by applying the attention mechanism to reflect the global view of how the different models/structures/graphs affect node performance and patterns).
generating, utilizing the graph feature machine-learning model, a plurality of estimated graph learning performance metrics for a plurality of machine-learning models according to the meta-graph features of the graph representation and learned mappings between the meta-graph features and graph learning performance metrics of the plurality of machine-learning models, wherein the plurality of estimated graph learning performance metrics indicate predicted performances of the plurality of machine-learning models in executing a graph learning task on the graph representation; and
(Jiang, Page 14665, Fig. 2; Page 14670, Algorithm 1; Page 14666, Column 2, Paragraph 1, “… Then, we propose a CNN-based embedding approach to learn the user/event embeddings taking commuting matrices as input and an attention mechanism to incorporate the learnt embeddings. Finally, we employ a factorization machine framework to predict ratings by considering the concatenated user and event embeddings as input”. The ‘Meta-graph Embedding Generation’ section of Fig. 2 shows the utilization of a plurality of CNN machine-learning models, each outputting a fused latent vector (F), which are used within the ‘Rating Prediction’ section. ^ruiej within Fig. 2 is the generation of a plurality of estimated graph learning performance metrics, as the predicted ratings are for the set of users ui and the set of events ej, where each predicted rating is the likelihood for a user to be interested in an event (thus, interpreted by the examiner as predicted performances). Each rating indicates the predicted performance for each graph learning task shown within the ‘Meta-graph Embedding Generation’ section of Fig. 2; the CNN-based embedding approach is used to learn the embeddings, an attention mechanism incorporates the learnt embeddings, and a factorization machine framework then predicts performances based on the fused latent vectors from the plurality of machine-learning models. The MGAR model selects the most influential machine-learning model to process data (Fig. 2 shows the plurality of metrics being generated via embeddings); thus, implying the outputs that are generated from the CNN-based approach and rated within the Rating Predictor are based on the mappings between the meta-graph features and the learned mappings (interpreted as the commuting matrix and the trained CNN models, respectively)).
selecting a first type of machine-learning model from the plurality of machine-learning models to process the dataset associated with the graph representation according to the plurality of estimated graph learning performance metrics, the plurality of machine-learning models comprising a plurality of different types of machine-learning models associated with graph learning tasks.
(Jiang, Page 14665, Fig. 2; Page 14676, Column 2, Paragraph 1, “… MGAR can select the appropriate meta-graph which contributes the most for each user through a feature fusion mechanism”; Page 14667, Column 1, Paragraph 1, “In our MGAR model, we employ an advanced CNN-based approach and attention mechanism to learn embeddings from multi-meta-graphs”. Fig. 2 shows the MGAR model, which selects a first type of machine-learning model to process data as it selects the meta-graph that is the most influential/contributes the most (thus, interpreted by the examiner as selecting according to the plurality of estimated graph learning performance metrics, as the model selects the highest rating prediction event for each user). The first type is interpreted by the examiner as the one that offers the most contribution out of the different types of machine-learning models, where each machine-learning model is different based on feature fusion and different meta-graphs (which are different graph learning tasks) for different prediction events).
Regarding Claim 18:
Jiang teaches the non-transitory computer-readable medium of Claim 17:
wherein extracting the meta-graph features comprises:
generating, utilizing the graph feature machine-learning model, a plurality of structural feature matrices comprising local structural characteristics of the graph representation; and
(Jiang, Page 14663, Fig 1(c); Page 14665, Fig. 2: Commuting Matrix. Fig. 2 shows the generation of the Commuting Matrix which extracts the local structural characteristics of the nodes, edges and even path count of the meta-graphs (Fig 1(c))).
generating, utilizing the graph feature machine-learning model, a fixed-dimension meta-graph feature vector by modifying the plurality of structural feature matrices according to a set of global statistical characteristics associated with the graph representation.
(Jiang, Page 14665, Fig. 2: Commuting Matrix -> Fui and Fej; Page 14671, Equations 14 & 15 [equation images omitted: media_image1.png and media_image2.png]. d within Equation 14 denotes the fixed dimension of the input vector and w0 denotes a global bias; Equation 15 is used to get the rating predictor; xui,ej denotes the xth element of the concatenated vector (of the two fused latent vectors, which describe the global statistical characteristics) of the user/event interaction embedding; thus, generating a fixed-dimension meta-graph feature vector by modifying (via concatenation)).
Regarding Claim 19:
Jiang teaches the non-transitory computer-readable medium of Claim 17:
wherein generating the plurality of estimated graph learning performance metrics comprises
generating the plurality of estimated graph learning performance metrics based on learned mappings between meta-graph features of training graph representations of a graph dataset and graph learning performance metrics of the plurality of machine-learning models corresponding to the training graph representations.
(Jiang, Page 14665, Fig. 2; Page 14676, Column 2, Paragraph 1, “… MGAR can select the appropriate meta-graph which contributes the most for each user through a feature fusion mechanism”; Page 14667, Column 1, Paragraph 1, “In our MGAR model, we employ an advanced CNN-based approach and attention mechanism to learn embeddings from multi-meta-graphs”. Fig. 2 shows the MGAR model, which selects the most influential machine-learning model to process data as it selects the meta-graph (Fig. 2 shows the plurality of metrics being generated via embeddings); thus, implying the outputs that are generated from the CNN-based approach and rated within the Rating Predictor are based on the mappings between meta-graph features and graph learning performance metrics (the commuting matrix and the trained CNN models (interpreted as training graph representations), respectively)).
Regarding Claim 20:
Jiang teaches the non-transitory computer-readable medium of Claim 17:
wherein selecting the machine-learning model to process the data associated with the graph representation comprises
selecting the first type of a machine-learning model of the plurality of machine-learning models corresponding to a highest estimated performance metric of the plurality of estimated graph learning performance metrics.
(Jiang, Page 14665, Fig. 2; Page 14676, Column 2, Paragraph 1, “… MGAR can select the appropriate meta-graph which contributes the most for each user through a feature fusion mechanism”. Fig. 2 shows the generation of estimated performance metrics for multiple machine-learning models according to the commuting matrix (meta-graph features). The Rating Predictor compares the performance metrics to find the most influential; thus, selecting the first type of machine-learning model with the highest estimated performance metric of the plurality of estimated graph learning performance metrics).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to IBRAHIM RAHMAN whose telephone number is (703)756-1646. The examiner can normally be reached M-F 8am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kakali Chaki can be reached at (571) 272-3719. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/I.R./ Examiner, Art Unit 2122
/KAKALI CHAKI/ Supervisory Patent Examiner, Art Unit 2122