DETAILED ACTION
This action is responsive to Application No. 18/091,244, filed on 12/29/2022. Claims 1-20 are pending in the case.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 01/16/2025 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Domestic Benefit
Domestic benefit of the provisional application filed on 12/31/2021 is acknowledged.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-9, 12-17, and 20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Regarding Claim 1:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites determining label space similarities between the targets as represented in the label space which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user judging/determining similarities. See MPEP 2106.04(a)(2)(III)(C).
The claim recites determining feature space similarities between the inputs as represented in the feature space which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user judging/determining similarities. See MPEP 2106.04(a)(2)(III)(C).
The claim recites determining a loss based on differences between the label space similarities and feature space similarities that correspond to each other, wherein the loss is based on differences between rankings of the label space similarities and rankings of the feature space similarities which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user determining how poor a prediction is compared to the actual result and determining how different the prediction is compared to the result. See MPEP 2106.04(a)(2)(III)(C).
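For illustration only, the three recited determinations can be sketched in code. The following hypothetical NumPy sketch (the particular similarity functions and index-based tie-breaking are assumptions for illustration, not limitations drawn from the claim) computes pairwise label-space and feature-space similarities and a loss from differences between their row-wise rankings:

```python
import numpy as np

def label_space_similarities(targets):
    # Pairwise similarity of continuous targets; negative absolute
    # distance is one assumed choice of similarity function.
    return -np.abs(targets[:, None] - targets[None, :])

def feature_space_similarities(features):
    # Pairwise cosine similarity between feature vectors (assumed choice).
    normed = features / np.linalg.norm(features, axis=1, keepdims=True)
    return normed @ normed.T

def rank_rows(sims):
    # Rank of each entry within its row (0 = most similar); ties by index.
    return np.argsort(np.argsort(-sims, axis=1), axis=1)

def ranking_loss(sims_y, sims_z):
    # Loss grows with disagreement between the rankings of the
    # label-space and feature-space similarities.
    return float(np.mean((rank_rows(sims_y) - rank_rows(sims_z)) ** 2))
```

When the two similarity structures induce identical rankings, this loss is zero; any rank disagreement increases it.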
Subject Matter Eligibility Analysis Step 2A Prong 2:
obtaining a regression dataset comprising multiple pairs (recites insignificant extra-solution activity of data gathering (see MPEP 2106.05(g)))
wherein the pairs respectively comprise inputs and corresponding targets, and wherein the inputs are represented in a feature space and the targets are represented in a label space of continuous values (merely specifies a particular technological environment in which the abstract idea is to take place, i.e., a field of use (see MPEP 2106.05(h)))
training an artificial neural network based on the loss (recites insignificant extra-solution and well-understood, routine, and conventional activity of training based on a loss (see MPEP 2106.05(g)))
Subject Matter Eligibility Analysis Step 2B:
Additional element (a), obtaining a network input, is well-understood, routine, and conventional activity of “transmitting or receiving data over a network” (see MPEP 2106.05(d)(II)(i), using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362).
Additional element (b) does not integrate the abstract idea into a practical application, nor does the additional limitation provide significantly more than the abstract idea, because the limitation amounts to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP § 2106.05(f).
Additional element (c) recites the well-understood and conventional practice of training a neural network based on a loss, as quoted from Minimizing the Maximal Loss: How and Why (Abstract, “A commonly used learning rule is to approximately minimize the average loss over the training set. Other learning algorithms, such as AdaBoost and hard-SVM, aim at minimizing the maximal loss over the training set. The average loss is more popular, particularly in deep learning, due to three main reasons. First, it can be conveniently minimized using online algorithms”).
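As context for the quoted passage, minimizing the average loss with an online (per-example) algorithm can be sketched as follows; this is a hypothetical minimal stochastic-gradient example for a one-parameter linear model, not taken from the cited reference:

```python
def sgd_average_loss(xs, ys, lr=0.1, epochs=200):
    # Per-example gradient steps that approximately minimize the average
    # squared loss (1/n) * sum_i (w * x_i - y_i)^2 over the training set.
    w = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            w -= lr * 2.0 * (w * x - y) * x  # gradient of (w*x - y)^2
    return w
```

On data generated by y = 2x, the parameter converges toward 2 without ever computing the full average loss, which is the convenience the quoted abstract refers to.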
The additional elements (a), (b), and (c) in the claim, when considered separately and in combination, do not amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.
Regarding Claim 2:
The rejection of claim 1 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites obtained by applying a first similarity function in the label space across the targets which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user solving a mathematical formula. See MPEP 2106.04(a)(2)(III)(C) and Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C)).
The claim recites obtained by applying a second similarity function in the feature space across the inputs which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user solving a mathematical formula. See MPEP 2106.04(a)(2)(III)(C) and Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C)).
Subject Matter Eligibility Analysis Step 2A Prong 2:
wherein the label space similarities are represented as a first pairwise similarity matrix (merely specifies a particular technological environment in which the abstract idea is to take place, i.e., a field of use (see MPEP 2106.05(h)))
wherein the feature space similarities are represented as a second pairwise similarity matrix (merely specifies a particular technological environment in which the abstract idea is to take place, i.e., a field of use (see MPEP 2106.05(h)))
Subject Matter Eligibility Analysis Step 2B:
Additional elements (a) and (b) do not integrate the abstract idea into a practical application, nor do the additional limitations provide significantly more than the abstract idea, because the limitations merely specify a field of use in which the abstract idea is to take place (see MPEP 2106.05(h)).
The additional elements (a) and (b) in the claim, when considered separately and in combination, do not amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.
Regarding Claim 3:
The rejection of claim 2 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites wherein the first and second similarity functions differ, which is an abstract idea (Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A))).
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not contain elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element, considered separately or in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.
Regarding Claim 4:
The rejection of claim 3 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites that the first similarity function comprises a negative absolute distance, which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))).
The claim recites that the second similarity function comprises a cosine similarity, which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))).
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not contain elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element, considered separately or in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.
Regarding Claim 5:
The rejection of claim 1 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites wherein the loss based on the differences between the label space similarities and feature space similarities is determined as L, wherein L comprises
L = Σi ℓ(rk(Sy[i,:]), rk(Sz[i,:]))
wherein Sy denotes the first pairwise similarity matrix, Sz denotes the second pairwise similarity matrix, [i,:] denotes an ith row of the matrices, rk denotes a ranking function, and ℓ penalizes differences between the pairwise similarity matrices, which is an abstract idea (Mathematical Formulas or Equations (see MPEP 2106.04(a)(2)(I)(B))).
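For illustration, the loss L as characterized above (a row-wise comparison of rankings, with ℓ penalizing differences between them) might be sketched as follows; taking ℓ to be mean squared error and breaking rank ties by index are assumptions made for this sketch:

```python
import numpy as np

def rk(v):
    # rk: ranking function; rank of each entry of v (0 = largest).
    return np.argsort(np.argsort(-v))

def loss_L(S_y, S_z):
    # L = sum over rows i of ell(rk(S_y[i, :]), rk(S_z[i, :])),
    # with ell taken here as mean squared error between rank vectors.
    return float(sum(np.mean((rk(S_y[i]) - rk(S_z[i])) ** 2)
                     for i in range(S_y.shape[0])))
```

Identical similarity matrices yield L = 0, and L grows as the per-row rankings of the two matrices diverge.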
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not contain elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element, considered separately or in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.
Regarding Claim 6:
The rejection of claim 5 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites that ℓ determines mean squared error between rk(Sy[i,:]) and rk(Sz[i,:]), which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))).
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not contain elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element, considered separately or in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.
Regarding Claim 7:
The rejection of claim 5 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites wherein training the artificial neural network comprises determining

[equation image: media_image4.png]

which is an abstract idea (Mathematical Formulas or Equations (see MPEP 2106.04(a)(2)(I)(B))).
The claim recites wherein

[equation image: media_image5.png]

wherein λ denotes interpolation strength and a denotes [symbol image: media_image6.png] or [symbol image: media_image6.png], which is an abstract idea (Mathematical Formulas or Equations (see MPEP 2106.04(a)(2)(I)(B))).
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not contain elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element, considered separately or in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.
Regarding Claim 8:
The rejection of claim 1 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim does not contain elements that would warrant a Step 2A Prong 1 analysis.
Subject Matter Eligibility Analysis Step 2A Prong 2:
wherein the artificial neural network is trained based on minimizing the loss (recites insignificant extra-solution and well-understood, routine, and conventional activity of training based on a loss (see MPEP 2106.05(g)))
Subject Matter Eligibility Analysis Step 2B:
Additional element (a) recites the well-understood and conventional practice of training a neural network based on a loss, as quoted from Minimizing the Maximal Loss: How and Why (Abstract, “A commonly used learning rule is to approximately minimize the average loss over the training set. Other learning algorithms, such as AdaBoost and hard-SVM, aim at minimizing the maximal loss over the training set. The average loss is more popular, particularly in deep learning, due to three main reasons. First, it can be conveniently minimized using online algorithms”).
The additional element (a) in the claim, considered separately and in combination, does not amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.
Regarding Claim 9:
The rejection of claim 1 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim does not contain elements that would warrant a Step 2A Prong 1 analysis.
Subject Matter Eligibility Analysis Step 2A Prong 2:
wherein the regression dataset is imbalanced (merely specifies a particular technological environment in which the abstract idea is to take place, i.e., a field of use (see MPEP 2106.05(h)))
Subject Matter Eligibility Analysis Step 2B:
Additional element (a) does not integrate the abstract idea into a practical application, nor does the additional limitation provide significantly more than the abstract idea, because the limitation merely specifies a field of use in which the abstract idea is to take place (see MPEP 2106.05(h)).
The additional element (a) in the claim, considered separately and in combination, does not amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.
Regarding Claim 12:
The rejection of claim 1 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites …determine a label corresponding to the label space based on the data point which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user making a judgment/determination. See MPEP 2106.04(a)(2)(III)(C).
Subject Matter Eligibility Analysis Step 2A Prong 2:
obtaining a data point of a type corresponding to the feature space (recites insignificant extra-solution activity of data gathering (see MPEP 2106.05(g)))
applying the artificial neural network to… (merely recites a generic computer on which to perform the abstract idea, e.g., "apply it on a computer" (see MPEP 2106.05(f)))
Subject Matter Eligibility Analysis Step 2B:
Additional element (a), obtaining a network input, is well-understood, routine, and conventional activity of “transmitting or receiving data over a network” (see MPEP 2106.05(d)(II)(i), using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362).
Additional element (b) does not integrate the abstract idea into a practical application, nor does the additional limitation provide significantly more than the abstract idea, because the limitation amounts to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP § 2106.05(f).
The additional elements (a) and (b) in the claim, when considered separately and in combination, do not amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.
Regarding Claim 13:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites determining label space similarities between the targets as represented in the label space which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user judging/determining similarities. See MPEP 2106.04(a)(2)(III)(C).
The claim recites determining feature space similarities between the inputs as represented in the feature space which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user judging/determining similarities. See MPEP 2106.04(a)(2)(III)(C).
The claim recites determining a loss based on differences between the label space similarities and feature space similarities that correspond to each other, wherein the loss is based on differences between rankings of the label space similarities and rankings of the feature space similarities which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user determining how poor a prediction is compared to the actual result and determining how different the prediction is compared to the result. See MPEP 2106.04(a)(2)(III)(C).
Subject Matter Eligibility Analysis Step 2A Prong 2:
obtaining a regression dataset comprising multiple pairs (recites insignificant extra-solution activity of data gathering (see MPEP 2106.05(g)))
wherein the pairs respectively comprise inputs and corresponding targets, and wherein the inputs are represented in a feature space and the targets are represented in a label space of continuous values (merely specifies a particular technological environment in which the abstract idea is to take place, i.e., a field of use (see MPEP 2106.05(h)))
training an artificial neural network based on the loss (recites insignificant extra-solution and well-understood, routine, and conventional activity of training based on a loss (see MPEP 2106.05(g)))
Subject Matter Eligibility Analysis Step 2B:
Additional element (a), obtaining a network input, is well-understood, routine, and conventional activity of “transmitting or receiving data over a network” (see MPEP 2106.05(d)(II)(i), using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362).
Additional element (b) does not integrate the abstract idea into a practical application, nor does the additional limitation provide significantly more than the abstract idea, because the limitation amounts to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP § 2106.05(f).
Additional element (c) recites the well-understood and conventional practice of training a neural network based on a loss, as quoted from Minimizing the Maximal Loss: How and Why (Abstract, “A commonly used learning rule is to approximately minimize the average loss over the training set. Other learning algorithms, such as AdaBoost and hard-SVM, aim at minimizing the maximal loss over the training set. The average loss is more popular, particularly in deep learning, due to three main reasons. First, it can be conveniently minimized using online algorithms”).
The additional elements (a), (b), and (c) in the claim, when considered separately and in combination, do not amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.
Regarding Claim 14:
The rejection of claim 13 is incorporated, and the claim is further rejected under the same rationale as set forth in the rejection of claim 2.
Regarding Claim 15:
The rejection of claim 2 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites wherein the first and second similarity functions differ, which is an abstract idea (Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A))).
The claim recites that the first similarity function comprises a negative absolute distance, which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))).
The claim recites that the second similarity function comprises a cosine similarity, which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))).
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not contain elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element, considered separately or in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.
Regarding Claim 16:
The rejection of claim 13 is incorporated, and the claim further recites the following additional elements/limitations:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites wherein the loss based on the differences between the label space similarities and feature space similarities is determined as L, wherein L comprises
L = Σi ℓ(rk(Sy[i,:]), rk(Sz[i,:]))
wherein Sy denotes the first pairwise similarity matrix, Sz denotes the second pairwise similarity matrix, [i,:] denotes an ith row of the matrices, rk denotes a ranking function, and ℓ penalizes differences between the pairwise similarity matrices, which is an abstract idea (Mathematical Formulas or Equations (see MPEP 2106.04(a)(2)(I)(B))).
The claim recites that ℓ determines mean squared error between rk(Sy[i,:]) and rk(Sz[i,:]), which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))).
The claim recites wherein training the artificial neural network comprises determining

[equation image: media_image4.png]

which is an abstract idea (Mathematical Formulas or Equations (see MPEP 2106.04(a)(2)(I)(B))).
The claim recites wherein

[equation image: media_image5.png]

wherein λ denotes interpolation strength and a denotes [symbol image: media_image6.png] or [symbol image: media_image6.png], which is an abstract idea (Mathematical Formulas or Equations (see MPEP 2106.04(a)(2)(I)(B))).
Subject Matter Eligibility Analysis Step 2A Prong 2:
The claim does not contain elements that would warrant a Step 2A Prong 2 analysis.
Subject Matter Eligibility Analysis Step 2B:
The claim does not include any additional element, considered separately or in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.
Regarding Claim 17:
The rejection of claim 13 is incorporated, and the claim is further rejected under the same rationale as set forth in the rejection of claim 9.
Regarding Claim 20:
Subject Matter Eligibility Analysis Step 2A Prong 1:
The claim recites determining label space similarities between the targets as represented in the label space which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user judging/determining similarities. See MPEP 2106.04(a)(2)(III)(C).
The claim recites determining feature space similarities between the inputs as represented in the feature space which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user judging/determining similarities. See MPEP 2106.04(a)(2)(III)(C).
The claim recites determining a loss based on differences between the label space similarities and feature space similarities that correspond to each other, wherein the loss is based on differences between rankings of the label space similarities and rankings of the feature space similarities which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user determining how poor a prediction is compared to the actual result and determining how different the prediction is compared to the result. See MPEP 2106.04(a)(2)(III)(C).
Subject Matter Eligibility Analysis Step 2A Prong 2:
obtaining the regression dataset from the database, wherein the pairs respectively comprise inputs and corresponding targets (recites insignificant extra-solution activity of data gathering (see MPEP 2106.05(g)))
wherein the inputs are represented in a feature space and the targets are represented in a label space of continuous values (merely specifies a particular technological environment in which the abstract idea is to take place, i.e., a field of use (see MPEP 2106.05(h)))
training an artificial neural network based on the loss (recites insignificant extra-solution and well-understood, routine, and conventional activity of training based on a loss (see MPEP 2106.05(g)))
a processor (merely recites a generic computer on which to perform the abstract idea, e.g., "apply it on a computer" (see MPEP 2106.05(f)))
a database storing a regression dataset comprising multiple pairs that is communicatively coupled to the processor (merely recites a generic computer on which to perform the abstract idea, e.g., "apply it on a computer" (see MPEP 2106.05(f)))
a memory that is communicatively coupled to the processor (merely recites a generic computer on which to perform the abstract idea, e.g., "apply it on a computer" (see MPEP 2106.05(f)))
Subject Matter Eligibility Analysis Step 2B:
Additional element (a), obtaining a network input, is well-understood, routine, and conventional activity of “transmitting or receiving data over a network” (see MPEP 2106.05(d)(II)(i), using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362).
Additional element (b) does not integrate the abstract idea into a practical application, nor does the additional limitation provide significantly more than the abstract idea, because the limitation amounts to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP § 2106.05(f).
Additional element (c) recites the well-understood and conventional practice of training a neural network based on a loss, as quoted from Minimizing the Maximal Loss: How and Why (Abstract, “A commonly used learning rule is to approximately minimize the average loss over the training set. Other learning algorithms, such as AdaBoost and hard-SVM, aim at minimizing the maximal loss over the training set. The average loss is more popular, particularly in deep learning, due to three main reasons. First, it can be conveniently minimized using online algorithms”).
Additional elements (d), (e), and (f) do not integrate the abstract idea into a practical application, nor do the additional limitations provide significantly more than the abstract idea, because the limitations amount to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP § 2106.05(f).
The additional elements (a), (b), (c), (d), (e), and (f) in the claim, when considered separately and in combination, do not amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-9, 12-17, and 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Bai et al. (“SimGNN: A Neural Network Approach to Fast Graph Similarity Computation”, hereinafter “Bai”).
Regarding Claim 1:
Bai discloses obtaining a regression dataset comprising multiple pairs (Page 385, Col. 1, Paragraph 1, “where each training data point is a pair of graphs together with their true similarity score”), wherein the pairs respectively comprise inputs and corresponding targets, and wherein the inputs are represented in a feature space and the targets are represented in a label space (Page 385, Col. 1, Paragraph 1, “More specifically, we design a neural network-based function that maps a pair of graphs into a similarity score. At the training stage, the parameters involved in this function will be learned by minimizing the difference between the predicted similarity scores and the ground truth…” where the pairs of graphs used from the dataset to create the predicted similarity scores are considered inputs in a feature space, and the ground truths are considered corresponding targets represented in a label space, as they are the targets of the predicted similarity, with the ground truths being the actual labels that define a label space for a machine learning task), of continuous values (Page 389, Col. 1, Paragraph 1, “We then adopt the exponential function λ(x) = e−x to transform the normalized GED into a similarity score in the range of (0, 1]” where each training example being a pair of graphs with a continuous similarity score (0 to, and including, 1) is considered continuous values in a regression dataset).
Bai discloses determining label space similarities between the targets as represented in the label space (Page 388, Col. 2, Paragraph 9,“To transform ground-truth GEDs into ground-truth similarity scores to train our model, we first normalize the GEDs according to nGED(G1,G2)
= GED(G1, G2) / ((|G1| + |G2|) / 2)
where |Gi| denotes the number of nodes of Gi. We then adopt the exponential function λ(x) = e−x to transform the normalized GED into a similarity score in the range of (0, 1]” where the ground truths are considered labels, as ground truths are the labeled correct output for feature inputs, and the ground truths being converted to a score based on similarities is considered determining label space similarities between targets as represented in the label space), and determining feature space similarities between the inputs as represented in the feature space (Bai, Page 386, Col. 1, Paragraph 6, “Now we introduce our proposed approach SimGNN in detail, which is an end-to-end neural network based approach that attempts to learn a function to map a pair of graphs into a similarity score” where the pairs of graphs input into a model correspond to feature space representations, as inputs into such models are considered features, and mapping a pair of input graphs to a similarity score is considered determining feature space similarities between the inputs as represented in the feature space).
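Bai's quoted normalization and exponential transform can be sketched as follows; this is a direct transcription of the quoted formulas, with the hypothetical convention that graph sizes are passed in as node counts:

```python
import math

def nged(ged, n1, n2):
    # nGED(G1, G2) = GED(G1, G2) / ((|G1| + |G2|) / 2)
    return ged / ((n1 + n2) / 2)

def ged_similarity(ged, n1, n2):
    # lambda(x) = e^{-x} maps the normalized GED into (0, 1].
    return math.exp(-nged(ged, n1, n2))
```

A GED of 0 (identical graphs) maps to a similarity of exactly 1, and larger edit distances decay toward, but never reach, 0.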
Bai discloses determining a loss based on differences between the label space similarities and feature space similarities that correspond to each other, wherein the loss is based on differences between rankings of the label space similarities and rankings of the feature space similarities(Bai, Page 387, Col. 2, Paragraph 1, “In the end, one score, šij ∈ R, is predicted, and it is compared against the ground-truth similarity score using the following mean squared error loss function:
[Equation 4: L = (1/|D|) Σ_(i,j)∈D (šij − s(Gi, Gj))²]
where D is the set of training graph pairs, and s(Gi, Gj) is the ground-truth similarity between Gi and Gj”, where šij is the feature similarity score for the graph pairs, as it represents the feature space similarity score between two graphs, which are considered features input into the model, and s(Gi, Gj) is a label space similarity score, as it represents the ground-truth similarity score, where the ground truths are considered labels because ground truths are the labeled correct output for feature inputs, and the model is trained with the MSE loss function on the differences between šij and the ground-truth similarity s(Gi, Gj))
Bai discloses training an artificial neural network based on the loss (Bai, Page 387, Col. 2, Paragraph 1, “In the end, one score, šij ∈ R, is predicted, and it is compared against the ground-truth similarity score using the following mean squared error loss function”, where using the mean squared error loss function to train the model is considered training a neural network based on the loss)
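For illustration only (not part of the record; `mse_loss` is a hypothetical helper name), the cited mean squared error comparison between predicted and ground-truth similarity scores can be sketched as:

```python
import numpy as np

def mse_loss(predicted: np.ndarray, ground_truth: np.ndarray) -> float:
    """Mean squared error between predicted scores (šij) and ground-truth
    similarity scores s(Gi, Gj), averaged over the training pairs D."""
    return float(np.mean((predicted - ground_truth) ** 2))

pred = np.array([0.9, 0.4, 0.7])   # hypothetical predicted similarity scores
truth = np.array([1.0, 0.5, 0.6])  # hypothetical ground-truth scores
print(mse_loss(pred, truth))       # ≈ 0.01
```

Minimizing this quantity by gradient descent over the network parameters corresponds to the training described in the reference.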
Regarding Claim 2:
The rejection of claim 1 is incorporated, and the claim further recites the following additional elements/limitations:
Bai discloses wherein the label space similarities are represented as a first pairwise similarity matrix obtained by applying a first similarity function in the label space across the targets (Page 388, Col. 2, Paragraph 9, “To transform ground-truth GEDs into ground-truth similarity scores to train our model, we first normalize the GEDs according to nGED(G1, G2) = GED(G1, G2) / ((|G1| + |G2|)/2), where |Gi| denotes the number of nodes of Gi. We then adopt the exponential function λ(x) = e−x to transform the normalized GED into a similarity score in the range of (0, 1]”, where the transformation of the normalized GED into a similarity score,
Sij = e−x = e−nGED(Gi, Gj) = e−GED(Gi, Gj)/((|Gi| + |Gj|)/2) ∈ (0, 1],
where Sij is the similarity score for the i-th and j-th graphs in the dataset, and the formula is considered a first similarity function (See Also: Page 387, Col. 2, Paragraph 1, “In the end, one score, šij ∈ R, is predicted, and it is compared against the ground-truth similarity score using the following mean squared error loss function: [Equation 4] where D is the set of training graph pairs, and s(Gi, Gj) is the ground-truth similarity between Gi and Gj”, where computing the ground-truth similarity for every pair in the set of graphs {G1…GN} corresponds to a similarity matrix in which i and j correspond to the i-th and j-th elements in the dataset, respectively)
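For illustration only (not part of the record; `label_similarity_matrix` is a hypothetical helper name), applying the cited exponential GED similarity across every pair of targets yields the pairwise matrix described:

```python
import math

def label_similarity_matrix(geds, sizes):
    """Apply Sij = e^{-GED(Gi,Gj)/((|Gi|+|Gj|)/2)} across all graph pairs,
    producing an N x N pairwise label-space similarity matrix."""
    n = len(sizes)
    S = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            S[i][j] = math.exp(-geds[i][j] / ((sizes[i] + sizes[j]) / 2))
    return S

geds = [[0, 2], [2, 0]]   # hypothetical ground-truth GEDs between G1 and G2
sizes = [4, 4]            # node counts |G1|, |G2|
S = label_similarity_matrix(geds, sizes)
# Diagonal entries (GED = 0) are 1.0; off-diagonal entries are e^{-0.5} ≈ 0.6065
```

The matrix is symmetric with a unit diagonal, consistent with a similarity function applied across the targets.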
Bai discloses and wherein the feature space similarities are represented as a second pairwise similarity matrix obtained by applying a second similarity function in the feature space across the inputs (Bai, Page 387, Col. 2, Paragraph 3, “As illustrated in the bottom data flow of Fig. 3, if Gi has Ni nodes and Gj has Nj nodes, there would be NiNj pairwise interaction scores, obtained by S = σ(UiUjT)”, where S ∈ (0, 1)^(Ni×Nj) holds the predicted similarity for every node in Gi versus every node in Gj (See Also: Bai, Page 388, Col. 2, Paragraph 6, “This can potentially be reduced by node sampling to construct the similarity matrix S”))
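For illustration only (not part of the record; `node_interaction_matrix` is a hypothetical helper name and the embeddings are randomly generated), the cited computation S = σ(UiUjᵀ) over node embeddings can be sketched as:

```python
import numpy as np

def node_interaction_matrix(Ui: np.ndarray, Uj: np.ndarray) -> np.ndarray:
    """Compute S = sigmoid(Ui @ Uj.T): an Ni x Nj matrix of pairwise
    node-to-node interaction scores between graphs Gi and Gj."""
    logits = Ui @ Uj.T
    return 1.0 / (1.0 + np.exp(-logits))

rng = np.random.default_rng(0)
Ui = rng.normal(size=(3, 8))   # 3 node embeddings of dimension 8 for Gi
Uj = rng.normal(size=(5, 8))   # 5 node embeddings of dimension 8 for Gj
S = node_interaction_matrix(Ui, Uj)
print(S.shape)  # (3, 5): one interaction score per node pair
```

Each entry lies strictly in (0, 1) because the sigmoid squashes the raw inner products, matching the range the reference attributes to S.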
Regarding Claim 3:
The rejection of claim 2 is incorporated, and the claim further recites the following additional elements/limitations:
Bai discloses wherein the first and second similarity functions differ (Bai, Page 387, Col. 2, Paragraph 3, “As illustrated in the bottom data flow of Fig. 3, if Gi has Ni nodes and Gj has Nj nodes, there would be NiNj pairwise interaction scores, obtained by S = σ(UiUjT)”, where S ∈ (0, 1)^(Ni×Nj) holds the predicted similarity for every node in Gi versus every node in Gj, and the function S = σ(UiUjT) is considered different from
Sij = e−GED(Gi, Gj)