DETAILED ACTION
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This action is responsive to the Application filed on 04/12/2023. Claims 1-20 are pending in the case. Claims 1, 8, and 15 are independent claims.
Claim Rejections - 35 U.S.C. § 101
35 U.S.C. § 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1: Claims 1-7 are directed to the statutory category of a machine. Claims 8-14 are directed to the statutory category of a process. Claims 15-20 are directed to the statutory category of an article of manufacture.
With respect to claim 1:
2A Prong 1: This claim is directed to a judicial exception.
clustering the destination node embeddings generated by the GNN resulting in first groups of destination node embeddings (mental process and/or mathematical concept);
removing, from the destination node embeddings, embeddings from a noise group of the first groups resulting in signal destination node embeddings (mental process);
clustering the signal destination node embeddings resulting in second groups of destination node embeddings (mental process and/or mathematical concept); and
identifying a pattern in the destination node embeddings and source node embeddings based on the second groups of destination node embeddings, the source node embeddings, and the destination node embeddings (mental process).
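By way of illustration only, and not as part of the claim analysis, the sequence of operations recited above can be sketched in a few lines of code, which underscores that, but for the generic computer components, the operations could be performed mentally or with pencil and paper. The gap-based clustering rule, the singleton-group "noise" criterion, and all values below are hypothetical assumptions, not limitations drawn from the claims or the cited references:

```python
# Illustrative sketch of the recited operations on toy one-dimensional
# "embeddings". Clustering rule, noise criterion, and values are assumed.

def cluster(points, gap=1.0):
    """Group sorted points into runs separated by more than `gap`."""
    pts = sorted(points)
    groups, current = [], [pts[0]]
    for p in pts[1:]:
        if p - current[-1] > gap:
            groups.append(current)
            current = [p]
        else:
            current.append(p)
    groups.append(current)
    return groups

destination = [0.1, 0.2, 0.15, 5.0, 5.1, 5.05, 2.0, 3.5, 8.0]
source = [0.12, 5.02]

# (1) clustering the destination node embeddings -> first groups
first_groups = cluster(destination)

# (2) removing embeddings in a "noise" group (here: any singleton group)
signal = [p for g in first_groups if len(g) >= 2 for p in g]

# (3) clustering the signal destination node embeddings -> second groups
second_groups = cluster(signal)

# (4) identifying a pattern: assign each source embedding to the
#     second group whose mean it is closest to
means = [sum(g) / len(g) for g in second_groups]
pattern = [min(range(len(means)), key=lambda i: abs(s - means[i]))
           for s in source]
```

Each recited step reduces to sorting, grouping, and comparing numbers, i.e., to observations and evaluations of the kind performable with pen and paper.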
2A Prong 2: This judicial exception is not integrated into a practical application.
Additional elements:
A device comprising: processing circuitry; and a memory including instructions that, when executed by the processing circuitry, cause the processing circuitry to perform operations comprising (merely reciting the words "apply it" (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea, as discussed in MPEP § 2106.05(f)); and
receiving, from a graph neural network (GNN), source node embeddings and destination node embeddings (adding insignificant extra-solution activity to the judicial exception, as discussed in MPEP § 2106.05(g)).
2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
Additional elements:
A device comprising: processing circuitry; and a memory including instructions that, when executed by the processing circuitry, cause the processing circuitry to perform operations comprising (merely reciting the words "apply it" (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea, as discussed in MPEP § 2106.05(f)); and
receiving, from a graph neural network (GNN), source node embeddings and destination node embeddings (MPEP § 2106.05(d) indicates that merely "storing and retrieving information in memory" and/or "receiving or transmitting data over a network" are well-understood, routine, conventional functions when they are claimed in a merely generic manner, as they are in the present claim. Therefore, a conclusion that the claimed step is well-understood, routine, conventional activity is supported under Berkheimer).
With respect to claim 2:
2A Prong 1: This claim is directed to a judicial exception.
reducing dimensionality of the destination node embeddings before clustering the destination node embeddings (mental process and/or mathematical concept).
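By way of illustration only, "reducing dimensionality of the … node embeddings before clustering" can be performed with a generic projection such as principal component analysis. The following sketch reduces toy two-dimensional embeddings to one dimension via power iteration; the technique and all values are assumptions for illustration, not drawn from the claims or references:

```python
def pca_project_1d(vectors, iterations=50):
    """Project mean-centered vectors onto their leading principal axis."""
    n, d = len(vectors), len(vectors[0])
    means = [sum(v[j] for v in vectors) / n for j in range(d)]
    centered = [[v[j] - means[j] for j in range(d)] for v in vectors]
    w = [1.0] * d  # initial direction for power iteration
    for _ in range(iterations):
        # w <- X^T (X w), then normalize, converging to the top eigenvector
        scores = [sum(r[j] * w[j] for j in range(d)) for r in centered]
        w = [sum(scores[i] * centered[i][j] for i in range(n)) for j in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        w = [x / norm for x in w]
    return [sum(r[j] * w[j] for j in range(d)) for r in centered]

embeddings = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0), (10.0, 10.0)]
reduced = pca_project_1d(embeddings)  # one coordinate per embedding
```

The projection is a sequence of arithmetic operations (means, dot products, normalization), i.e., mathematical concepts.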
2A Prong 2: This judicial exception is not integrated into a practical application.
2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
With respect to claim 3:
2A Prong 1: This claim is directed to a judicial exception.
removing embeddings from the noise group of the first groups includes identifying a group of the first groups with an average deviation that satisfies a specified criterion (mental process and/or mathematical concept).
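For illustration, identifying a noise group "with an average deviation that satisfies a specified criterion" can be as simple as the following sketch, in which the threshold value is a hypothetical assumption standing in for the recited criterion:

```python
def average_deviation(group):
    """Mean absolute deviation of a group of one-dimensional embeddings."""
    mean = sum(group) / len(group)
    return sum(abs(x - mean) for x in group) / len(group)

first_groups = [[0.10, 0.12, 0.11], [4.9, 5.0, 5.1], [0.0, 3.0, 9.0]]
threshold = 1.0  # hypothetical "specified criterion"

# The widely spread group is flagged as noise; the tight groups are signal.
noise_groups = [g for g in first_groups if average_deviation(g) > threshold]
signal_groups = [g for g in first_groups if average_deviation(g) <= threshold]
```

Computing an average deviation and comparing it to a criterion involves only observation and simple arithmetic.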
2A Prong 2: This judicial exception is not integrated into a practical application.
2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
With respect to claim 4:
2A Prong 1: This claim is directed to a judicial exception.
concatenating respective source node embeddings and respective destination node embeddings resulting in concatenated embeddings and wherein identifying the pattern includes using the concatenated embeddings (mental process).
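As an illustration, concatenating respective source node embeddings and destination node embeddings is an elementary list operation; the values below are hypothetical:

```python
source_embeddings = [[0.1, 0.2], [0.3, 0.4]]
destination_embeddings = [[0.9, 0.8], [0.7, 0.6]]

# Concatenate each source embedding with its respective destination embedding.
concatenated = [s + d for s, d in zip(source_embeddings, destination_embeddings)]
```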
2A Prong 2: This judicial exception is not integrated into a practical application.
2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
With respect to claim 5:
2A Prong 1: This claim is directed to a judicial exception.
updating the source node embeddings and destination node embeddings based on the dynamic graph data (mental process).
2A Prong 2: This judicial exception is not integrated into a practical application.
Additional elements:
receiving dynamic graph data (adding insignificant extra-solution activity to the judicial exception, as discussed in MPEP § 2106.05(g)).
2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
Additional elements:
receiving dynamic graph data (MPEP § 2106.05(d) indicates that merely "storing and retrieving information in memory" and/or "receiving or transmitting data over a network" are well-understood, routine, conventional functions when they are claimed in a merely generic manner, as is the case in the present claim. Therefore, a conclusion that the claimed step is well-understood, routine, conventional activity is supported under Berkheimer).
With respect to claim 6:
2A Prong 1: This claim is directed to a judicial exception.
identifying the pattern includes using a trained decoder to classify based on the second groups, source node embeddings, destination node embeddings, dynamic graph data, and partition data (mental process – high-level classifier).
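For illustration, a "trained decoder" used to classify can be as simple as a linear scoring function. The weights and bias below are assumed placeholders, not trained values from the application or any cited reference:

```python
# Hypothetical linear decoder: score a feature vector (e.g., concatenated
# group, embedding, and partition data) and emit a binary class label.
weights = [0.5, -0.25, 1.0]  # illustrative "trained" weights
bias = 0.1

def decode(features):
    score = bias + sum(w * f for w, f in zip(weights, features))
    return 1 if score > 0 else 0
```

Evaluated at this high level, the classification is a weighted sum and a comparison, operations performable with pencil and paper.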
2A Prong 2: This judicial exception is not integrated into a practical application.
2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
With respect to claim 7:
2A Prong 1: This claim is directed to a judicial exception.
the partition data is user-specified and indicates a form of the pattern to be identified (mental process).
2A Prong 2: This judicial exception is not integrated into a practical application.
2B: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
The remaining claims 8-20 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to an abstract idea without significantly more, for at least the same reasons as those given above with respect to claims 1-7, with only the addition of generic computer components under Step 2A, Prong 1. Under the broadest reasonable interpretation, these limitations are process steps that cover mental processes, including an observation, evaluation, judgment, or opinion that could be performed in the human mind or with the aid of pencil and paper but for the recitation of a generic computer component. If a claim, under its broadest reasonable interpretation, covers a mental process but for the recitation of generic computer components, then it falls within the "Mental Processes" grouping of abstract ideas. A person would readily be able to perform this process either mentally or with the assistance of pen and paper. See MPEP § 2106.04(a)(2). The additional limitations merely recite the words "apply it" (or an equivalent) with the judicial exception, merely include instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea, as discussed in MPEP § 2106.05(f). These additional elements therefore do not integrate the judicial exception into a practical application under Step 2A, Prong 2. See MPEP § 2106.04(d). For the same reasons, the additional elements do not amount to significantly more than the judicial exception under Step 2B. Accordingly, the claimed invention recites an abstract idea without significantly more.
Claim Rejections - 35 U.S.C. § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. §§ 102 and 103 (or as subject to pre-AIA 35 U.S.C. §§ 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. § 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 C.F.R. § 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. § 102(b)(2)(C) for any potential 35 U.S.C. § 102(a)(2) prior art against the later invention.
Claims 1, 2, 5, 8, 9, 12, 15, 16, and 19 are rejected under 35 U.S.C. § 103 as being unpatentable over Shi et al. (U.S. Pat. App. Pub. No. 2023/0049817, hereinafter Shi) in view of Rossi et al. (U.S. Pat. App. Pub. No. 2020/0342006, hereinafter Rossi) and Al-Rfou′ et al. (U.S. Pat. No. 11,455,512, hereinafter Al-Rfou′).
As to independent claims 1, 8, and 15, Shi teaches:
A device comprising (Title and abstract):
processing circuitry (Figure 6, processor 604); and
a memory including instructions that, when executed by the processing circuitry, cause the processing circuitry to perform operations comprising (Figure 6, main memory 606):
receiving, from a graph neural network (GNN),… node embeddings and… node embeddings (Figure 5, initialize embeddings for each node of a graph neural network 520);….
Shi does not appear to expressly teach clustering the… node embeddings generated by the GNN resulting in first groups of… node embeddings; removing, from the… node embeddings, embeddings from a noise group of the first groups resulting in signal… node embeddings; clustering the signal… node embeddings resulting in second groups of… node embeddings; and identifying a pattern in the… node embeddings and… node embeddings based on the second groups of… node embeddings, the… node embeddings, and the… node embeddings.
Rossi teaches clustering the… node embeddings generated by the GNN resulting in first groups of… node embeddings (Paragraph 108, "A clustering including a subset of nodes of the heterogeneous graph is then identified based on a typed graphlet conductance score and output (block 516)."); removing, from the… node embeddings, embeddings from a noise group of the first groups resulting in signal… node embeddings (Paragraph 7, "the nodes of the cluster, or partition, can be removed from the heterogeneous graph such that the graph can be re-analyzed to identify additional clusters, based on the same or a different typed graphlet"); clustering the signal… node embeddings resulting in second groups of… node embeddings (Figure 1, clustering 130); and identifying a pattern in the… node embeddings and… node embeddings based on the second groups of… node embeddings, the… node embeddings, and the… node embeddings (Paragraph 8, "identify clusters of the entities in the network based on dependencies and connectivity patterns among the nodes in the graph that represents the network").
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the graph neural network machine-learning embedding of Shi to include the graph clustering techniques of Rossi in order to identify variations in type connectivity patterns for nodes of a graph (see Rossi at paragraph 2).
Shi does not appear to expressly teach source node embeddings and destination node embeddings.
Al-Rfou′ teaches source node embeddings and destination node embeddings (Column 5, lines 41 and 42, "the source node embedding 114 and the destination node embedding").
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the graph neural network machine-learning embedding of Shi to include the graph techniques of Al-Rfou′ in order to consume fewer computational resources (e.g., memory and computing power) while achieving improved performance (see Al-Rfou′ at column 3, lines 29, 30, and 44).
As to dependent claims 2, 9, and 16, Shi further teaches reducing dimensionality of the… node embeddings before clustering the… node embeddings (Paragraph 33, "embeddings are low-dimensional, learned continuous vector representations of discrete variables. Embeddings are useful because they can reduce the dimensionality of categorical variables and meaningfully represent categories in the transformed space").
Shi does not appear to expressly teach source node embeddings and destination node embeddings.
Al-Rfou′ teaches source node embeddings and destination node embeddings (Column 5, lines 41 and 42, "the source node embedding 114 and the destination node embedding").
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the graph neural network machine-learning embedding of Shi to include the graph techniques of Al-Rfou′ in order to consume fewer computational resources (e.g., memory and computing power) while achieving improved performance (see Al-Rfou′ at column 3, lines 29, 30, and 44).
As to dependent claims 5, 12, and 19, Shi further teaches updating the… node embeddings and… node embeddings based on the dynamic graph data (Paragraph 68, "updated one or more times at each layer in the GNN").
Shi does not appear to expressly teach receiving dynamic graph data.
Rossi teaches receiving dynamic graph data (Paragraph 5, "the heterogeneous graph can be representative of a complex or dynamic network").
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the graph neural network machine-learning embedding of Shi to include the graph clustering techniques of Rossi in order to identify variations in type connectivity patterns for nodes of a graph (see Rossi at paragraph 2).
Shi does not appear to expressly teach source node embeddings and destination node embeddings.
Al-Rfou′ teaches source node embeddings and destination node embeddings (Column 5, lines 41 and 42, "the source node embedding 114 and the destination node embedding").
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the graph neural network machine-learning embedding of Shi to include the graph techniques of Al-Rfou′ in order to consume fewer computational resources (e.g., memory and computing power) while achieving improved performance (see Al-Rfou′ at column 3, lines 29, 30, and 44).
Subject Matter Allowable Over the Prior Art
Claims 3, 4, 6, 7, 10, 11, 13, 14, 17, 18, and 20 are allowable over the prior art. A search of the prior art did not uncover any references that, alone or in reasonable combination, would render those claims obvious.
Conclusion
It is noted that any citation to specific pages, columns, lines, or figures in the prior art references and any interpretation of the references should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. In re Heck, 699 F.2d 1331, 1332-33, 216 U.S.P.Q. 1038, 1039 (Fed. Cir. 1983) (quoting In re Lemelson, 397 F.2d 1006, 1009, 158 U.S.P.Q. 275, 277 (C.C.P.A. 1968)).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Casey R. Garner whose telephone number is 571-272-2467. The examiner can normally be reached Monday to Friday, 8am to 5pm, Eastern Time.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alexey Shmatov, can be reached at 571-270-3428. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from Patent Center and the Private Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from Patent Center or Private PAIR. Status information for unpublished applications is available through Patent Center and Private PAIR to authorized users only. Should you have questions about access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) Form at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form.
/Casey R. Garner/Primary Examiner, Art Unit 2123