Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement(s) submitted on 09/19/2022, 01/07/2023, 08/21/2023, and 09/19/2025 is/are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement(s) is/are being considered by the examiner.
Status of Claims
The present application is being examined based on the claims filed on 11/13/2025.
Claims 1, 3-8, 10-15, 17-20 are rejected. Claims 1, 3-8, 10-15, 17-20 are pending.
Prior Art References
Short Name
Reference
Cozzo
Cozzo, E., De Arruda, G.F., Rodrigues, F.A. and Moreno, Y., 2018. Multiplex networks: basic formalism and structural properties (Vol. 2, pp. 483-503). Berlin: Springer.
Wu
Wu, Z., Pan, S., Chen, F., Long, G., Zhang, C. and Yu, P.S., 2020. A comprehensive survey on graph neural networks. IEEE transactions on neural networks and learning systems, 32(1), pp.4-24.
Corso
Corso, G., Cavalleri, L., Beaini, D., Liò, P. and Veličković, P., 2020. Principal neighbourhood aggregation for graph nets. Advances in neural information processing systems, 33, pp.13260-13271.
Cozzo2
Cozzo, E., 2016. Multiplex Networks: Structure and Dynamics.
Response to Arguments – 35 U.S.C. 103
Applicant remarks:
“However, Cozzo nowhere describes a Graph Neural Network (GNN) and any message passing or alternating message passing performed by a GNN layer, where for each sub-unit of two parallel sub-units of a unit of the GNN layer, a respective supra-walk matrix dictates that information from the message passing walks is exchanged first within a planar connection followed by across the planar connection or first across the planar connection followed by within the planar connection.”
Examiner response:
Applicant’s arguments have been considered and they are persuasive. Cozzo does teach “message passing walks” and the “supra-walk” matrix (Cozzo 14, “Here, supra-walk is defined as a walk on a multiplex network in which, either before or after each intra-layer step, a walk can either continue on the same layer or change to an adjacent layer.”). Cozzo does not as clearly teach “exchange[d] first within a planar connection followed by across the planar connection or first across the planar connection followed by within the planar connection,” as added by the claim amendment. Cozzo2, however, does teach this limitation, as indicated by the mapping for “for each sub-unit of the two parallel sub-units, a respective supra-walk matrix, of a plurality of supra-walk matrices, dictates that information from the message passing walks is exchanged first within a planar connection followed by across the planar connection or first across the planar connection followed by within the planar connection” in Figure 1.1. Refer to the updated claim mapping of claim 1.
Applicant remarks:
“However, Corso nowhere describes a multiplexed graph comprising a plurality of planes or supra-walk matrices that define how information is exchanged across planes. Therefore, Corso does not teach or suggest that the aggregation of the sub-units is solved by an aggregation function that combines representations of the plurality of supra-walk matrices, where a respective supra-walk matrix of the plurality of supra-walk matrices dictates that information from the message passing walks is exchanged first within a planar connection followed by across the planar connection or first across the planar connection followed by within the planar connection.
Thus, the higher degree graphs of Corso are not equivalent to the multiplexed graphs comprising within and across planar connections.”
Examiner response:
Applicant’s arguments have been considered but they are not persuasive. While Corso does not directly describe a multiplexed graph, it is the combination of Cozzo and Corso that teaches the original claim language, and Cozzo2 that teaches the amended portion. Refer to the updated claim mapping of claim 1.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3-8, 10-15, 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Cozzo in view of Wu, in further view of Corso, and in further view of Cozzo2.
In reference to claim 1.
Cozzo teaches:
— “1. A computer-implemented method to solve a machine learning task, the computer-implemented method comprising: receiving a set of data comprising a set of nodes, a set of edges, and a set of relation types; transforming a set of samples from the set of data into a multiplexed graph, wherein the transforming includes creating a plurality of planes, each plane, of the plurality of planes, comprises the set of nodes and the set of edges, and the set of edges of a respective plane of the plurality of planes is associated with a corresponding relation type from the set of relation types (Cozzo 8-9, “We are now in the position to say that a multiplex network is represented by the quadruple M = (V, L, P, M): - the node set V represents the components of the system, - the layer set L represents different types of relations or interactions in the system, - the participation graph GP encodes the information about what node takes part in a particular type of relation and defines the representative of each component in each type of relation, i.e., the node-layer pair, - the layer-graphs M represent the networks of interactions of a particular type between the components, i.e., the networks of representatives of the components of the system.”; Cozzo Fig. 2.1);”
[media_image1.png (greyscale): reproduction of Cozzo Fig. 2.1]
— “alternating message passing walks within and across the plurality of planes of the multiplexed graph using a graph neural network (GNN) layer of a GNN (Cozzo 14, “Here, supra-walk is defined as a walk on a multiplex network in which, either before or after each intra-layer step, a walk can either continue on the same layer or change to an adjacent layer.”)”
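For illustration only, the two supra-walk orderings discussed above can be sketched with a small hypothetical two-layer multiplex. The layer graphs, the coupling matrix, and the (I + C) construction below are illustrative assumptions, not the references’ exact formalism:

```python
import numpy as np

# Hypothetical two-layer multiplex over the same three nodes.
A1 = np.array([[0, 1, 0],
               [1, 0, 1],
               [0, 1, 0]])          # intra-layer adjacency, layer 1
A2 = np.array([[0, 0, 1],
               [0, 0, 1],
               [1, 1, 0]])          # intra-layer adjacency, layer 2
I3 = np.eye(3, dtype=int)
Z3 = np.zeros((3, 3), dtype=int)

# Supra-adjacency: intra-layer blocks on the diagonal ("within-plane" steps).
A = np.block([[A1, Z3],
              [Z3, A2]])
# Inter-layer coupling: each node is linked to its replica ("across-plane" steps).
C = np.block([[Z3, I3],
              [I3, Z3]])

# "Either before or after each intra-layer step, a walk can either continue on
# the same layer or change to an adjacent layer": (I + C) allows staying put
# on the current layer or switching planes.
stay_or_switch = np.eye(6, dtype=int) + C

# Two supra-walk matrices, one per ordering of the exchange:
within_then_across = A @ stay_or_switch   # intra-layer step, then maybe switch
across_then_within = stay_or_switch @ A   # maybe switch, then intra-layer step

# Entry (i, j) of a power of either matrix counts supra-walks of that length
# from supra-node i to supra-node j.
two_step_counts = within_then_across @ within_then_across
```

The two products differ whenever the layer graphs differ, which is what distinguishes the within-then-across ordering from the across-then-within ordering.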
Wu teaches:
— “wherein the GNN layer has a plurality of units, each unit, of the plurality of units, outputs an aggregation of two parallel sub-units (Wu Fig. 2(a), “A graph convolutional layer encapsulates each node’s hidden representation by aggregating feature information from its neighbors”)”
— “and the aggregation of the two parallel sub-units is based on an aggregation function that combines representations of the plurality of supra-walk matrices (Wu Fig. 2(a), “A graph convolutional layer encapsulates each node’s hidden representation by aggregating feature information from its neighbors”);”
— “and training a set of weights of the GNN for the machine learning task, based on a task-specific supervision and the alternating of the message passing walks (Wu 4, “Training Frameworks. Many GNNs (e.g., ConvGNNs) can be trained in a (semi-) supervised or purely unsupervised way within an end-to-end learning framework, depending on the learning tasks and label information available at hand.”, Examiner notes that such tasks are also described in Wu; Wu 4, “Node-level”, “Edge-level”, “Graph-level”).”
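For illustration only, one such unit can be sketched as two parallel message-passing sub-units, each propagating node representations with its own supra-walk matrix, whose outputs are combined by an aggregation function. All matrices, dimensions, and the element-wise mean aggregation below are hypothetical assumptions, not the claimed implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 6, 4                      # supra-nodes (node-layer pairs), feature dim
H = rng.standard_normal((n, d))  # input node representations

M_wa = rng.integers(0, 2, (n, n))   # supra-walk matrix: within-then-across
M_aw = rng.integers(0, 2, (n, n))   # supra-walk matrix: across-then-within
W1 = rng.standard_normal((d, d))    # trainable weights, sub-unit 1
W2 = rng.standard_normal((d, d))    # trainable weights, sub-unit 2

def sub_unit(M, H, W):
    """One message-passing sub-unit: propagate along M, then transform."""
    return np.tanh(M @ H @ W)

# The unit outputs an aggregation of the two parallel sub-units; here the
# aggregation function is an element-wise mean of the two representations.
H_out = 0.5 * (sub_unit(M_wa, H, W1) + sub_unit(M_aw, H, W2))
```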
Motivation to combine Cozzo and Wu.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present application to combine Cozzo and Wu. Cozzo discloses a formal reference of multiplex network graphs. Wu discloses a survey of graph neural networks. One would be motivated to combine these references because it would be obvious to one of ordinary skill in the art to apply the algorithms of Wu to the graph networks elaborated on in Cozzo. Further, MPEP 2143 sets forth the Supreme Court rationales for obviousness including: (C) Use of known technique to improve similar devices (methods, or products) in the same way.
Corso teaches:
— “each sub-unit of the two parallel sub-units comprises a typed GNN layer that allows different permutations of connectivity patterns between intra-planar nodes of the [multiplexed] graph and inter-planar nodes of the [multiplexed] graph (Corso 4, “We combine the aggregators and scalers presented in previous sections obtaining the Principal Neighbourhood Aggregation (PNA). This is a general and flexible architecture, which in our tests we used with four neighbour-aggregations with three degree-scalers each, as summarized in Equation 7.”);”
Motivation to combine Cozzo, Wu, and Corso.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present application to combine Cozzo, Wu, and Corso. Cozzo and Wu together disclose the application of graph neural networks to multiplex graphs. Corso discloses principal neighborhood aggregation for graph networks. One would be motivated to combine these references because it would be obvious to one of ordinary skill in the art to apply the aggregation methods of Corso to the methodology taught in Cozzo and Wu. Further, MPEP 2143 sets forth the Supreme Court rationales for obviousness including: (B) Simple substitution of one known element for another to obtain predictable results.
Cozzo2 teaches:
— “for each sub-unit of the two parallel sub-units, a respective supra-walk matrix, of a plurality of supra-walk matrices, dictates that information from the message passing walks is exchanged first within a planar connection followed by across the planar connection or first across the planar connection followed by within the planar connection (Cozzo2 Figure 1.1. A-bar is the supra-walk matrix and the grey sub-matrices are the inter-plane walks and the white sub-matrices are the intra-plane walks.),”
[media_image2.png (greyscale): reproduction of Cozzo2 Figure 1.1]
Motivation to combine Cozzo, Wu, Corso, and Cozzo2.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present application to combine Cozzo, Wu, Corso, and Cozzo2. Cozzo, Wu, and Corso together disclose the application of graph neural networks to multiplex graphs. Cozzo2 adds detail regarding supra-walk matrices. One would be motivated to combine these references because it would be obvious to one of ordinary skill in the art to apply the added detail regarding supra-walk matrices in Cozzo2 to the less detailed discussion of Cozzo. Further, MPEP 2143 sets forth the Supreme Court rationales for obviousness including: (B) Simple substitution of one known element for another to obtain predictable results.
In reference to claim 3.
Wu teaches:
— “3. The computer-implemented method of claim 1, wherein the machine learning task is a prediction of at least one of a graph-level (Wu 4, “Graph-level outputs relate to the graph classification task.”), an edge-level (Wu 4, “Edge-level outputs relate to the edge classification and link prediction tasks.”), or a node-level label of the set of provided samples (Wu 4, “Node-level outputs relate to node regression and node classification tasks.”).”
In reference to claim 4.
Corso teaches:
— “4. The computer-implemented method of claim 1, wherein the aggregation of the two parallel sub-units is solved by a concatenation (Corso pg. 5 and equations 7 and 8, “U reduces the size of the concatenated message”).”
[media_image3.png (greyscale): reproduction of Corso equation 7]
[media_image4.png (greyscale): reproduction of Corso equation 8]
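For illustration only, aggregation by concatenation followed by a size-reducing linear map, the role the quotation above attributes to U, can be sketched as follows. All shapes and matrices are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

n, d = 6, 4
h_a = rng.standard_normal((n, d))        # output of sub-unit A
h_b = rng.standard_normal((n, d))        # output of sub-unit B

concat = np.concatenate([h_a, h_b], axis=1)   # (n, 2d) concatenated message
U = rng.standard_normal((2 * d, d))           # linear map reducing 2d back to d
h_out = concat @ U                            # (n, d) aggregated output
```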
In reference to claim 5.
Wu teaches:
— “5. The computer-implemented method of claim 1, wherein the aggregation of the two parallel sub-units is solved by at least one of a minimum, a maximum, or an average (Wu 9, “The aggregation function should be invariant to the permutations of node orderings such as a mean, sum or max function.”, Examiner notes that, as described by Wu, one of ordinary skill in the art would interpret the min function to also be a possible aggregation function since it too is permutation invariant.).”
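For illustration only, the permutation invariance of these aggregation functions, including the min function noted by the examiner, can be checked directly. The message values below are hypothetical:

```python
import numpy as np

# Three hypothetical neighbour messages, each with two features.
msgs = np.array([[0.2, 1.0],
                 [0.9, -0.5],
                 [0.1, 0.3]])
perm = msgs[[2, 0, 1]]                    # same neighbours, different order

# Mean, sum, max, and min all yield the same result under reordering,
# i.e., they are invariant to the permutation of node orderings.
for agg in (np.mean, np.sum, np.max, np.min):
    assert np.allclose(agg(msgs, axis=0), agg(perm, axis=0))
```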
In reference to claim 6.
Wu and Corso teach:
— “6. The computer-implemented method of claim 1, wherein the GNN is one of a graph isomorphism network (GIN) (Wu 9, “However, Graph Isomorphism Network (GIN) finds that previous MPNN-based methods are incapable of distinguishing different graph structures based on the graph embedding they produced. To amend this drawback, GIN adjusts the weight of the central node by a learnable parameter ε(k).”), a graph convolutional network (GCN) (Wu Fig. 2(a)), or a partial neighborhood aggregation network (PNA) (Corso Abstract, “We extend this theoretical framework to include continuous features—which occur regularly in real-world input domains and within the hidden layers of GNNs—and we demonstrate the requirement for multiple aggregation functions in this context. Accordingly, we propose Principal Neighbourhood Aggregation (PNA), a novel architecture combining multiple aggregators with degree-scalers (which generalize the sum aggregator).”).”
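For illustration only, the GIN-style update quoted above, in which a learnable parameter reweights the central node before a multilayer perceptron is applied, can be sketched as follows. The toy graph, the single linear layer standing in for the MLP, and the ε value are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]])               # hypothetical 3-node graph
H = rng.standard_normal((3, 4))         # node features
W = rng.standard_normal((4, 4))         # "MLP" weights (one linear layer here)
eps = 0.1                               # GIN's learnable weight on the centre node

# h_v' = MLP((1 + eps) * h_v + sum of neighbour features), with ReLU as the
# nonlinearity of the single-layer stand-in MLP.
H_new = np.maximum(0.0, ((1 + eps) * H + A @ H) @ W)
```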
In reference to claim 7.
Corso teaches:
— “7. The computer-implemented method of claim 1, wherein the plurality of units are arranged serially in cascade (Corso Figure 2).”
[media_image5.png (greyscale): reproduction of Corso Figure 2]
In reference to claims 8, 10-15, and 17-20.
Claims 8, 10-15, 17-20 are substantially similar to claims 1, 3-7 and thus are rejected using the same art.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CODY RYAN GILLESPIE whose telephone number is (571)272-1331. The examiner can normally be reached M-F, 8 AM - 5 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Viker A. Lamardo, can be reached at 571-270-5871. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CODY RYAN GILLESPIE/Examiner, Art Unit 2147
/VIKER A LAMARDO/Supervisory Patent Examiner, Art Unit 2147