Prosecution Insights
Last updated: April 19, 2026
Application No. 18/363,431

COMPUTER-READABLE RECORDING MEDIUM STORING DETERMINATION PROGRAM, DETERMINATION METHOD, AND INFORMATION PROCESSING DEVICE

Non-Final OA: §101, §102, §103, §112
Filed: Aug 01, 2023
Examiner: ILES, TYLER EDWARD
Art Unit: 2122
Tech Center: 2100 — Computer Architecture & Software
Assignee: Fujitsu Limited
OA Round: 1 (Non-Final)
Grant Probability: 67% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 3m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 67% (2 granted / 3 resolved; +11.7% vs TC avg, above average)
Interview Lift: +50.0% (resolved cases with interview)
Avg Prosecution: 3y 3m typical timeline; 21 applications currently pending
Total Applications: 24 across all art units (career history)
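The headline figures above are plain ratios over the examiner's resolved cases. As a sketch of how such metrics fall out of per-case outcome data, the three case records below are invented for illustration; only the aggregate numbers (2 granted of 3 resolved, +50% interview lift) match the report.

```python
# Hypothetical reconstruction of the dashboard's examiner metrics.
# The per-case outcomes are invented; only the aggregates (2 granted /
# 3 resolved, +50.0% interview lift) come from the report above.
resolved = [
    {"granted": True,  "interview": True},   # hypothetical case
    {"granted": True,  "interview": False},  # hypothetical case
    {"granted": False, "interview": False},  # hypothetical case
]

# Career allow rate: granted cases over all resolved cases.
allow_rate = sum(c["granted"] for c in resolved) / len(resolved)

# Interview lift: allow rate with an interview minus allow rate without.
with_iv = [c for c in resolved if c["interview"]]
without_iv = [c for c in resolved if not c["interview"]]
rate_with = sum(c["granted"] for c in with_iv) / len(with_iv)
rate_without = sum(c["granted"] for c in without_iv) / len(without_iv)
interview_lift = rate_with - rate_without

print(f"Career allow rate: {allow_rate:.0%}")
print(f"Interview lift: {interview_lift:+.1%}")
```

With these invented outcomes the ratios reproduce the dashboard's 67% and +50.0% figures; other outcome mixes could yield the same aggregates.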

Statute-Specific Performance

§101: 29.5% (-10.5% vs TC avg)
§103: 42.6% (+2.6% vs TC avg)
§102: 15.6% (-24.4% vs TC avg)
§112: 12.3% (-27.7% vs TC avg)
Tech Center averages are estimates • Based on career data from 3 resolved cases

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This action is in response to an application filed on August 1st, 2023. Claims 1-8 are pending in the current application.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.— The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1, 2, 3, 7, and 8 recite the limitation “the specified second graph data”. The “in a case” claim language makes “specified second graph data” a contingent limitation, and so the broadest reasonable interpretation may not require “a specified second graph data”. Therefore, there is insufficient antecedent basis for this limitation in the “outputting” steps of claims 1, 2, 3, 7, and 8.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-8 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding claim 1, under Step 1 of the Subject Matter Eligibility Test for Products and Processes, the claim is directed towards a manufacture, which is one of the four statutory categories.
Next, under a Step 2A Prong 1 analysis, the claim recites the following limitations, which are interpreted to be, under the broadest reasonable interpretation, abstract ideas:

- “… performing predetermined determination processing on the first graph data” (mental process)
- “in a case where all the first scores of the acquired one or more first scores are less than a threshold, referring to a storage device, to specify among one or more pieces of graph data, second graph data that is a second determination result different from the acquired first determination result” (mental process)

Therefore, we examine the claim under Step 2A Prong 2, which considers the additional elements within the claim. The claim’s additional elements are:

- acquiring a first determination result of first graph data;
- acquiring one or more first scores regarding a predetermined feature of the first graph data using a trained machine learning model;
- the one or more first scores including one or more scores representing a basis of the first determination result of the first graph data;
- the trained machine learning model being a trained model configured to output, in response to obtaining graph data, one or more scores regarding the predetermined feature of the graph data;
- the storage device being a device that stores, for each piece of graph data of the one or more pieces of graph data, a determination result obtained by performing the predetermined determination processing on the piece of graph data; and
- outputting, in association with the acquired first determination result, information regarding the predetermined feature of the specified second graph data.
The limitations “acquiring a first determination result of first graph data”, “acquiring one or more first scores regarding a predetermined feature of the first graph data”, and “outputting, in association with the acquired first determination result, information regarding the predetermined feature of the specified second graph data” are considered to be insignificant extra-solution activity. (See MPEP 2106.05(g).)

“Using a trained machine learning model” is interpreted to be mere instructions to apply a judicial exception, as it instructs to use a trained ML model as a tool to acquire a score. (See MPEP 2106.05(f).)

The limitations “the one or more first scores including one or more scores representing a basis of the first determination result of the first graph data”, “the trained machine learning model being a trained model configured to output, in response to obtaining graph data, one or more scores regarding the predetermined feature of the graph data”, and “the storage device being a device that stores, for each piece of graph data of the one or more pieces of graph data, a determination result obtained by performing the predetermined determination processing on the piece of graph data” are all considered to merely indicate the particular technological environment or field of use, and “generally link” the particular trained machine learning model, the scores, and the storage device to the abstract idea. (See MPEP 2106.05(h).)

Therefore, these additional elements do not integrate the abstract idea into a practical application, and the claim is directed to an abstract idea. Under a Step 2B analysis, the claim’s additional elements do not amount to significantly more than the judicial exception, as explained above under Step 2A Prong 2.
Additionally, “acquiring a first determination result of first graph data”, “acquiring one or more first scores regarding a predetermined feature of the first graph data”, and “outputting, in association with the acquired first determination result, information regarding the predetermined feature of the specified second graph data” are considered to be well-understood, routine, and conventional, as the limitations amount to mere data gathering and/or sending or receiving data over a network. (See MPEP 2106.05(d)(ii).) Therefore, the claim is ineligible.

Regarding claim 7, under Step 1 of the Subject Matter Eligibility Test for Products and Processes, the claim is directed towards a process, which is one of the four statutory categories.

Next, under a Step 2A Prong 1 analysis, the claim recites the following limitations, which are interpreted to be, under the broadest reasonable interpretation, abstract ideas:

- “… performing predetermined determination processing on the first graph data” (mental process)
- “in a case where all the first scores of the acquired one or more first scores are less than a threshold, referring to a storage device, to specify among one or more pieces of graph data, second graph data that is a second determination result different from the acquired first determination result” (mental process)

Therefore, we examine the claim under Step 2A Prong 2, which considers the additional elements within the claim.
The claim’s additional elements are:

- a computer;
- acquiring a first determination result of first graph data;
- acquiring one or more first scores regarding a predetermined feature of the first graph data using a trained machine learning model;
- the one or more first scores including one or more scores representing a basis of the first determination result of the first graph data;
- the trained machine learning model being a trained model configured to output, in response to obtaining graph data, one or more scores regarding the predetermined feature of the graph data;
- the storage device being a device that stores, for each piece of graph data of the one or more pieces of graph data, a determination result obtained by performing the predetermined determination processing on the piece of graph data; and
- outputting, in association with the acquired first determination result, information regarding the predetermined feature of the specified second graph data.

The limitations “acquiring a first determination result of first graph data”, “acquiring one or more first scores regarding a predetermined feature of the first graph data”, and “outputting, in association with the acquired first determination result, information regarding the predetermined feature of the specified second graph data” are considered to be insignificant extra-solution activity. (See MPEP 2106.05(g).) The computer and “using a trained machine learning model” are interpreted to be mere instructions to apply a judicial exception, as the claim instructs to use a trained ML model as a tool to acquire a score and to use a computer to perform the abstract idea.
(See MPEP 2106.05(f).) The limitations “the one or more first scores including one or more scores representing a basis of the first determination result of the first graph data”, “the trained machine learning model being a trained model configured to output, in response to obtaining graph data, one or more scores regarding the predetermined feature of the graph data”, and “the storage device being a device that stores, for each piece of graph data of the one or more pieces of graph data, a determination result obtained by performing the predetermined determination processing on the piece of graph data” are all considered to merely indicate the particular technological environment or field of use, and “generally link” the particular trained machine learning model, the scores, and the storage device to the abstract idea. (See MPEP 2106.05(h).)

Therefore, these additional elements do not integrate the abstract idea into a practical application, and the claim is directed to an abstract idea. Under a Step 2B analysis, the claim’s additional elements do not amount to significantly more than the judicial exception, as explained above under Step 2A Prong 2.

Additionally, “acquiring a first determination result of first graph data”, “acquiring one or more first scores regarding a predetermined feature of the first graph data”, and “outputting, in association with the acquired first determination result, information regarding the predetermined feature of the specified second graph data” are considered to be well-understood, routine, and conventional, as the limitations amount to mere data gathering and/or sending or receiving data over a network. (See MPEP 2106.05(d)(ii).) Therefore, the claim is ineligible.

Regarding claim 8, under Step 1 of the Subject Matter Eligibility Test for Products and Processes, the claim is directed towards a machine, which is one of the four statutory categories.
Next, under a Step 2A Prong 1 analysis, the claim recites the following limitations, which are interpreted to be, under the broadest reasonable interpretation, abstract ideas:

- “… performing predetermined determination processing on the first graph data” (mental process)
- “in a case where all the first scores of the acquired one or more first scores are less than a threshold, referring to a storage device, to specify among one or more pieces of graph data, second graph data that is a second determination result different from the acquired first determination result” (mental process)

Therefore, we examine the claim under Step 2A Prong 2, which considers the additional elements within the claim. The claim’s additional elements are:

- a hardware processor;
- acquiring a first determination result of first graph data;
- acquiring one or more first scores regarding a predetermined feature of the first graph data using a trained machine learning model;
- the one or more first scores including one or more scores representing a basis of the first determination result of the first graph data;
- the trained machine learning model being a trained model configured to output, in response to obtaining graph data, one or more scores regarding the predetermined feature of the graph data;
- the storage device being a device that stores, for each piece of graph data of the one or more pieces of graph data, a determination result obtained by performing the predetermined determination processing on the piece of graph data; and
- outputting, in association with the acquired first determination result, information regarding the predetermined feature of the specified second graph data.
The limitations “acquiring a first determination result of first graph data”, “acquiring one or more first scores regarding a predetermined feature of the first graph data”, and “outputting, in association with the acquired first determination result, information regarding the predetermined feature of the specified second graph data” are considered to be insignificant extra-solution activity. (See MPEP 2106.05(g).) The hardware processor and “using a trained machine learning model” are interpreted to be mere instructions to apply a judicial exception, as the claim instructs to use a trained ML model as a tool to acquire a score, and a processor to perform the abstract idea. (See MPEP 2106.05(f).)

The limitations “the one or more first scores including one or more scores representing a basis of the first determination result of the first graph data”, “the trained machine learning model being a trained model configured to output, in response to obtaining graph data, one or more scores regarding the predetermined feature of the graph data”, and “the storage device being a device that stores, for each piece of graph data of the one or more pieces of graph data, a determination result obtained by performing the predetermined determination processing on the piece of graph data” are all considered to merely indicate the particular technological environment or field of use, and “generally link” the particular trained machine learning model, the scores, and the storage device to the abstract idea. (See MPEP 2106.05(h).) Therefore, these additional elements do not integrate the abstract idea into a practical application, and the claim is directed to an abstract idea. Under a Step 2B analysis, the claim’s additional elements do not amount to significantly more than the judicial exception, as explained above under Step 2A Prong 2.
Additionally, “acquiring a first determination result of first graph data”, “acquiring one or more first scores regarding a predetermined feature of the first graph data”, and “outputting, in association with the acquired first determination result, information regarding the predetermined feature of the specified second graph data” are considered to be well-understood, routine, and conventional, as the limitations amount to mere data gathering and/or sending or receiving data over a network. (See MPEP 2106.05(d)(ii).) Therefore, the claim is ineligible.

Regarding claim 2, the claim recites “calculating one or more second scores regarding the predetermined feature of the specified second graph data by using the trained machine learning model, wherein, in the processing of outputting, the second determination result stored in the storage unit, the specified second graph data, and the calculated one or more second scores are output in association with the acquired first determination result.”

The step “calculating one or more second scores regarding the predetermined feature of the specified second graph data” is interpreted to be, under the broadest reasonable interpretation, a “mental process”, which is a grouping of abstract ideas. “Using the trained machine learning model” is interpreted to be mere instructions to apply a judicial exception, as it instructs to use the machine learning model to calculate the scores.
(See MPEP 2106.05(f).) Lastly, “in the processing of outputting, the second determination result stored in the storage unit, the specified second graph data, and the calculated one or more second scores are output in association with the acquired first determination result” is a limitation that is considered to merely indicate the field of use and particular technological environment, and “generally links” the second determination result, the second graph data, and the calculated one or more second scores to the output with the acquired first determination result. (See MPEP 2106.05(h).) Therefore, the claim is rejected on the same basis as claim 1.

Regarding claim 3, the claim recites “the outputting includes outputting, in association with the acquired first determination result, the second determination result stored in the storage device, the specified second graph data, the calculated one or more second scores, the first graph data, and the acquired one or more first scores.” The limitation, as drafted, merely indicates the field of use and particular technological environment, and “generally links” the second determination result, the second graph data, and the calculated one or more second scores to the output with the acquired first determination result. (See MPEP 2106.05(h).) Therefore, the claim is rejected on the same basis as claim 2.

Regarding claim 4, the claim recites “the outputting includes in a case where at least any one first score of the acquired one or more first scores is equal to or greater than the threshold, outputting, in association with the acquired first determination result, the first graph data and the acquired one or more first scores are output.” The limitation, as drafted, is interpreted to be, under the broadest reasonable interpretation, a “mental process”, which is a grouping of abstract ideas. Therefore, the claim is rejected on the same basis as claim 1.
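The “in a case where” contingency that drives both the §112 antecedent-basis issue and the mental-process characterization above can be sketched as a short program. This is a hypothetical illustration of the claimed flow only; the function names, threshold value, and sample data are stand-ins and do not come from the application.

```python
# Hypothetical sketch of the conditional flow recited in claim 1.
# All names and values are illustrative stand-ins, not from the
# application or the cited Orhan reference.

THRESHOLD = 0.5  # the claim recites only "a threshold"

def determine(graph):
    # Stand-in for the "predetermined determination processing".
    return "anomalous" if sum(graph) > 3 else "normal"

def score_features(graph):
    # Stand-in for the trained ML model's per-feature scores.
    return [x / 10 for x in graph]

def run_determination(first_graph, storage):
    first_result = determine(first_graph)       # first determination result
    first_scores = score_features(first_graph)  # basis of that result

    # Contingent limitation: the storage device is consulted only when
    # ALL first scores are below the threshold; it is searched for
    # second graph data whose stored result differs from the first.
    if all(s < THRESHOLD for s in first_scores):
        for second_graph, second_result in storage.items():
            if second_result != first_result:
                # Output information on the specified second graph data
                # in association with the first determination result.
                return first_result, second_graph
    return first_result, None

# Storage maps each piece of graph data to its stored determination result.
storage = {(1, 1, 1): "normal", (9, 9, 9): "anomalous"}
print(run_determination((1, 2, 1), storage))
print(run_determination((9, 9, 9), storage))
```

In this sketch the `else` branch returns without touching `storage`, which is the mutually exclusive alternative the examiner relies on in the §102 analysis: when any score meets the threshold, no second graph data is ever specified.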
Regarding claim 5, the claim recites “the trained machine learning model calculates, in response to obtaining the graph data, a score that corresponds to each element of an adjacency matrix with respect to the graph data, the calculated score being a score regarding a predetermined feature of the graph data and representing a basis of a determination result of the graph data by the predetermined determination processing.” The limitation, as drafted, is interpreted to be, under the broadest reasonable interpretation, a “mental process”, which is a grouping of abstract ideas. Therefore, the claim is rejected on the same basis as claim 1.

Regarding claim 6, the claim recites “calculating, for each piece of graph data of the one or more pieces of graph data, an index value that indicates magnitude of a difference between the graph data and the first graph data, wherein the specifying includes in a case where all the first scores of the acquired one or more first scores are less than the threshold, referring to the storage device, to specify, among the one or more pieces of graph data, the second graph data that is a second determination result different from the acquired first determination result and that has the calculated index value less than a reference value.” The limitation, as drafted, is interpreted to be, under the broadest reasonable interpretation, a “mental process”, which is a grouping of abstract ideas. Therefore, the claim is rejected on the same basis as claim 1.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-5, 7, and 8 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Orhan et al. (herein referred to as Orhan) (U.S. Patent Application No. US 20240049272 A1).

Regarding claim 1, Orhan teaches a non-transitory computer-readable recording medium storing a determination program for causing a computer to execute processing (“Any process described herein may be implemented as a non-transitory computer-readable medium including instructions configured, when executed, to cause one or more processors to carry out the process (e.g., to carry out the method).
” Paragraph 26) comprising:

acquiring a first determination result of first graph data by performing predetermined determination processing on the first graph data (“the processor may obtain a first graph 1111 including, or based on, a first sub-band channel information including channel information for a first sub-band and device information, and provide the first graph 1111 to an input layer of a first GNN layer 1121 … each GNN layer 1121, 1122, 1123, 1124 may be configured to obtain an attention score (at least) associated with each communication device at each layer based on input features of nodes associated with the respective communication device and input features of the sub-band node.” Paragraphs 138 and 145) (The first graph data is input to a first GNN layer, which generates a score that corresponds to a first determination result.)

acquiring one or more first scores regarding a predetermined feature of the first graph data by using a trained machine learning model, the one or more first scores including one or more scores representing a basis of the first determination result of the first graph data, the trained machine learning model being a trained model configured to output, in response to obtaining graph data, one or more scores regarding the predetermined feature of the graph data; (“Therefore, each GNN layer 1121, 1122, 1123, 1124 may be configured to determine an attention score for (at least) each MIMO layer node associated with a MIMO layer of one of the communication devices based on attention AI/ML parameters (that may include learnable parameters) of the respective GNN layer and output of the respective GNN layer. The attention score may represent the importance of the nodes associated with each MIMO layer of one of the communication devices.” Paragraph 145) (The first GNN layer determines the first scores.)
in a case where all the first scores of the acquired one or more first scores are less than a threshold, referring to a storage device, to specify among one or more pieces of graph data, second graph data that is a second determination result different from the acquired first determination result, (“for each sub-band, the processor may obtain a second graph 1112, a third graph 1113, and a fourth graph 1114 … In example 4, the subject matter of any one of examples 1 to 3 can optionally include that the processor is further configured to determine, for the communication resource, the one or more communication devices from the plurality of communication devices based on a threshold applied to the determined scores.” Paragraphs 138 and 182)

(Claim 1 recites the following contingent limitations: “referring to a storage device, to specify among one or more pieces of graph data, second graph data that is a second determination result different from the acquired first determination result”. These limitations are contingent because they recite steps that are only required to be performed if a certain condition is met (i.e., in a case where all the first scores of the acquired one or more first scores are less than a threshold). The limitation “to specify among one or more pieces of graph data, second graph data that is a second determination result different from the acquired first determination result” only needs to be performed if the first scores do not meet or exceed a threshold. The conditions are mutually exclusive; therefore, the BRI of claim 1 requires that only one limitation be taught. Either: a storage device is referred to, to specify second graph data, in a case where all the first scores do not meet or exceed a threshold; or, in a case where one or more first scores meet or exceed a threshold, a storage device is not referred to for second graph data.
The latter is taught in Orhan.)

the storage device being a device that stores, for each piece of graph data of the one or more pieces of graph data, a determination result obtained by performing the predetermined determination processing on the piece of graph data (“The memory 602 may be configured to store channel information 605 of the plurality of communication devices.” Paragraph 72) (Channel information includes scores for the communication devices, as evidenced further in Paragraph 181.)

and outputting, in association with the acquired first determination result, information regarding the predetermined feature of the specified second graph data. (“for each sub-band, the processor may obtain a second graph 1112, a third graph 1113, and a fourth graph 1114, and may provide the inputs to input layer of a second GNN 1122 layer, a third GNN 1123 layer, and a fourth GNN 1124 layer, respectively … Once the iterations of the GNN layers 1121, 1122, 1123, 1124 as provided herein are finalized, a layer (e.g. an output layer) of the GNN layers 1121, 1122, 1123, 1124 may output 1130 the attention scores for each communication device. In various examples, the output of the last attention computation may represent the importance of nodes coupling the sub-band node. Accordingly, each attention score associated with the coupling of a node to the sub-band node, about which various examples are provided herein, may represent a likelihood for the communication device to be allocated for the designated communication resource. In various examples, the output 1130 of the last attention computation may represent the importance of edges coupled to the sub-band node.
”, Paragraphs 138 and 147)

Regarding claim 7, Orhan teaches a determination method implemented by a computer, the determination method comprising:

acquiring, in a hardware processor of the computer, a first determination result of first graph data by performing predetermined determination processing on the first graph data; (“the processor may obtain a first graph 1111 including, or based on, a first sub-band channel information including channel information for a first sub-band and device information, and provide the first graph 1111 to an input layer of a first GNN layer 1121 … each GNN layer 1121, 1122, 1123, 1124 may be configured to obtain an attention score (at least) associated with each communication device at each layer based on input features of nodes associated with the respective communication device and input features of the sub-band node.” Paragraphs 138 and 145) (The first graph data is input to a first GNN layer, which generates a score that corresponds to a first determination result.)

acquiring, in the hardware processor of the computer, one or more first scores regarding a predetermined feature of the first graph data by using a trained machine learning model, the one or more first scores including one or more scores representing a basis of the first determination result of the first graph data, the trained machine learning model being a trained model configured to output, in response to obtaining graph data, one or more scores regarding the predetermined feature of the graph data (“Therefore, each GNN layer 1121, 1122, 1123, 1124 may be configured to determine an attention score for (at least) each MIMO layer node associated with a MIMO layer of one of the communication devices based on attention AI/ML parameters (that may include learnable parameters) of the respective GNN layer and output of the respective GNN layer.
The attention score may represent the importance of the nodes associated with each MIMO layer of one of the communication devices.” Paragraph 145) (The first GNN layer determines the first scores.)

in a case where all the first scores of the acquired one or more first scores are less than a threshold, referring to a storage device, to specify among one or more pieces of graph data, second graph data that is a second determination result different from the acquired first determination result, (“for each sub-band, the processor may obtain a second graph 1112, a third graph 1113, and a fourth graph 1114 … In example 4, the subject matter of any one of examples 1 to 3 can optionally include that the processor is further configured to determine, for the communication resource, the one or more communication devices from the plurality of communication devices based on a threshold applied to the determined scores.” Paragraphs 138 and 182)

(Claim 7 recites the following contingent limitations: “referring to a storage device, to specify among one or more pieces of graph data, second graph data that is a second determination result different from the acquired first determination result”. These limitations are contingent because they recite steps that are only required to be performed if a certain condition is met (i.e., in a case where all the first scores of the acquired one or more first scores are less than a threshold). The limitation “to specify among one or more pieces of graph data, second graph data that is a second determination result different from the acquired first determination result” only needs to be performed if the first scores do not meet or exceed a threshold. The conditions are mutually exclusive; therefore, the BRI of claim 7 requires that only one limitation be taught.
Either: a storage device is referred to, to specify second graph data, in a case where all the first scores do not meet or exceed a threshold; or, in a case where one or more first scores meet or exceed a threshold, a storage device is not referred to for second graph data. The latter is taught in Orhan.)

the storage device being a device that stores, for each piece of graph data of the one or more pieces of graph data, a determination result obtained by performing the predetermined determination processing on the piece of graph data (“The memory 602 may be configured to store channel information 605 of the plurality of communication devices.” Paragraph 72) (Channel information includes scores for the communication devices, as evidenced further in Paragraph 181.)

and outputting, in association with the acquired first determination result, information regarding the predetermined feature of the specified second graph data. (“for each sub-band, the processor may obtain a second graph 1112, a third graph 1113, and a fourth graph 1114, and may provide the inputs to input layer of a second GNN 1122 layer, a third GNN 1123 layer, and a fourth GNN 1124 layer, respectively … Once the iterations of the GNN layers 1121, 1122, 1123, 1124 as provided herein are finalized, a layer (e.g. an output layer) of the GNN layers 1121, 1122, 1123, 1124 may output 1130 the attention scores for each communication device. In various examples, the output of the last attention computation may represent the importance of nodes coupling the sub-band node. Accordingly, each attention score associated with the coupling of a node to the sub-band node, about which various examples are provided herein, may represent a likelihood for the communication device to be allocated for the designated communication resource. In various examples, the output 1130 of the last attention computation may represent the importance of edges coupled to the sub-band node.
”, Paragraphs 138 and 147) Regarding claim 8, Orhan teaches an information processing apparatus comprising a hardware processor configured to perform determination processing (“Any process described herein may be implemented as a non-transitory computer-readable medium including instructions configured, when executed, to cause one or more processors to carry out the process (e.g., to carry out the method).” Paragraph 26) (The medium and processor comprise the apparatus.) including: acquiring a first determination result of first graph data by performing predetermined determination processing on the first graph data; (“the processor may obtain a first graph 1111 including, or based on, a first sub-band channel information including channel information for a first sub-band and device information, and provide the first graph 1111 to an input layer of a first GNN layer 1121 … each GNN layer 1121, 1122, 1123, 1124 may be configured to obtain an attention score (at least) associated with each communication device at each layer based on input features of nodes associated with the respective communication device and input features of the sub-band node.” Paragraphs 138 and 145) (The first graph data is input to a first GNN layer, which generates a score that corresponds to a first determination result.)
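The attention-score computation quoted from Orhan (a GNN layer producing one score per device node from the node's features and the sub-band node's features) can be illustrated with a minimal sketch. All names, shapes, and the single-weight-vector scoring rule below are illustrative assumptions, not Orhan's actual model:

```python
import math
import random

def attention_scores(node_feats, subband_feat, w):
    """Toy attention scoring: one softmax-normalized score per device node.

    node_feats: list of feature vectors, one per device node (hypothetical)
    subband_feat: feature vector of the sub-band node (hypothetical)
    w: learnable weight vector applied to [node || sub-band] (illustrative)
    """
    logits = []
    for feats in node_feats:
        combined = feats + subband_feat            # concatenate the two vectors
        logits.append(sum(wi * xi for wi, xi in zip(w, combined)))
    m = max(logits)                                # numerically stable softmax
    exps = [math.exp(l - m) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

random.seed(0)
nodes = [[random.gauss(0, 1) for _ in range(4)] for _ in range(3)]
subband = [random.gauss(0, 1) for _ in range(4)]
w = [random.gauss(0, 1) for _ in range(8)]
scores = attention_scores(nodes, subband, w)
# each score lies in (0, 1) and the scores sum to 1
```

In this reading, `scores` plays the role of the "one or more first scores" and the highest-scoring node corresponds to the first determination result; Orhan's actual layers additionally iterate over sub-bands and MIMO layers.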
acquiring one or more first scores regarding a predetermined feature of the first graph data by using a trained machine learning model, the one or more first scores including one or more scores representing a basis of the first determination result of the first graph data, the trained machine learning model being a trained model configured to output, in response to obtaining graph data, one or more scores regarding the predetermined feature of the graph data; (“Therefore, each GNN layer 1121, 1122, 1123, 1124 may be configured to determine an attention score for (at least) each MIMO layer node associated with a MIMO layer of one of the communication devices based on attention AI/ML parameters (that may include learnable parameters) of the respective GNN layer and output of the respective GNN layer. The attention score may represent the importance of the nodes associated with each MIMO layer of one of the communication devices.” Paragraph 145) (The first GNN layer determines the first scores.) in a case where all the first scores of the acquired one or more first scores are less than a threshold, referring to a storage device, to specify among one or more pieces of graph data, second graph data that is a second determination result different from the acquired first determination result, (“for each sub-band, the processor may obtain a second graph 1112, a third graph 1113, and a fourth graph 1114 … In example 4, the subject matter of any one of examples 1 to 3 can optionally include that the processor is further configured to determine, for the communication resource, the one or more communication devices from the plurality of communication devices based on a threshold applied to the determined scores.
”, Paragraphs 138 and 182) (Claim 8 recites the following contingent limitations: “referring to a storage device, to specify among one or more pieces of graph data, second graph data that is a second determination result different from the acquired first determination result”. These limitations are contingent because they recite steps that are only required to be performed if a certain condition is met (i.e., in a case where all the first scores of the acquired one or more first scores are less than a threshold). The limitation “to specify among one or more pieces of graph data, second graph data that is a second determination result different from the acquired first determination result” only needs to be performed if the first scores do not meet or exceed the threshold. The conditions are mutually exclusive; therefore, the BRI of claim 8 requires that only one limitation is taught. Either: a storage device is referred to for specified second graph data in a case where all the first scores do not meet or exceed a threshold; or, in a case where one or more first scores meet or exceed a threshold, a storage device is not referred to for second graph data. The latter is taught in Orhan.) the storage device being a device that stores, for each piece of graph data of the one or more pieces of graph data, a determination result obtained by performing the predetermined determination processing on the piece of graph data (“The memory 602 may be configured to store channel information 605 of the plurality of communication devices.” Paragraph 72) (Channel information includes scores for the communication devices, evidenced further in Paragraph 181.) and outputting, in association with the acquired first determination result, information regarding the predetermined feature of the specified second graph data.
(“for each sub-band, the processor may obtain a second graph 1112, a third graph 1113, and a fourth graph 1114, and may provide the inputs to input layer of a second GNN 1122 layer, a third GNN 1123 layer, and a fourth GNN 1124 layer, respectively … Once the iterations of the GNN layers 1121, 1122, 1123, 1124 as provided herein are finalized, a layer (e.g. an output layer) of the GNN layers 1121, 1122, 1123, 1124 may output 1130 the attention scores for each communication device. In various examples, the output of the last attention computation may represent the importance of nodes coupling the sub-band node. Accordingly, each attention score associated with the coupling of a node to the sub-band node, about which various examples are provided herein, may represent a likelihood for the communication device to be allocated for the designated communication resource. In various examples, the output 1130 of the last attention computation may represent the importance of edges coupled to the sub-band node.”, Paragraphs 138 and 147) Regarding claim 2, Orhan teaches the non-transitory computer-readable recording medium according to claim 1, the processing further comprising: calculating one or more second scores regarding the predetermined feature of the specified second graph data by using the trained machine learning model, wherein, in the processing of outputting, the second determination result stored in the storage unit, the specified second graph data, and the calculated one or more second scores are output in association with the acquired first determination result. (“each GNN layer 1121, 1122, 1123, 1124 may be configured to obtain an attention score for each communication device a.sub.s,u,p.sup.l as provided in the attention matrix A.sub.s.sup.l with an aggregation function applied to all attention scores obtained for the respective communication device for all sub-bands and all MIMO layers.
Accordingly, each GNN layer 1121, 1122, 1123, 1124 may obtain attention scores a.sub.s,u,p in a manner that every attention score obtained for nodes associated with the same communication device at multiple MIMO layers and multiple sub-bands may have the same value … The output of the GNN 1120 may include a plurality of graphs for each sub-band.”, Paragraphs 158 and 163; See also Figures 11 and 12) (The GNN layer associated with the second graph data (1122 in Orhan) calculates an attention score, which is output as a second score and second determination result. Each sub-band generates not only the score but also a plurality of graphs, which corresponds to graph data.) Regarding claim 3, Orhan teaches the non-transitory computer-readable recording medium according to claim 2, wherein the outputting includes outputting, in association with the acquired first determination result, the second determination result stored in the storage device, the specified second graph data, the calculated one or more second scores, the first graph data, and the acquired one or more first scores. (“each GNN layer 1121, 1122, 1123, 1124 may be configured to obtain an attention score for each communication device a.sub.s,u,p.sup.l as provided in the attention matrix A.sub.s.sup.l with an aggregation function applied to all attention scores obtained for the respective communication device for all sub-bands and all MIMO layers. Accordingly, each GNN layer 1121, 1122, 1123, 1124 may obtain attention scores a.sub.s,u,p in a manner that every attention score obtained for nodes associated with the same communication device at multiple MIMO layers and multiple sub-bands may have the same value … The output of the GNN 1120 may include a plurality of graphs for each sub-band.
”, Paragraphs 158 and 163; See also Figures 11 and 12) Regarding claim 4, Orhan teaches the non-transitory computer-readable recording medium according to claim 1, wherein the outputting includes in a case where at least any one first score of the acquired one or more first scores is equal to or greater than the threshold, outputting, in association with the acquired first determination result, the first graph data and the acquired one or more first scores are output. (“Referring back to FIG. 11 together with FIG. 12, the processor may perform a thresholding operation 1140 to select one or more communication devices to be scheduled for the communication resource based on the attention scores determined by the GNN 1120. The thresholding operation may be based on a predefined threshold value (e.g. scheduling UEs having attention scores above the predefined threshold value) or may be based on any other thresholding methods (e.g. adaptive thresholding methods, averaging, etc.). In accordance with various aspects provided herein, the GNN 1120 may also be configured to perform a thresholding operation, and in such example, the GNN 1120 may output information representing, or indicating, the communication devices to be allocated or scheduled for the respective communication resource.”, Paragraph 166) (Claim 4 recites the following contingent limitations: “outputting, in association with the acquired first determination result, the first graph data and the acquired one or more first scores are output”. These limitations are contingent because they recite steps that are only required to be performed if a certain condition is met (i.e., in a case where at least any one first score of the acquired one or more first scores is equal to or greater than a threshold).
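The contingent-limitation reasoning that recurs across claims 1, 4, 7, and 8 reduces to a two-way branch on the threshold test. The sketch below is a hypothetical illustration of that claimed flow (the storage lookup, function name, and return shape are assumptions for illustration, not the application's implementation):

```python
def determine(first_scores, first_result, threshold, storage):
    """Illustrative sketch of the claimed conditional flow.

    If any first score meets or exceeds the threshold, the first graph's
    result and scores are output (the claim 4 branch).  Only when ALL
    scores fall below the threshold is the storage device consulted for
    second graph data whose stored determination result differs from the
    first (the claims 1/7/8 branch) -- the two branches are mutually
    exclusive, which is the basis of the examiner's BRI argument.
    """
    if any(s >= threshold for s in first_scores):
        return {"result": first_result, "scores": first_scores}
    # contingent branch: find stored graph data with a differing result
    for graph_data, stored_result in storage.items():
        if stored_result != first_result:
            return {"result": first_result, "second_graph": graph_data}
    return {"result": first_result}

storage = {"g1": "pass", "g2": "fail"}   # hypothetical stored determinations
out_hi = determine([0.2, 0.9], "pass", 0.5, storage)  # a score meets threshold
out_lo = determine([0.1, 0.2], "pass", 0.5, storage)  # all scores below it
```

Under BRI, prior art showing only the `out_hi` branch (threshold met, no storage lookup) satisfies the claim, which is the position taken against claims 7 and 8 above.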
The limitation “outputting, in association with the acquired first determination result, the first graph data and the acquired one or more first scores are output” only needs to be performed if the first scores meet or exceed a threshold. The conditions are mutually exclusive; therefore, the BRI of claim 4 requires that outputting the first graph data and the acquired one or more first scores is done when a threshold is met or exceeded, which Orhan teaches.) Regarding claim 5, Orhan teaches the non-transitory computer-readable recording medium according to claim 1, wherein the trained machine learning model calculates, in response to obtaining the graph data, a score that corresponds to each element of an adjacency matrix with respect to the graph data, the calculated score being a score regarding a predetermined feature of the graph data and representing a basis of a determination result of the graph data by the predetermined determination processing. (“In example 27, the subject matter of example 26, can optionally include that the processor is further configured to obtain the score adjacency matrix based on the input data and machine learning model parameters for attention calculation. In example 28, the subject matter of any one of examples 26 or 27, can optionally include that the determined score for each of the plurality of communication devices is based on the score parameter that is calculated by the last layer of the one or more GNN layers.”, Paragraph 189) Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Orhan in view of Liu. Regarding claim 6, Orhan teaches the non-transitory computer-readable recording medium according to claim 1, the processing further including: wherein the specifying includes in a case where all the first scores of the acquired one or more first scores are less than the threshold, referring to the storage device, to specify, among the one or more pieces of graph data, the second graph data that is a second determination result different from the acquired first determination result and that has the calculated index value less than a reference value. (“for each sub-band, the processor may obtain a second graph 1112, a third graph 1113, and a fourth graph 1114 … In example 4, the subject matter of any one of examples 1 to 3 can optionally include that the processor is further configured to determine, for the communication resource, the one or more communication devices from the plurality of communication devices based on a threshold applied to the determined scores.
”, Paragraphs 138 and 182) (Claim 6 recites the following contingent limitations: “referring to the storage device, to specify, among the one or more pieces of graph data, the second graph data that is a second determination result different from the acquired first determination result and that has the calculated index value less than a reference value.” These limitations are contingent because they recite steps that are only required to be performed if a certain condition is met (i.e., in a case where all the first scores of the acquired one or more first scores are less than the threshold). The limitation “to specify, among the one or more pieces of graph data, the second graph data that is a second determination result different from the acquired first determination result and that has the calculated index value less than a reference value” only needs to be performed if the first scores are less than a threshold. The conditions are mutually exclusive; therefore, the BRI of claim 6 requires that either the storage device is referred to, to specify the second graph data and calculated index value less than a reference value, or the process is not performed when the conditions are not met; Orhan teaches the latter, which can be easily configured to work with the index value of Liu.) However, Orhan does not explicitly teach calculating, for each piece of graph data of the one or more pieces of graph data, an index value that indicates magnitude of a difference between the graph data and the first graph data. Liu teaches calculating, for each piece of graph data of the one or more pieces of graph data, an index value that indicates magnitude of a difference between the graph data and the first graph data, (“We restrict the counterfactual to be a strict subgraph of Gi. Let the difference between Gi and ˜Gi be denoted by ∆(Gi, ˜Gi) …”, pg.
2, right column, bottom paragraph) (∆(Gi, ˜Gi) corresponds to an index value that indicates a difference between Gi (a graph data) and ˜Gi (a first graph data). Combined with the graph data of Orhan, the limitation is taught.) Therefore, it would have been considered obvious to one of ordinary skill in the art, prior to the filing date of the current application, to combine the graph data, GNNs, and scores of Orhan with the calculating of an index value as described in Liu. One would have been motivated to combine the two teachings, prior to the application’s filing date, as taking the difference of graph data helps to indicate counterfactual relevance, as disclosed in Liu. (“Let the difference between Gi and ˜Gi be denoted by ∆(Gi, ˜Gi), the size of which is represented by |∆(Gi, ˜Gi)|, so that Gi = ˜Gi + ∆(Gi, ˜Gi) means adding ∆(Gi, ˜Gi) to ˜Gi reconstructs Gi. The class distributions of vi generated by the target GNN model on Gi and ˜Gi are denoted by yi and ˜yi, respectively. We define the counterfactual relevance [28] of the tuple (Gi, ˜Gi) when explaining yi as [See Equation 5 below] … The smaller the |ν(Gi)|, the more faithful ∆(Gi, ˜Gi) is circled by the dashed line. The larger the |µ(Gi, ˜Gi)| (Eq. (5)), the more counterfactual relevance.”, pgs. 2-3; See also Equation 5 on pg. 2, or directly below, and Fig. 2 on pg. 3.) [Equation 5 of Liu] Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to Tyler E Iles, whose telephone number is (571) 272-5442. The examiner can normally be reached 9:00am - 5:00pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kakali Chaki, can be reached at (571) 272-3719. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information.

Prosecution Timeline

Aug 01, 2023
Application Filed
Mar 12, 2026
Non-Final Rejection — §101, §102, §103 (current)


Prosecution Projections

1-2
Expected OA Rounds
67%
Grant Probability
99%
With Interview (+50.0%)
3y 3m
Median Time to Grant
Low
PTA Risk
Based on 3 resolved cases by this examiner. Grant probability derived from career allow rate.
