Prosecution Insights
Last updated: April 19, 2026
Application No. 18/481,383

EVIDENCE-BASED OUT-OF-DISTRIBUTION DETECTION ON MULTI-LABEL GRAPHS

Final Rejection: §101, §103
Filed: Oct 05, 2023
Examiner: SASS, KIMBERLY A.
Art Unit: 3686
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: NEC Laboratories America Inc.
OA Round: 2 (Final)

Grant Probability: 52% (Moderate)
Projected OA Rounds: 3-4
Projected Time to Grant: 3y 8m
Grant Probability With Interview: 99%
Examiner Intelligence

Career Allow Rate: 52% of resolved cases (102 granted / 195 resolved; at TC average)
Interview Lift: +53.8% (strong) for resolved cases with an interview vs. without
Typical Timeline: 3y 8m average prosecution; 35 applications currently pending
Career History: 230 total applications across all art units

Statute-Specific Performance

§101: 38.8% (-1.2% vs TC avg)
§103: 33.5% (-6.5% vs TC avg)
§102: 5.7% (-34.3% vs TC avg)
§112: 17.8% (-22.2% vs TC avg)

Compared against Tech Center average estimates; based on career data from 195 resolved cases.

Office Action

Rejections: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

This action is in response to the reply filed 11/04/2025. Claim 18 was canceled 11/04/2025. Claim 21 was added 11/04/2025. Claims 1, 7, 11, 16, and 19 were amended 11/04/2025. Claims 1-17 and 19-21 are currently pending and have been examined.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-17 and 19-21 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.

Claims 1-17 and 19-21 are drawn to a method, system, and computer program product, which are statutory categories of invention (Step 1: YES).

Independent claims 1, 11, and 19 recite out-of-distribution detection of nodes in a graph, comprising: collecting evidence to quantify predictive uncertainty of diverse labels of nodes in a graph of nodes and edges using positive evidence from labels of training nodes; generating multi-label opinions including belief and disbelief for the diverse labels; combining the opinions into a joint belief by employing a comultiplication operation of binomial opinions; classifying the joint belief to detect out-of-distribution nodes of the graph; and performing a corrective action responsive to a detection of an out-of-distribution node.
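For orientation, the belief/disbelief opinions and the comultiplication operation recited in the independent claims come from subjective logic. The sketch below is an illustrative reconstruction of how such a pipeline can work, not code from the application; the prior weight W, the default base rate of 0.5, the evidence counts, and the threshold are all assumed values.

```python
from functools import reduce

W = 2.0  # non-informative prior weight; a standard subjective-logic choice

def opinion(pos, neg):
    """Binomial opinion (belief, disbelief, uncertainty) from evidence counts."""
    s = pos + neg + W
    return (pos / s, neg / s, W / s)

def comultiply(o1, o2):
    """Comultiplication (disjunction) of two binomial opinions, simplified
    for a default base rate of 0.5 on both operands."""
    b1, d1, u1 = o1
    b2, d2, u2 = o2
    b = b1 + b2 - b1 * b2                          # union of beliefs
    d = d1 * d2 + (d1 * u2 + u1 * d2) / 3.0        # residual disbelief
    u = u1 * u2 + 2.0 * (d1 * u2 + u1 * d2) / 3.0  # residual uncertainty
    return (b, d, u)

def detect_ood(evidence_per_label, threshold=0.5):
    """Fold per-label opinions into one joint belief; flag the node as
    out-of-distribution when the joint belief stays below the threshold."""
    opinions = [opinion(p, n) for p, n in evidence_per_label]
    joint = reduce(comultiply, opinions)
    return joint[0] < threshold, joint

# A node whose labels all lack positive evidence comes out OOD:
is_ood, joint = detect_ood([(1, 4), (0, 5), (2, 3)])
```

Here the node is flagged because no label accumulates enough positive evidence, at which point a corrective action (e.g., routing the node for review) would fire on the flag.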
The recited limitations, as drafted and under their broadest reasonable interpretation, cover certain methods of organizing human activity between a user and a patient, as reflected in the specification, which states that “The prediction includes disease classifications and out-of-distribution detections (e.g., detection of new diseases). All of this information can be provided to medical professionals 512 over a network or medical computer system 511…The medical professionals 512 can make medical decisions 514 based on this information. The medical professionals 512 can also use this information to update patient data and make the system models more accurate and efficient.” (see: specification paragraph 88).

If a claim limitation, under its broadest reasonable interpretation, covers managing personal behavior or relationships or interactions between people, then it falls within the “Certain Methods of Organizing Human Activity” grouping of abstract ideas. The present claims cover certain methods of organizing human activity because they address “By effectively distinguishing OOD nodes, users with potential interests, for example, can be identified for better recommendations or unknown functions of proteins can be discovered for pharmaceutical research. In a particularly useful embodiment, medical information can be employed in a graphical setting where each node can include a patient or user or characteristics of a patient or user. Multiple labels for each patient may need to be evaluated to ensure all of the patient's medical conditions are properly classified.” (see: specification paragraph 19). Accordingly, the claims recite an abstract idea (Step 2A Prong One: YES).

Further, the recited limitations, as drafted and under the broadest reasonable interpretation, cover mathematical relationships by comultiplication of binomial data.
If a claim limitation, under its broadest reasonable interpretation, covers mathematical relationships or mathematical calculations, then it falls within the “Mathematical Concepts” grouping of abstract ideas. Accordingly, the claims recite an abstract idea (Step 2A Prong One: YES).

The judicial exception is not integrated into a practical application. The claims are abstract but for the inclusion of the additional elements, including “computer-implemented”, “nodes of a multi-label evidential graph neural network having a stack of graph convolutional layers, plurality of fully connected layers, and a plurality of rectified linear unit layers;”, “system”, “hardware processor”, “memory”, “computer program product”, and “non-transitory computer readable storage medium”, which are recited at a high level of generality (e.g., the calculating and classifying is performed using generic computer components whose instructions are executed to perform the claimed limitations), such that they amount to no more than mere instructions to apply the exception using generic computer components. See: MPEP 2106.05(f). Hence, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Accordingly, the claims are directed to an abstract idea (Step 2A Prong Two: NO).

The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, using the additional elements to perform the abstract idea amounts to no more than mere instructions to apply the exception using generic components. Mere instructions to apply an exception using a generic component cannot provide an inventive concept. See MPEP 2106.05(f).
Further, the claimed additional elements, identified above, are not sufficient to amount to significantly more than the judicial exception because they are generic components that are configured to perform well-understood, routine, and conventional activities previously known to the industry. See MPEP 2106.05(d). Said additional elements are recited at a high level of generality and provide conventional functions that do not add meaningful limits to practicing the abstract idea.

The originally filed specification supports this conclusion at Figures 5-6 and Paragraph 26, where “Embodiments may include a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system. A computer-usable or computer readable medium may include any apparatus that stores, communicates, propagates, or transports the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be magnetic, optical, electronic, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. The medium may include a computer-readable storage medium such as a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk, etc.”

Paragraph 27, where “Each computer program may be tangibly stored in a machine-readable storage media or device (e.g., program memory or magnetic disk) readable by a general or special purpose programmable computer, for configuring and controlling operation of a computer when the storage media or device is read by the computer to perform the procedures described herein. The inventive system may also be considered to be embodied in a computer-readable storage medium, configured with a computer program, where the storage medium so configured causes a computer to operate in a specific and predefined manner to perform the functions described herein.”

Paragraph 28, where “A data processing system suitable for storing and/or executing program code may include at least one processor coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code to reduce the number of times code is retrieved from bulk storage during execution. Input/output or I/O devices (including but not limited to keyboards, displays, pointing devices, etc.) may be coupled to the system either directly or through intervening I/O controllers.”

Paragraph 48, where “Graph neural networks (GNNs) 208 provide a feasible way to extend deep learning methods into the non-Euclidean domain including graphs and manifolds. The most representative models are, according to the types of aggregators, e.g., Graph Convolutional Network (GCN), Graph Attention Networks (GAT), and GraphSage.”

Paragraph 49, where “It is possible to apply GNNs 208 to various types of training frameworks, including (semi) supervised or unsupervised learning, depending on the learning tasks, e.g., supervised learning for node-level classification. Assuming a network with partial nodes labeled and others unlabeled, GNNs 208 can learn a model that effectively identifies the labels for the unlabeled nodes. In this case, an end-to-end framework can be built by stacking graph convolutional layers 210 followed by fully connected FC layers 214.”

Paragraph 55, where “Multi-Label Evidential Graph Neural Networks (ML-EGNNs) are built by stacking graph convolutional layers in GNN 208 and two fully connected layers (FCs) 212 with ReLU layers, which are taken as the positive and negative evidence vectors (214 and 216, respectively) for Beta distribution. Predictions of the neural network are treated as subjective opinions and learn the function that collects evidence by a deterministic neural network from data.”

Viewing the limitations as an ordered combination, the claims simply instruct the additional elements to implement the concept described above in the identification of the abstract idea, with routine, conventional activity specified at a high level of generality in a particular technological environment. Hence, the claims as a whole, considering the additional elements individually and as an ordered combination, do not amount to significantly more than the abstract idea (Step 2B: NO).

Dependent claims 2-10, 12-17 and 20-21, when analyzed as a whole, considering the additional elements individually and/or as an ordered combination, are held to be patent ineligible under 35 U.S.C. 101 because the additional recited limitations fail to establish that the claims amount to significantly more than the abstract idea. Claims 2, 4-9, 12, 14-17 and 20 recite generating data nodes and values based on calculations to train generic machine learning models implemented on the generically recited computing device, as shown in the parent claims above.
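Specification paragraph 55, quoted above, describes the ML-EGNN shape: stacked graph convolutional layers feeding two fully connected heads whose ReLU outputs are read as positive and negative evidence vectors for a Beta distribution. A minimal numpy sketch of that forward pass follows; the graph, layer sizes, and random placeholder weights are hypothetical, not the application's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 4 nodes, 3 input features, 2 labels (all sizes hypothetical).
X = rng.normal(size=(4, 3))                      # node features
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)        # adjacency matrix

# Symmetrically normalized adjacency with self-loops, as in a GCN layer.
A_hat = A + np.eye(4)
d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt

def relu(z):
    return np.maximum(z, 0.0)

# One graph convolutional layer, then two fully connected heads; each
# head's ReLU output is read as an evidence vector (one entry per label).
W_gcn = rng.normal(size=(3, 8))
W_pos = rng.normal(size=(8, 2))
W_neg = rng.normal(size=(8, 2))

H = relu(A_norm @ X @ W_gcn)   # shared graph-convolution representation
e_pos = relu(H @ W_pos)        # positive evidence, per node and label
e_neg = relu(H @ W_neg)        # negative evidence, per node and label

# Beta parameters per (node, label); the Beta mean is the per-label
# predicted probability, and its spread encodes predictive uncertainty.
alpha = e_pos + 1.0
beta = e_neg + 1.0
label_prob = alpha / (alpha + beta)
```

The ReLU heads guarantee non-negative evidence, so alpha and beta are always at least 1 and the per-label probabilities stay strictly inside (0, 1), with low total evidence showing up as a diffuse Beta rather than an overconfident point estimate.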
Claims 3 and 13 further recite “minimizing beta loss in accordance with evidential deep learning”, which is recited at a high level of generality, as shown in paragraphs 48-49 (e.g., the calculating and classifying is performed using generic machine learning on generic computer components whose instructions are executed to perform the claimed limitations), such that it amounts to no more than mere instructions to apply the exception using generic computer components. See: MPEP 2106.05(f).

Claim 10 further recites “wherein the multi-label evidential graph neural network applies evidential deep learning”, which is recited at a high level of generality, as shown in paragraphs 48-49 (e.g., the calculating and classifying is performed using generic machine learning on generic computer components whose instructions are executed to perform the claimed limitations), such that it amounts to no more than mere instructions to apply the exception using generic computer components. See: MPEP 2106.05(f).

These claims fail to remedy the deficiencies of their parent claims above, and are therefore rejected for at least the same rationale as applied to their parent claims above, which is incorporated herein.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3, 6-13, 16-17, and 20-21 are rejected under 35 U.S.C.
103 as being unpatentable over Tran (US 11,646,119 B2) in view of Li (US 2017/0293836 A1).

CLAIM 1 - Tran teaches the limitations of:

out-of-distribution detection of nodes in a graph, comprising: (Tran teaches determining the detection of negation pairs (i.e., data obtained from nodes) from a graph of the deep learning model (col 44 lines 17-32, col 27 lines 5-18, Figures 6A-6D), wherein specification paragraph 4 recites that out-of-distribution detection of nodes in a graph includes collecting evidence to quantify predictive uncertainty, and Tran uses the detection of negative pairs of nodes to determine predictive uncertainty);

collecting evidence to quantify predictive uncertainty of diverse labels of nodes in a graph of nodes and edges using positive evidence from labels of training nodes of a … graph neural network having a stack of graph convolutional layers, plurality of fully connected layers, and a plurality of rectified linear unit layers; (Tran teaches that data is collected to determine that the findings of the neural network are quantified at numerical values of 0 or 1 to label the nodes as corresponding to a positive prediction, uses the graph of the nodes and edges, and uses fully connected ReLU (rectified linear unit) layers in its calculations, consistent with how specification paragraph 55 reads under the broadest reasonable interpretation (col 64 lines 34-44, col 53 lines 45-67, Figures 6A-6D, col 4 lines 1-21, col 39 (layer (type))));

generating multi-label opinions including belief and disbelief for the diverse labels; (Tran teaches labeling the data as positive or false positive correlations (i.e., belief and disbelief) (col 61 lines 8-22, col …));

combining the opinions into a joint belief by employing a comultiplication operation of binomial opinions; (Tran teaches producing the multi-label classifications as an output using sigmoid activation, which is then multiplied by a linear activation (i.e., binomial opinions, as defined in specification paragraph 53, is a linear threshold of classifications) to create a more accurate model through combination of deriving zero for all negative values (col 45 lines 24-35, col 50 lines 34-35));

classifying the joint belief to detect out-of-distribution nodes of the graph; (Tran teaches that uncertainty of the model AUROC is combined with the uncertainty of the radiologist AUROC to detect the nodes of the graph that are forming negative pairs of data (i.e., nodes) (col 52 lines 6-20, col 27 lines 19-33, col 10 lines 54-60));

and performing a corrective action responsive to a detection of an out-of-distribution node (Tran teaches that the system detects that a finding has been missed (i.e., an out-of-distribution node) and that corrective information is needed to improve the deep learning model (col 16 lines 29-35, col 4 lines 25-45, col 7 lines 22-33)).

Tran does teach using a neural network with graph and multi-labeling capabilities, but does not explicitly refer to the neural network as a multi-label evidential graph neural network. However, Li teaches: multi-label evidential graph neural network (Li teaches a semi-supervised recurrent neural network that uses partially labelled sequenced data (i.e., multi-label evidential data) to add a graph regularizer to the aggregate layer (para [0038-0039])). It would have been obvious to one of ordinary skill in the art at the time the invention was filed to modify the system of Tran to integrate the application of providing a neural network that uses labelled sequenced data and a graph regularizer of Li, with the motivation of improving labeling of customer data and user usability of receiving more appropriate data items on customer profiles (see: Li, paragraphs 3-4).

CLAIM 2 - Tran in view of Li teach the limitations of claim 1 as recited above.
Regarding claim 2, Tran further recites: wherein collecting evidence to quantify predictive uncertainty includes predicting positive and negative evidence vectors from the … graph neural network (Tran teaches that data is collected to determine that the findings of the neural network are quantified at numerical values of 0 or 1 to label the nodes as corresponding to a positive prediction and negative prediction in the form of feature vectors, and uses the graph of the nodes and edges to determine data (col 64 lines 34-44, col 53 lines 45-67, Figures 6A-6D, col 4 lines 1-21, col 7 lines 3-17, col 27 lines 19-33, col 14 lines 52-64)).

Tran does teach using a neural network with graph and multi-labeling capabilities, but does not explicitly refer to the neural network as a multi-label evidential graph neural network. However, Li teaches: multi-label evidential graph neural network (Li teaches a semi-supervised recurrent neural network that uses partially labelled sequenced data (i.e., multi-label evidential data) to add a graph regularizer to the aggregate layer (para [0038-0039])). It would have been obvious to one of ordinary skill in the art at the time the invention was filed to modify the system of Tran to integrate the application of providing a neural network that uses labelled sequenced data and a graph regularizer of Li, with the motivation of improving labeling of customer data and user usability of receiving more appropriate data items on customer profiles (see: Li, paragraphs 3-4).

CLAIM 3 - Tran in view of Li teach the limitations of claim 2 as recited above.
Regarding claim 3, Tran further recites: wherein predicting the positive and negative evidence vectors includes generating a beta distribution using the positive and negative evidence vectors (Tran teaches that data is collected to determine that the findings of the neural network are quantified at numerical values of 0 or 1 to label the nodes as corresponding to a positive prediction and negative prediction in the form of feature vectors, and uses the graph of the nodes and edges to determine data sets beta parameters (col 64 lines 34-44, col 53 lines 45-67, Figures 6A-6D, col 4 lines 1-21, col 7 lines 3-17, col 27 lines 19-33, col 14 lines 52-64, col 46 lines 6-26, col 39 lines 56-66)), wherein the beta distribution is used to train the … graph neural network by minimizing beta loss in accordance with evidential deep learning (Tran teaches that the beta parameters use loss weighting to minimize loss when training the deep neural network (col 46 lines 6-26, col 26 lines 41-53)).

Tran does teach using a neural network with graph and multi-labeling capabilities, but does not explicitly refer to the neural network as a multi-label evidential graph neural network. However, Li teaches: multi-label evidential graph neural network (Li teaches a semi-supervised recurrent neural network that uses partially labelled sequenced data (i.e., multi-label evidential data) to add a graph regularizer to the aggregate layer (para [0038-0039])). It would have been obvious to one of ordinary skill in the art at the time the invention was filed to modify the system of Tran to integrate the application of providing a neural network that uses labelled sequenced data and a graph regularizer of Li, with the motivation of improving labeling of customer data and user usability of receiving more appropriate data items on customer profiles (see: Li, paragraphs 3-4).

CLAIM 6 - Tran in view of Li teach the limitations of claim 1.
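For context on the "minimizing beta loss in accordance with evidential deep learning" limitation: one standard evidential-deep-learning objective is the expected squared error of the label under the Beta posterior, i.e., squared error of the Beta mean plus the Beta variance. Whether the application uses exactly this form is not stated in the passages quoted here; the sketch and its numbers are illustrative assumptions.

```python
import numpy as np

def beta_sse_loss(alpha, beta, y):
    """Expected squared error of binary labels y under Beta(alpha, beta)
    posteriors: (y - mean)^2 + variance, summed over nodes and labels.
    A common evidential deep learning loss; evidence that matches the
    labels drives both terms down."""
    s = alpha + beta
    mean = alpha / s
    var = alpha * beta / (s ** 2 * (s + 1.0))
    return float(np.sum((y - mean) ** 2 + var))

# Evidence that agrees with the labels scores lower than evidence that
# contradicts them (hypothetical evidence counts for one node, two labels):
alpha = np.array([[9.0, 1.0]])
beta = np.array([[1.0, 9.0]])
loss_match = beta_sse_loss(alpha, beta, np.array([[1.0, 0.0]]))
loss_clash = beta_sse_loss(alpha, beta, np.array([[0.0, 1.0]]))
```

In training, a term of this kind would be summed with any auxiliary components (such as the positive-evidence component recited in claim 8) and minimized by gradient descent over the network weights.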
Regarding claim 6, Tran teaches: wherein classifying the joint belief to detect out-of-distribution nodes of the graph includes determining whether the joint belief exceeds a threshold value for a given node to determine if the node is out-of-distribution (Tran teaches that classifying the combined uncertainty of the model AUROC and the radiologist AUROC to detect the nodes of the graph that are forming negative pairs of data (i.e., nodes) includes determining that a threshold value is exceeded to indicate that the data is negative (col 52 lines 6-20, col 27 lines 19-33, col 10 lines 54-60, col 12 lines 48-67, Figure 8F)).

CLAIM 7 - Tran in view of Li teach the limitations of claim 1.

Regarding claim 7, Tran teaches: wherein the nodes include patient information, the corrective action includes: identifying a disease prediction for a patient without labels (Tran teaches outputting a confidence of the prediction (i.e., a prediction without labels) as a visual finding to detect negative pairs (i.e., corrective actions) (col 27 lines 5-18)); alerting medical personnel of the out-of-distribution node (Tran teaches outputting the information of the probability of a visual finding in the medical image of a patient (i.e., patient information) and indicating on the display the detection of the findings or the absence of findings (col 27 lines 6-35, Figure 8F)); and making a medical decision based on the out-of-distribution node (Tran teaches using the presence or absence of findings based on the node data to determine diagnostic accuracy (i.e., a medical decision) (col 27 lines 6-35, Figure 8F, col 69 lines 35-55)).

CLAIM 8 - Tran in view of Li teach the limitations of claim 1.
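The claim 6/7 pattern above (a threshold test on the joint belief followed by corrective responses such as alerting personnel or labeling the node) reduces to a small dispatch. The function name and the action strings below are hypothetical placeholders, not the specification's API.

```python
def handle_node(node_id, joint_belief, threshold=0.5):
    """Threshold the joint belief (the claim 6 test); on an
    out-of-distribution result, emit the corrective responses recited
    for claims 7 and 9 (illustrative stand-ins for alerting/labeling)."""
    if joint_belief >= threshold:
        return [f"node {node_id}: in-distribution (belief {joint_belief:.2f})"]
    return [
        f"node {node_id}: out-of-distribution (belief {joint_belief:.2f})",
        f"node {node_id}: alert personnel for review",
        f"node {node_id}: apply provisional label pending review",
    ]

actions = handle_node("patient-7", 0.21)
```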
Regarding claim 8, Tran teaches: further comprising optimizing through training the … graph neural network by minimizing total loss which includes a beta loss component (Tran teaches that the beta parameters use loss weighting to minimize loss when training the deep neural network (col 46 lines 6-26, col 26 lines 41-53)) and a positive evidence loss component (Tran teaches that data is collected to determine that the findings of the neural network are quantified at numerical values of 0 or 1 to label the nodes as corresponding to a positive prediction and negative prediction in the form of feature vectors, and uses the graph of the nodes and edges to determine data sets beta parameters (col 64 lines 34-44, col 53 lines 45-67, Figures 6A-6D, col 4 lines 1-21, col 7 lines 3-17, col 27 lines 19-33, col 14 lines 52-64, col 46 lines 6-26, col 39 lines 56-66)).

Tran does teach using a neural network with graph and multi-labeling capabilities, but does not explicitly refer to the neural network as a multi-label evidential graph neural network. However, Li teaches: multi-label evidential graph neural network (Li teaches a semi-supervised recurrent neural network that uses partially labelled sequenced data (i.e., multi-label evidential data) to add a graph regularizer to the aggregate layer (para [0038-0039])). It would have been obvious to one of ordinary skill in the art at the time the invention was filed to modify the system of Tran to integrate the application of providing a neural network that uses labelled sequenced data and a graph regularizer of Li, with the motivation of improving labeling of customer data and user usability of receiving more appropriate data items on customer profiles (see: Li, paragraphs 3-4).

CLAIM 9 - Tran in view of Li teach the limitations of claim 1.

Regarding claim 9, Tran teaches: wherein the corrective action includes: applying a label to the out-of-distribution node (Tran teaches outputting the information of the probability of a visual finding in the medical image of a patient (i.e., patient information) and indicating on the display the detection of the findings or the absence of findings (i.e., nodes) through labeling of the patient image (col 27 lines 6-35, Figure 8F)).

CLAIM 10 - Tran in view of Li teach the limitations of claim 1.

Regarding claim 10, Li teaches: wherein the multi-label evidential graph neural network applies evidential deep learning (Li teaches a semi-supervised recurrent neural network that uses partially labelled sequenced data (i.e., multi-label evidential data) to add a graph regularizer to the aggregate layer and uses deep learning (para [0038-0039, 0073])). It would have been obvious to one of ordinary skill in the art at the time the invention was filed to modify the system of Tran to integrate the application of providing a neural network that uses labelled sequenced data and a graph regularizer of Li, with the motivation of improving labeling of customer data and user usability of receiving more appropriate data items on customer profiles (see: Li, paragraphs 3-4).

CLAIMS 11-13, 16-17 - Claims 11-13 and 16-17 are significantly similar to claims 1-3 and 7-9, respectively. The claims are rejected upon the respective prior art above.

CLAIMS 19-20 - Claims 19-20 are significantly similar to claims 1 and 7, respectively, and are rejected upon the same prior art as claims 1 and 7.

CLAIM 21 - Tran in view of Li teach the limitations of claim 1 as recited above.
Regarding claim 21, Tran further recites: wherein the fully connected layers of the … graph neural network include two fully connected layers that each receive an output of the stack of graph convolutional layers and that provide outputs to respective rectified linear unit layers of the rectified linear unit layers (Tran teaches that data is collected to determine that the findings of the neural network are quantified at numerical values of 0 or 1 to label the nodes as corresponding to a positive prediction, uses the graph of the nodes and edges, and uses fully connected ReLU (rectified linear unit) layers in its calculations, consistent with how specification paragraph 55 reads under the broadest reasonable interpretation (col 64 lines 34-44, col 53 lines 45-67, Figures 6A-6D, col 4 lines 1-21, col 39 (layer (type)))).

Tran does teach using a neural network with graph and multi-labeling capabilities, but does not explicitly refer to the neural network as a multi-label evidential graph neural network. However, Li teaches: multi-label evidential graph neural network (Li teaches a semi-supervised recurrent neural network that uses partially labelled sequenced data (i.e., multi-label evidential data) to add a graph regularizer to the aggregate layer (para [0038-0039])). It would have been obvious to one of ordinary skill in the art at the time the invention was filed to modify the system of Tran to integrate the application of providing a neural network that uses labelled sequenced data and a graph regularizer of Li, with the motivation of improving labeling of customer data and user usability of receiving more appropriate data items on customer profiles (see: Li, paragraphs 3-4).

Claims 4 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Tran (US 11,646,119 B2) in view of Li (US 2017/0293836 A1), and further in view of “Introduction to Machine Learning, Neural Networks, and Deep Learning” (2020, Choi et al.; herein referred to as Choi).
CLAIM 4 - Tran in view of Li teach the limitations of claim 1 as recited above.

Regarding claim 4, Tran in view of Li do not explicitly teach, however Choi teaches: wherein generating multi-label opinions includes computing for sample i, class k: [claimed equation reproduced as an image in the original action] (Choi teaches a positive predictive value and a negative predictive value that is calculated through a characteristic curve point of 1 or 0 (i.e., 1-1) based on vectors of graphed matrix data (page 5, page 9)). It would have been obvious to one of ordinary skill in the art at the time the invention was filed to modify the medical neural network system of Tran in view of Li to integrate the application of integrating deep learning methodology for higher-level pattern recognition of Choi, with the motivation of improving image classification (see: Choi, page 2).

CLAIM 14 - Claim 14 is significantly similar to claim 4 and is rejected upon the same prior art as claim 4.

Claims 5 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Tran (US 11,646,119 B2) in view of Li (US 2017/0293836 A1), and further in view of Sallee (US 11,037,027 B2).

CLAIM 5 - Tran in view of Li teach the limitations of claim 1.
Regarding claim 5, Tran in view of Li do not explicitly teach, however Sallee teaches: wherein combining the opinions into a [claimed equation reproduced as an image in the original action] (Sallee teaches a probabilistic union function that adds the probability of two variables and subtracts the combination of the two variables together, which is what is taught by specification paragraph 39, in which the “v” is denoted as a union symbol based on the calculations (col 14 lines 60-67)). It would have been obvious to one of ordinary skill in the art at the time the invention was filed to modify the medical neural network system of Tran in view of Li to integrate the application of integrating probabilistic functions to the neural network of Sallee, with the motivation of creating improved imaging recognition in more efficient ways (see: Sallee, col 1).

CLAIM 15 - Claim 15 is significantly similar to claims 5 and 6 and is rejected upon the same prior art as claims 5 and 6.

Response to Arguments

The arguments filed 11/04/2025 have been fully considered.

The arguments pertaining to the 101 rejection are not persuasive. Applicant argues that the activation layer of rectified linear units (ReLUs) is distinct from the conventional use of a softmax output and provides evidential deep learning, which is an improvement in out-of-distribution node detection. Applicant further argues that the currently claimed invention provides an improved technology for detecting out-of-distribution data and provides an improvement to a machine learning model itself. Examiner respectfully disagrees. The use of ReLUs in a machine learning model is a well-known technique in machine learning models to identify negative data pairs and does not provide significantly more to the abstract idea.
The currently claimed invention of “a multi-label evidential graph neural network having a stack of graph convolutional layers, a plurality of fully connected layers, and a plurality of rectified linear unit layers” merely describes a machine learning evidential neural network under the broadest reasonable interpretation. The claims are silent on an activation layer and multi-label networks as described in the arguments. The claimed invention is a generic evidential neural network using a plurality of layers to identify data and does not provide an improvement to technology that would overcome the abstract idea, as shown in the specification above.

The arguments related to the claimed invention being an improvement to a machine learning model are not persuasive, as the specification recites in paragraph 55 that an evidential neural network is used, and the claimed invention fails to disclose elements that would provide an improvement over the generic use of an evidential neural network. The functions argued are representative of the abstract idea. The claims here are not directed to a specific improvement to computer functionality that amounts to a practical application. Rather, they are directed to the use of conventional or generic technology in a well-known environment, without any claim that the invention reflects an inventive solution to a technical problem presented by combining the two. In the present case, the claims fail to recite any elements that, individually or as an ordered combination, transform the identified abstract idea(s) in the rejection into a patent-eligible application of that idea. Further, not every claim that recites concrete, tangible components escapes the reach of the abstract-idea inquiry. (See, e.g., Alice, 134). It is well settled that mere recitation of concrete, tangible components that are generic is insufficient to confer patent eligibility to an otherwise abstract idea.
To amount to an inventive concept, the components must involve more than performance of "well-understood, routine, conventional activities" previously known to the industry (Alice, 134 S. Ct. at 2359, quoting Mayo, 132 S. Ct. at 1294). The originally filed specification was reviewed and found to support this conclusion.

The arguments pertaining to the 103 rejection are not persuasive. Applicant argues that Examiner agreed the amendment would overcome the 103 rejection. As noted in the interview of 11/03/2025, "specifically reciting how the multi-label graph neural network processes graph data to perform corrective responses on the nodes could overcome the prior art"; however, the claim amendments are completely silent on how the multi-label neural network processes graph data to perform corrective responses. The amendment "having a stack of graph convolutional layers, a plurality of fully connected layers, and a plurality of rectified linear unit layers" describes the structure of the neural network and does not describe how the graph data performs corrective responses. The amendments do not overcome the prior art of Tran, as Tran teaches in column 39 the list of layers used to perform its calculations in its evidential neural network and uses ReLU layers, which reads on how the connected layers and rectified linear unit layers are structured in paragraph 55 of the specification. The dependent claims rely on the arguments of the independent claims and are rejected for the reasons stated above.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to KIMBERLY A SASS whose telephone number is (571)272-4774. The examiner can normally be reached 7AM-5PM (EST).

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, JASON DUNHAM, can be reached at 571-272-8109. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/K.A.S./
Examiner, Art Unit 3686

/JASON B DUNHAM/
Supervisory Patent Examiner, Art Unit 3686

Prosecution Timeline

Oct 05, 2023
Application Filed
Aug 01, 2025
Non-Final Rejection — §101, §103
Oct 22, 2025
Interview Requested
Oct 29, 2025
Examiner Interview Summary
Oct 29, 2025
Applicant Interview (Telephonic)
Nov 04, 2025
Response Filed
Feb 07, 2026
Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602732
SYSTEM AND METHODS FOR SECURING A DRUG THERAPY
2y 5m to grant Granted Apr 14, 2026
Patent 12580059
IV COMPOUNDING SYSTEMS AND METHODS
2y 5m to grant Granted Mar 17, 2026
Patent 12531163
Medical Intelligence System and Method
2y 5m to grant Granted Jan 20, 2026
Patent 12505920
SMART DIAGNOSIS SYSTEM AND METHOD
2y 5m to grant Granted Dec 23, 2025
Patent 12481736
COMMUNICATION MODE SELECTION BASED UPON USER CONTEXT FOR PRESCRIPTION PROCESSES
2y 5m to grant Granted Nov 25, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
52%
Grant Probability
99%
With Interview (+53.8%)
3y 8m
Median Time to Grant
Moderate
PTA Risk
Based on 195 resolved cases by this examiner. Grant probability derived from career allow rate.
