Prosecution Insights
Last updated: April 19, 2026
Application No. 17/617,659

IDENTIFICATION APPARATUS, IDENTIFICATION METHOD AND RECORDING MEDIUM

Final Rejection: §101, §103, §112

Filed: Dec 09, 2021
Examiner: JONES, CHARLES JEFFREY
Art Unit: 2122
Tech Center: 2100 (Computer Architecture & Software)
Assignee: NEC Corporation
OA Round: 2 (Final)
Grant Probability: 27% (At Risk)
Projected OA Rounds: 3-4
Projected Time to Grant: 4y 2m
Grant Probability with Interview: 93%

Examiner Intelligence

Career Allow Rate: 27% (4 granted / 15 resolved; -28.3% vs Tech Center average)
Interview Lift: +65.9% across resolved cases with interview
Avg Prosecution: 4y 2m
Currently Pending: 27 applications
Career History: 42 total applications across all art units

Statute-Specific Performance

§101: 34.5% (-5.5% vs TC avg)
§103: 29.1% (-10.9% vs TC avg)
§102: 17.7% (-22.3% vs TC avg)
§112: 17.7% (-22.3% vs TC avg)

Tech Center averages are estimates. Based on career data from 15 resolved cases.

Office Action

Statutes: §101, §103, §112
DETAILED ACTION

This action is responsive to the amendment filed on 08/20/2025 for application 17/617,659. Claims 1-7 and 9-19 are pending in the case. Claims 1, 7, 9, 10, and 14-18 are amended, and claim 8 has been canceled.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claim Rejections - 35 USC § 112

The previous rejections under 35 U.S.C. 112(b) have been withdrawn, as the amendments have overcome the deficient language.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: "Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title."

Claims 1-7 and 9-19 are rejected under 35 U.S.C. 101 because they are directed to an abstract idea without significantly more.

Regarding claim 1:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites "identify a class of input data," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user choosing a type of information. See MPEP 2106.04(a)(2)(III)(C).
The claim recites "update the learning model, by using an objective function based on a calculated evaluation curve indicating a relevance between a first index value for evaluating accuracy of a result of identification of the class of the input data and a second index value for evaluating time required to identify the class of the input data," which is an abstract idea (Mathematical Relationships; see MPEP 2106.04(a)(2)(I)(A)), as the model is updated using a mathematical function. The claim recites "a plurality of likelihoods, each indicating a certainty that the series data belongs to a predetermined class, correspondingly to each of the plurality of sub data," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user deciding similarity of data. See MPEP 2106.04(a)(2)(III)(C).

Subject Matter Eligibility Analysis Step 2A Prong 2: The additional elements are:
(a) "at least one memory configured to store instructions; and at least one processor configured" (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer"; see MPEP 2106.05(f));
(b) "by using a learnable learning model" (merely recites a generic computer on which to perform the abstract idea; see MPEP 2106.05(f));
(c) "wherein the input data include series data containing a plurality of sub data that can be arranged systematically" (merely specifies a particular technological environment in which the abstract idea is to take place, i.e. a field of use; see MPEP 2106.05(h));
(d) "output, using the updated learning model…when the series data is inputted" (merely recites a generic computer on which to perform the abstract idea; see MPEP 2106.05(f));
(e) "each indicating a certainty that the series data belongs to a predetermined class" (merely specifies a field of use; see MPEP 2106.05(h)).

Subject Matter Eligibility Analysis Step 2B: Additional elements (a), (b), and (d) do not integrate the abstract idea into a practical application, nor do they provide significantly more than the abstract idea, because they amount to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)). Additional elements (c) and (e) do not integrate the abstract idea into a practical application, nor do they provide significantly more, because they merely specify a field of use in which the abstract idea is to take place (see MPEP 2106.05(h)). The additional elements (a)-(e) in claim 1, considered separately and in combination, do not amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.

Regarding claim 2:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites "wherein the objective function includes a function based on a curve that indicates the relevance on a coordinate plane including two coordinate axes respectively corresponding to the first and second index values," which is an abstract idea (Mathematical Relationships; see MPEP 2106.04(a)(2)(I)(A)).

Subject Matter Eligibility Analysis Step 2A Prong 2: The claim does not contain elements that would warrant a Step 2A Prong 2 analysis.

Subject Matter Eligibility Analysis Step 2B: The claim does not include any additional element, considered separately and in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.
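For orientation only, and not part of the prosecution record: the evaluation curve the claims describe pairs a time index value with an accuracy index value obtained by sweeping a decision threshold over per-sub-data likelihoods (a lower threshold decides earlier but less accurately). A minimal sketch of that construction, with invented likelihoods and labels:

```python
import numpy as np

# Illustrative only: per-step likelihoods for 3 series (rows) over 4 sub-data
# steps (columns), plus true class labels. All values are invented.
likelihoods = np.array([
    [0.2, 0.5, 0.7, 0.9],
    [0.1, 0.3, 0.4, 0.6],
    [0.6, 0.8, 0.9, 0.95],
])
labels = np.array([1, 0, 1])

def evaluation_curve(lik, y, thresholds):
    """For each threshold, classify a series as positive at the first step
    whose likelihood meets the threshold; return (time index, accuracy
    index) pairs, with time normalized so the final step equals 1."""
    n_steps = lik.shape[1]
    points = []
    for t in thresholds:
        hits = lik >= t
        # First step meeting the threshold; a series that never meets it
        # is classified negative at the last step.
        first = np.where(hits.any(axis=1), hits.argmax(axis=1), n_steps - 1)
        pred = hits.any(axis=1).astype(int)
        accuracy = float(np.mean(pred == y))
        time_idx = float(np.mean((first + 1) / n_steps))
        points.append((time_idx, accuracy))
    return points

for time_idx, acc in evaluation_curve(likelihoods, labels, [0.5, 0.7, 0.9]):
    print(round(time_idx, 3), round(acc, 3))
```

Each printed pair is one point on the curve: sweeping the threshold traces the trade-off between the second index value (time) and the first (accuracy).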
Regarding claim 3:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites "the objective function includes a function based on a square measure of an area under the curve," which is an abstract idea (Mathematical Relationships; see MPEP 2106.04(a)(2)(I)(A)).

Subject Matter Eligibility Analysis Step 2A Prong 2: The claim does not contain elements that would warrant a Step 2A Prong 2 analysis.

Subject Matter Eligibility Analysis Step 2B: The claim does not include any additional element, considered separately and in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.

Regarding claim 4:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites "wherein when each of the first and second index values is normalized so that a minimum value is 0 and a maximum value is 1, the area under the curve is an area that is surrounded by the curve," which is an abstract idea (Mathematical Calculations; see MPEP 2106.04(a)(2)(I)(C)). The claim recites "one coordinate axis corresponding to the time index value of the two coordinate axes," which is an abstract idea (Mathematical Relationships; see MPEP 2106.04(a)(2)(I)(A)). The claim recites "and a straight line represented by an equation that is the time index value = 1," which is an abstract idea (Mathematical Formulas or Equations; see MPEP 2106.04(a)(2)(I)(B)).

Subject Matter Eligibility Analysis Step 2A Prong 2: The claim does not contain elements that would warrant a Step 2A Prong 2 analysis.

Subject Matter Eligibility Analysis Step 2B: The claim does not include any additional element, considered separately and in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.
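The geometry claim 4 recites (both indices normalized to [0, 1], area bounded by the evaluation curve up to the line time index = 1) can be illustrated with a trapezoidal-rule sketch. The curve points below are invented for illustration, not taken from the application:

```python
import numpy as np

# Hypothetical evaluation curve: time index values (x, normalized to [0, 1])
# paired with accuracy index values (y, in [0, 1]).
time_idx = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
acc_idx = np.array([0.55, 0.70, 0.82, 0.90, 0.95])

# Area enclosed under the curve up to the line time_idx = 1, computed with
# the trapezoidal rule.
area = float(np.sum((acc_idx[1:] + acc_idx[:-1]) / 2 * np.diff(time_idx)))
print(round(area, 4))
```

A curve that reaches high accuracy at small time index values encloses a larger area, which is why the claims treat the area as a joint accuracy/speed score.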
Regarding claim 5:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites "wherein the objective function is defined by using an equation L = (1 - S)^2, wherein L is the objective function," which is an abstract idea (Mathematical Formulas or Equations; see MPEP 2106.04(a)(2)(I)(B)). The claim recites "and S is the square measure that is normalized so that the maximum value is 1," which is an abstract idea (Mathematical Calculations; see MPEP 2106.04(a)(2)(I)(C)).

Subject Matter Eligibility Analysis Step 2A Prong 2: The claim does not contain elements that would warrant a Step 2A Prong 2 analysis.

Subject Matter Eligibility Analysis Step 2B: The claim does not include any additional element, considered separately and in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.

Regarding claim 6:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites "update the learning model by using the objective function to maximize the square measure," which is an abstract idea (Mathematical Relationships; see MPEP 2106.04(a)(2)(I)(A)).

Subject Matter Eligibility Analysis Step 2A Prong 2: (a) "wherein the at least one processor is configured to execute the instructions" (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer"; see MPEP 2106.05(f)).

Subject Matter Eligibility Analysis Step 2B: Additional element (a) in claim 6, considered separately and in combination, does not amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.
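Claim 5's equation, L = (1 - S)^2 with S the square measure normalized so its maximum is 1, is simple to state in code; note that minimizing L is equivalent to maximizing S, which is the behavior claim 6 recites. A sketch, illustrative only:

```python
def objective(S: float) -> float:
    """Objective L = (1 - S)**2, where S is the area under the evaluation
    curve normalized so that its maximum value is 1. L is 0 when the curve
    is perfect (S = 1) and grows as the area shrinks."""
    return (1.0 - S) ** 2

print(objective(1.0))   # perfect curve: loss is 0.0
print(objective(0.75))  # smaller area: positive loss
```

A gradient-based trainer could use L directly as its loss, so that each parameter update pushes the model toward a larger area under the evaluation curve.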
Regarding claim 7:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites "the learning model outputs a likelihood indicating a certainty that the input data belongs to a predetermined class, when the input data is inputted," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user choosing a type of information. See MPEP 2106.04(a)(2)(III)(C). The claim recites "identify the class of the input data," which likewise covers performance of the limitation in the mind with or without a physical aid. See MPEP 2106.04(a)(2)(III)(C). The claim recites "based on a magnitude correlation between the likelihood and a predetermined threshold," which is an abstract idea (Mathematical Calculations; see MPEP 2106.04(a)(2)(I)(C)). The claim recites "(i) calculate the first and second index values based on the result of identification using a plurality of different predetermined thresholds," which is an abstract idea (Mathematical Calculations; see MPEP 2106.04(a)(2)(I)(C)). The claim recites "(ii) calculate the objective function based on the calculated first and second index values," which is an abstract idea (Mathematical Calculations; see MPEP 2106.04(a)(2)(I)(C)). The claim recites "(iii) update the learning model by using the calculated objective function," which is an abstract idea (Mathematical Relationships; see MPEP 2106.04(a)(2)(I)(A)), as the model is updated using a mathematical function.

Subject Matter Eligibility Analysis Step 2A Prong 2: (a) "wherein the at least one processor is configured to execute the instructions to:" (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer"; see MPEP 2106.05(f)).

Subject Matter Eligibility Analysis Step 2B: Additional element (a) in claim 7, considered separately and in combination, does not amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.

Regarding claim 9:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites "identify a class of input data," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user choosing a type of information. See MPEP 2106.04(a)(2)(III)(C). The claim recites "updating the learning model, by using an objective function based on a calculated evaluation curve indicating a relevance between a first index value for evaluating accuracy of a result of identification of the class of the input data and a second index value for evaluating time required to identify the class of the input data," which is an abstract idea (Mathematical Relationships; see MPEP 2106.04(a)(2)(I)(A)), as the model is updated using a mathematical function. The claim recites "a plurality of likelihoods, each indicating a certainty that the series data belongs to a predetermined class, correspondingly to each of the plurality of sub data," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user deciding similarity of data. See MPEP 2106.04(a)(2)(III)(C).

Subject Matter Eligibility Analysis Step 2A Prong 2: The additional elements are:
(a) "by using a learnable learning model" (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer"; see MPEP 2106.05(f));
(b) "wherein the input data include series data containing a plurality of sub data that can be arranged systematically" (merely specifies a particular technological environment in which the abstract idea is to take place, i.e. a field of use; see MPEP 2106.05(h));
(c) "output, using the updated learning model…when the series data is inputted" (merely recites a generic computer on which to perform the abstract idea; see MPEP 2106.05(f));
(d) "each indicating a certainty that the series data belongs to a predetermined class" (merely specifies a field of use; see MPEP 2106.05(h)).

Subject Matter Eligibility Analysis Step 2B: Additional elements (a) and (c) do not integrate the abstract idea into a practical application, nor do they provide significantly more than the abstract idea, because they amount to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)). Additional elements (b) and (d) do not integrate the abstract idea into a practical application, nor do they provide significantly more, because they merely specify a field of use in which the abstract idea is to take place (see MPEP 2106.05(h)). The additional elements (a)-(d) in claim 9, considered separately and in combination, do not amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.
Regarding claim 10:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites "identify a class of input data," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user choosing a type of information. See MPEP 2106.04(a)(2)(III)(C). The claim recites "and update the learning model, by using an objective function based on a calculated evaluation curve indicating a relevance between a first index value for evaluating accuracy of a result of identification of the class of the input data and a second index value for evaluating time required to identify the class of the input data," which is an abstract idea (Mathematical Relationships; see MPEP 2106.04(a)(2)(I)(A)), as the model is updated using a mathematical function. The claim recites "a plurality of likelihoods, each indicating a certainty that the series data belongs to a predetermined class, correspondingly to each of the plurality of sub data," which, under the broadest reasonable interpretation, covers performance of the limitation in the mind with or without a physical aid. The limitation encompasses a user deciding similarity of data. See MPEP 2106.04(a)(2)(III)(C).

Subject Matter Eligibility Analysis Step 2A Prong 2: The additional elements are:
(a) "a non-transitory recording medium on which a computer program that allows a computer to execute an identification method is recorded" (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer"; see MPEP 2106.05(f));
(b) "by using a learnable learning model" (merely recites a generic computer on which to perform the abstract idea; see MPEP 2106.05(f));
(c) "wherein the input data include series data containing a plurality of sub data that can be arranged systematically" (merely specifies a particular technological environment in which the abstract idea is to take place, i.e. a field of use; see MPEP 2106.05(h));
(d) "output, using the updated learning model…when the series data is inputted" (merely recites a generic computer on which to perform the abstract idea; see MPEP 2106.05(f));
(e) "each indicating a certainty that the series data belongs to a predetermined class" (merely specifies a field of use; see MPEP 2106.05(h)).

Subject Matter Eligibility Analysis Step 2B: Additional elements (a), (b), and (d) do not integrate the abstract idea into a practical application, nor do they provide significantly more than the abstract idea, because they amount to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)). Additional elements (c) and (e) do not integrate the abstract idea into a practical application, nor do they provide significantly more, because they merely specify a field of use in which the abstract idea is to take place (see MPEP 2106.05(h)). The additional elements (a)-(e) in claim 10, considered separately and in combination, do not amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.

Claim 11 is rejected under the same §101 analysis due to the substantial similarity of its limitations and additional elements to those of claim 5. Claims 12 and 13 are rejected under the same §101 analysis due to the substantial similarity of their limitations and additional elements to those of claim 6. Claims 14-18 are rejected under the same §101 analysis due to the substantial similarity of their limitations and additional elements to those of claim 7.

Regarding claim 19:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim does not contain elements that would warrant a Step 2A Prong 1 analysis.

Subject Matter Eligibility Analysis Step 2A Prong 2: (a) "wherein the at least one processor is further configured to execute the instructions to update the learning model by updating a parameter of the learning model to maximize a square measure of an area under the evaluation curve" (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer"; see MPEP 2106.05(f)).

Subject Matter Eligibility Analysis Step 2B: Additional element (a) does not integrate the abstract idea into a practical application, nor does it provide significantly more than the abstract idea, because it amounts to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)). Considered separately and in combination, additional element (a) in claim 19 does not amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: "A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made."

Claims 1-3, 6-7, 9-10, 14-15, and 18-19 are rejected under 35 U.S.C. 103 as being unpatentable over Preuveneers et al., "Resource Usage and Performance Trade-offs for Machine Learning Models in Smart Environments" (hereinafter "Preuveneers"), in view of Ren et al., "A Hierarchical Approach Using Machine Learning Methods in Solar Photovoltaic Energy Production Forecasting" (hereinafter "Ren").
Regarding claim 1:

Preuveneers discloses "an identification apparatus comprising at least one memory configured to store instructions; and at least one processor configured to execute the instructions to" (Preuveneers: "The base deployment environment for the hyperparameter tuning is a Dell PowerEdge R620 server with 64GB of memory and two Intel Xeon E5-2650 (8 cores) CPUs running at 2.00GHz and hyperthreading enabled").

Preuveneers discloses "identify a class of input data" (Preuveneers, Abstract: "We demonstrate the feasibility of our approach by means of an anomaly detection use case. Additionally, we evaluate the extent that transfer learning techniques can be applied to reduce the amount of training required by reusing previous models, parameters and trade-off points from similar settings," where detecting an anomaly is considered identifying a class of input data) "by using a learnable learning model" (Preuveneers, Abstract: "We propose a multi-objective optimization solution to find acceptable trade-offs between model accuracy and resource consumption to enable the deployment of machine learning models in resource constrained smart environments…We demonstrate the feasibility of our approach by means of an anomaly detection use case," where using a multi-objective optimization solution between models to identify anomalies is considered using a learning model).

Preuveneers discloses "and update the learning model, by using an objective function" (Preuveneers, Page 12, Paragraph 3: "The methodology we follow re-evaluates the various hyperparameter configurations in the new context based on a distance metric from the original Pareto front (see Figure 4), and demonstrates the amount of time saved compared to executing a full exploration phase from the start," where changing hyperparameters of the model based on the Pareto front is considered updating the learning model, and where the Pareto front is considered an objective function, as the Pareto front represents the optimal solution in a multi-objective optimization problem that achieves the highest objective (accuracy) without degrading another objective (resources, i.e. time); see also Preuveneers, Page 2, Paragraph 4: "This multi-objective optimization process results in a collection of models of which some are on the Pareto front (Models on the Pareto front present the trade-offs between the optimization objectives and are considered equally good)") "based on…a relevance between a first index value for evaluating accuracy of a result of identification of the class of the input data and a second index value for evaluating time required to identify the class of the input data" (Preuveneers, Page 4, Paragraph 4: "We present a hyperparameter tuning framework that also considers resource trade-offs when selecting the best model for deployment in smart environments, as depicted in Figure 1…to find acceptable trade-offs between model accuracy and resource consumption," where one considered resource being wall clock time vs. accuracy (see also Preuveneers, Figure 4) is considered updating the learning model based on accuracy and time to complete identification, and an objective function is understood to be a function representing the optimal solution desired for minimizing error).

Preuveneers discloses "wherein the input data include series data containing a plurality of sub data that can be arranged systematically" (Preuveneers, Page 7, Paragraph 4: "Here the objective is twofold: to identify anomalies in a time series dataset, and at the same time optimize resource consumption," where the time series dataset is considered systematically arranged input data).

Preuveneers discloses "output, using the updated learning model, a plurality of likelihoods" (Preuveneers, Page 10, Paragraph 3: "At time of inference, the reconstruction error is used as a measure of how faithful the decoder's reconstruction of the input is," where the reconstruction errors are used as a measure of the likelihood of anomalous data), "each indicating a certainty that the series data belongs to a predetermined class, correspondingly to each of the plurality of sub data, when the series data is inputted" (Preuveneers, Page 10, Paragraph 3: "A threshold in reconstruction error differentiates between the categories the input sample is classified, and this threshold can be set as the maximum reconstruction error observed during the training phase on normal samples.").

Preuveneers does not explicitly disclose, however Ren discloses, "an objective function based on a calculated evaluation curve" (Ren, Page 2, Col. 2, Paragraph 1 and Equation 1: "In this paper, we propose an AUC-based PU learning framework where the AUC metric is used to guide the learning process," where using an AUC metric to guide the learning process is considered having an objective function based on a calculated evaluation curve).

Preuveneers and Ren are analogous art because they are from the same field of endeavor: using curve-based evaluations with machine learning. Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Preuveneers and Ren before him or her, to modify the model of Preuveneers to include the evaluation-curve evaluation of Ren to enhance the classification. The suggestion/motivation for doing so would have been that "AUC can better evaluate the performance of the classifier than classification error because it is invariant to the percentage of positive samples" (Ren, Page 2, Col. 1, Paragraph 2).

Regarding claim 2:

The rejection of claim 1 under Preuveneers-Ren is incorporated, and further: Preuveneers further discloses "wherein the objective function includes a function based on a curve" (Preuveneers, Page 8, Paragraph 2: "As a result, we have to be vigilant with identifying and positioning the performance metrics and trade-offs in relation to each other before drawing conclusions and generalizations. The end goal is therefore to produce a cost and trade-off model that would be able to take into account the following parameters and indicators: …wall clock time…vs. accuracy," where the Pareto front producing a cost and trade-off AUC model that compares wall clock time vs. accuracy is considered a function based on the relevance of time and accuracy (the first and second index values), and producing a cost and trade-off model is used for minimizing error) "that indicates the relevance on a coordinate plane including two coordinate axes respectively corresponding to the first and second index values" (Preuveneers, Page 17, Figure 12, and Page 19, Paragraph 3: "With this approach, we take the hyperparameters of the AutoSklearn models on the Pareto front for the origin dataset (depicted in Figure 13), and retrain the model with the same configurations on the target dataset," where the Pareto front uses the AUC of multiple objectives to understand the trade-offs between the objectives).

Regarding claim 3:

The rejection of claim 2 under Preuveneers-Ren is incorporated, and further: Preuveneers further discloses "the objective function" (Preuveneers, Page 2, Paragraph 4: "This multi-objective optimization process results in a collection of models of which some are on the Pareto front (Models on the Pareto front present the trade-offs between the optimization objectives and are considered equally good)," where the Pareto front is considered an objective function, as it represents a solution in a multi-objective optimization problem that achieves the highest objective (accuracy) without degrading another objective) "includes a function based on a square measure of an area under the curve" (Preuveneers, Page 17, Paragraph 2: "Figure 12 depicts the Pareto-fronts for 64 different configurations of autoencoders. … we now use the AUC metric to compare the different autoencoders," where the square measure is understood to be the measure of an area under the curve, and using an area-under-the-curve metric to create the Pareto front and compare different configurations to find optimal hyperparameters is considered using a square measure of an area under a curve to minimize error).

Regarding claim 6:

The rejection of claim 3 under Preuveneers-Ren is incorporated, and further: Preuveneers further discloses "the at least one processor is configured to execute the instructions to update the learning model by using the objective function" (Preuveneers, Page 12, Paragraph 3: "The methodology we follow re-evaluates the various hyperparameter configurations in the new context based on a distance metric from the original Pareto front (see Figure 4), and demonstrates the amount of time saved compared to executing a full exploration phase from the start," where changing hyperparameters of the model based on the Pareto front is considered updating the learning model by using an objective function) "to maximize the square measure" (Preuveneers, Figure 4, where the Pareto front represents the optimal solution and aims to maximize the square measure by minimizing the error rate).
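The threshold sweep Preuveneers describes for computing the area under the ROC curve can be sketched as follows. The reconstruction errors and labels below are invented for illustration and are not taken from either reference:

```python
import numpy as np

# Hypothetical reconstruction errors (higher = more anomalous) and true
# labels (1 = anomaly). All values are invented for the sketch.
errors = np.array([0.1, 0.2, 0.35, 0.4, 0.8, 0.9])
labels = np.array([0, 0, 0, 1, 1, 1])

def roc_auc(scores, y):
    """AUC via threshold sweep: at each candidate threshold, flag samples
    whose score meets it, record true/false positive rates, then take the
    trapezoidal area under the resulting (FPR, TPR) curve."""
    thresholds = np.sort(np.unique(scores))[::-1]
    tpr, fpr = [0.0], [0.0]
    for t in thresholds:
        pred = scores >= t
        tpr.append(float(np.sum(pred & (y == 1)) / np.sum(y == 1)))
        fpr.append(float(np.sum(pred & (y == 0)) / np.sum(y == 0)))
    heights = (np.array(tpr[1:]) + np.array(tpr[:-1])) / 2
    return float(np.sum(np.diff(fpr) * heights))

print(roc_auc(errors, labels))
```

Because the resulting AUC does not depend on any single threshold choice, it is the kind of threshold-independent metric the quoted passage uses to compare autoencoder configurations.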
Regarding claim 7: The rejection of Preuveneers-Ren as applied to the identification apparatus of claim 1 is incorporated, and further: Preuveneers further discloses the at least one processor is configured to execute the instructions to: (Preuveneers, “The base deployment environment for the hyperparameter tuning is a Dell PowerEdge R620 server with 64GB of memory and two Intel Xeon E5-2650 (8 cores) CPUs running at 2.00GHz and hyperthreading enabled”) Preuveneers further discloses wherein the learning model outputs a likelihood indicating a certainty that the input data belongs to a predetermined class, when the input data is inputted … identify the class of the input data (Preuveneers, Page 10, Paragraph 3, “At time of inference, the reconstruction error is used as a measure of how faithful the decoder’s reconstruction of the input is. Normal input samples have a low reconstruction error, whereas anomalous samples result in a high reconstruction error” where the reconstruction error is considered a likelihood, and determining the anomalous samples via high reconstruction error is considered identifying input data belonging to a predetermined class) based on a magnitude correlation between the likelihood and a predetermined threshold (Preuveneers, Page 10, Paragraph 3, “A threshold in reconstruction error differentiates between the categories the input sample is classified, and this threshold can be set as the maximum reconstruction error observed during the training phase on normal samples” where thresholds being able to be set is considered a predetermined threshold, and using the thresholds with the reconstruction error is considered identifying based on a magnitude correlation) Preuveneers further discloses (i) calculate the first and second index values based on the result of identification (Preuveneers, Page 17, Figure 12, where the top left graph of Figure 12 shows wall clock time in milliseconds vs. AUC, which is considered calculating time (as milliseconds is a
unit of time) and accuracy (as the AUC goal is to predict correct classes) using first and second index values based on the result of identification (as the AUC metric focuses on identifying according to positive and negative) for the Pareto front) using a plurality of different predetermined thresholds (Preuveneers, Page 17, Paragraph 2, “We vary the threshold to compute the area under the ROC curve, and use this threshold-independent metric to compare the different autoencoders”) Preuveneers further discloses (ii) calculate the objective function based on the calculated first and second index values (Preuveneers, Page 4, Paragraph 4, “We present a hyperparameter tuning framework that also considers resource trade-offs when selecting the best model for deployment in smart environments, as depicted in Figure 1…to find acceptable trade-offs between model accuracy and resource consumption” where possible resources being wall clock time vs. accuracy (see also: Preuveneers, Figure 4) is considered calculating the Pareto front based on accuracy and time (first and second index values)), Preuveneers further discloses (iii) update the learning model by using the calculated objective function (Preuveneers, Page 12, Paragraph 3, “The methodology we follow re-evaluates the various hyperparameter configurations in the new context based on a distance metric from the original Pareto front (see Figure 4), and demonstrates the amount of time saved compared to executing a full exploration phase from the start” where changing hyperparameters of the model based on the Pareto front is considered updating the learning model by using an objective function)

Regarding claim 9: Preuveneers discloses identifying a class of input data (Preuveneers, ABSTRACT, “We demonstrate the feasibility of our approach by means of an anomaly detection use case. Additionally, we evaluate the extent that transfer learning techniques can be applied to reduce the amount of training required by reusing previous models,
parameters and trade-off points from similar settings” where detecting an anomaly is considered identifying a class of input data) by using a learnable learning model (Preuveneers, ABSTRACT, “We propose a multi-objective optimization solution to find acceptable trade-offs between model accuracy and resource consumption to enable the deployment of machine learning models in resource constrained smart environments…We demonstrate the feasibility of our approach by means of an anomaly detection use case.” where using a multi-objective optimization solution between models to identify anomalies is considered using a learning model) Preuveneers discloses updating the learning model, by using an objective function (Preuveneers, Page 12, Paragraph 3, “The methodology we follow re-evaluates the various hyperparameter configurations in the new context based on a distance metric from the original Pareto front (see Figure 4), and demonstrates the amount of time saved compared to executing a full exploration phase from the start” where changing hyperparameters of the model based on the Pareto front is considered updating the learning model, and where the Pareto front is considered an objective function as the Pareto front represents the optimal solution in a multi-objective optimization problem that achieves the highest objective (accuracy) without degrading another objective (resources, i.e.
time), See: Preuveneers, Page 2, Paragraph 4, “This multi-objective optimization process results in a collection of models of which some are on the Pareto front (Models on the Pareto front present the trade-offs between the optimization objectives and are considered equally good)”) based on … a relevance between a first index value for evaluating accuracy of a result of identification of the class of the input data and a second index value for evaluating time required to identify the class of the input data (Preuveneers, Page 4, Paragraph 4, “We present a hyperparameter tuning framework that also considers resource trade-offs when selecting the best model for deployment in smart environments, as depicted in Figure 1…to find acceptable trade-offs between model accuracy and resource consumption” where possible resources being wall clock time vs. accuracy (see also: Preuveneers, Figure 4) is considered updating the learning model based on accuracy and time to complete identification, and an objective function is understood to be a function whose optimal solution is desired to minimize error) Preuveneers discloses wherein the input data include series data containing a plurality of sub data that can be arranged systematically (Preuveneers, Page 7, Paragraph 4, “Here the objective is twofold: to identify anomalies in a time series dataset, and at the same time optimize resource consumption” where the time series dataset is considered systematically arranged input data), Preuveneers discloses outputting, using the updated learning model, a plurality of likelihoods (Preuveneers, Page 10, Paragraph 3, “At time of inference, the reconstruction error is used as a measure of how faithful the decoder’s reconstruction of the input is” where the reconstruction errors are used as a measure of the likelihood of anomalous data), each indicating a certainty that the series data belongs to a predetermined class, correspondingly to each of the plurality of sub data, when the series data is
inputted (Preuveneers, Page 10, Paragraph 3, “A threshold in reconstruction error differentiates between the categories the input sample is classified, and this threshold can be set as the maximum reconstruction error observed during the training phase on normal samples.”) Preuveneers does not explicitly disclose, however Ren discloses an objective function based on a calculated evaluation curve (Ren, Page 2, Col. 2, Paragraph 1 and Equation 1, “In this paper, we propose an AUC-based PU learning framework where the AUC metric is used to guide the learning process” where using an AUC metric to guide the learning process is considered having an objective function based on a calculated evaluation curve). References Preuveneers and Ren are analogous art because they are from the same field of endeavor of using curve-based evaluations with machine learning. Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Preuveneers and Ren before him or her, to modify the model of Preuveneers to include the evaluation curve approach of Ren to enhance the classification. The suggestion/motivation for doing so would have been “AUC can better evaluate the performance of the classifier than classification error because it is invariant to the percentage of positive samples” (Ren, Page 2, Col.
1, Paragraph 2)

Regarding claim 10: Preuveneers discloses a non-transitory recording medium on which a computer program that allows a computer to execute an identification method is recorded (Preuveneers, “The base deployment environment for the hyperparameter tuning is a Dell PowerEdge R620 server with 64GB of memory and two Intel Xeon E5-2650 (8 cores) CPUs running at 2.00GHz and hyperthreading enabled” where 64GB of memory is considered a non-transitory recording medium to execute an identification method) Preuveneers discloses identifying a class of input data (Preuveneers, ABSTRACT, “We demonstrate the feasibility of our approach by means of an anomaly detection use case. Additionally, we evaluate the extent that transfer learning techniques can be applied to reduce the amount of training required by reusing previous models, parameters and trade-off points from similar settings” where detecting an anomaly is considered identifying a class of input data) by using a learnable learning model (Preuveneers, ABSTRACT, “We propose a multi-objective optimization solution to find acceptable trade-offs between model accuracy and resource consumption to enable the deployment of machine learning models in resource constrained smart environments…We demonstrate the feasibility of our approach by means of an anomaly detection use case.” where using a multi-objective optimization solution between models to identify anomalies is considered using a learning model) Preuveneers discloses updating the learning model, by using an objective function (Preuveneers, Page 12, Paragraph 3, “The methodology we follow re-evaluates the various hyperparameter configurations in the new context based on a distance metric from the original Pareto front (see Figure 4), and demonstrates the amount of time saved compared to executing a full exploration phase from the start” where changing hyperparameters of the model based on the Pareto front is considered updating the learning model, and where the Pareto front is
considered an objective function as the Pareto front represents the optimal solution in a multi-objective optimization problem that achieves the highest objective (accuracy) without degrading another objective (resources, i.e. time), See: Preuveneers, Page 2, Paragraph 4, “This multi-objective optimization process results in a collection of models of which some are on the Pareto front (Models on the Pareto front present the trade-offs between the optimization objectives and are considered equally good)”) based on … a relevance between a first index value for evaluating accuracy of a result of identification of the class of the input data and a second index value for evaluating time required to identify the class of the input data (Preuveneers, Page 4, Paragraph 4, “We present a hyperparameter tuning framework that also considers resource trade-offs when selecting the best model for deployment in smart environments, as depicted in Figure 1…to find acceptable trade-offs between model accuracy and resource consumption” where possible resources being wall clock time vs. accuracy (see also: Preuveneers, Figure 4) is considered updating the learning model based on accuracy and time to complete identification, and an objective function is understood to be a function whose optimal solution is desired to minimize error) Preuveneers discloses wherein the input data include series data containing a plurality of sub data that can be arranged systematically (Preuveneers, Page 7, Paragraph 4, “Here the objective is twofold: to identify anomalies in a time series dataset, and at the same time optimize resource consumption” where the time series dataset is considered systematically arranged input data), Preuveneers discloses outputting, using the updated learning model, a plurality of likelihoods (Preuveneers, Page 10, Paragraph 3, “At time of inference, the reconstruction error is used as a measure of how faithful the decoder’s reconstruction of the input is” where the reconstruction errors are
used as a measure of the likelihood of anomalous data), each indicating a certainty that the series data belongs to a predetermined class, correspondingly to each of the plurality of sub data, when the series data is inputted (Preuveneers, Page 10, Paragraph 3, “A threshold in reconstruction error differentiates between the categories the input sample is classified, and this threshold can be set as the maximum reconstruction error observed during the training phase on normal samples.”) Preuveneers does not explicitly disclose, however Ren discloses an objective function based on a calculated evaluation curve (Ren, Page 2, Col. 2, Paragraph 1 and Equation 1, “In this paper, we propose an AUC-based PU learning framework where the AUC metric is used to guide the learning process” where using an AUC metric to guide the learning process is considered having an objective function based on a calculated evaluation curve). References Preuveneers and Ren are analogous art because they are from the same field of endeavor of using curve-based evaluations with machine learning. Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Preuveneers and Ren before him or her, to modify the model of Preuveneers to include the evaluation curve approach of Ren to enhance the classification. The suggestion/motivation for doing so would have been “AUC can better evaluate the performance of the classifier than classification error because it is invariant to the percentage of positive samples” (Ren, Page 2, Col. 1, Paragraph 2)

Regarding claim 14: The rejection of claim 2 is incorporated in claim 14, and further, claim 14 is rejected under the same rationale as set forth in the rejection of claim 7.

Regarding claim 15: The rejection of claim 3 is incorporated in claim 15, and further, claim 15 is rejected under the same rationale as set forth in the rejection of claim 7.
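Two mechanisms in the cited Preuveneers passages carry most of the weight here: a classification threshold set to the maximum reconstruction error observed on normal training samples, and a threshold-independent AUC computed by sweeping that threshold. A minimal sketch of both, with hypothetical error values and function names (nothing below is taken from the reference):

```python
def classify(errors, threshold):
    """Label a sample anomalous (1) when its reconstruction error
    exceeds the threshold, normal (0) otherwise."""
    return [1 if e > threshold else 0 for e in errors]

def roc_auc(errors, labels):
    """Vary the threshold over every observed error value and
    integrate the resulting ROC curve with the trapezoidal rule."""
    pos = sum(labels)
    neg = len(labels) - pos
    points = []
    for thr in sorted(set(errors)) + [float("inf")]:
        tp = sum(1 for e, y in zip(errors, labels) if e >= thr and y == 1)
        fp = sum(1 for e, y in zip(errors, labels) if e >= thr and y == 0)
        points.append((fp / neg, tp / pos))
    points.sort()
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

# Threshold = max reconstruction error observed on normal training samples
train_normal_errors = [0.02, 0.05, 0.04, 0.03]
threshold = max(train_normal_errors)           # 0.05

test_errors = [0.01, 0.06, 0.30, 0.04, 0.25]   # hypothetical test set
test_labels = [0, 0, 1, 0, 1]                  # 1 = anomalous
print(classify(test_errors, threshold))        # [0, 1, 1, 0, 1]
print(roc_auc(test_errors, test_labels))
```

With these numbers the fixed threshold misclassifies the 0.06 sample, yet the threshold sweep yields a perfect AUC because every anomalous error (0.30, 0.25) exceeds every normal one, which is why Preuveneers calls AUC a "threshold-independent metric" for comparing autoencoders.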
Regarding claim 18: The rejection of claim 6 is incorporated in claim 18, and further, claim 18 is rejected under the same rationale as set forth in the rejection of claim 7.

Regarding claim 19: The rejection of Preuveneers-Ren as applied to the identification apparatus of claim 1 is incorporated, and further: Ren discloses wherein the at least one processor is further configured to execute the instructions to update the learning model by updating a parameter of the learning model to maximize a square measure of an area under the evaluation curve (Ren, Page 4, Col. 2, Paragraph 4, “In particular, we iteratively sample x from X to calculate the unbiased stochastic gradient… and apply the projected gradient step to update the next iteration” where the stochastic gradients derived from the BAUC-based objective are used to update the parameters w and ϵ, and optimizing the BAUC-OF objective corresponds to updating a parameter to maximize the measure of an area under the evaluation curve (See also Page 2, Col. 1, Paragraph 2, “The proposed BAUC-OF model is more robust than existing classification error minimization frameworks…particularly in these two aspects: 1) there is no need to set a prior value for the percentage of positive samples π in the training process, which has been a difficulty in many applications; and 2) both outlier detection and feature selection are integrated with the AUC maximization formulation” and Ren, Page 2, Col. 1, Paragraph 1, “…we propose to use a blind AUC (BAUC) criterion to approximate the target AUC, and we will show that in theory maximizing BAUC is equivalent to maximizing AUC”))

Claims 4, 11-12 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Preuveneers et al. (Resource Usage and Performance Trade-offs for Machine Learning Models in Smart Environments), henceforth known as Preuveneers, and further in view of Li et al.
(A Hierarchical Approach Using Machine Learning Methods in Solar Photovoltaic Energy Production Forecasting), henceforth known as Li.

Regarding claim 4: The rejection of Preuveneers as applied to the learning apparatus of claim 3 is incorporated, and further: Preuveneers discloses the area under the curve is an area that is surrounded by the curve (Preuveneers, Figure 4, where the area under the curve (the Pareto front) being measured is considered an area that is surrounded by the curve), one coordinate axis corresponding to the time index value of the two coordinate axes (Preuveneers, Figure 4, which shows time and error, where evaluation time representing the x-axis is considered having a time index value on one coordinate axis), and a straight line represented by an equation that is the time index value = 1 (Preuveneers, Figure 4, where the 1.0 line forms a straight line, and a straight line representing time index value = 1 is understood to be the end of a range of normalized values (0 to 1).) Preuveneers does not disclose, however Li does disclose wherein when each of the first and second index values is normalized so that a minimum value is 0 and a maximum value is 1 (Li, Page 6, Paragraph 2 and Equation 13, “Before applying the training algorithm, both the input and output data are normalized to the range from 0 to 1”). References Preuveneers and Li are analogous art because they are from the same field of endeavor of using machine learning for forecasting and prediction. Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Preuveneers and Li before him or her, to modify the input/output of Preuveneers to include the normalization of Li for additional optimization. The suggestion/motivation for doing so would have been that parameters are optimized using all input data normalized between 0 and 1 (Li, Page 3, Paragraph 3).

Regarding claim 11: The rejection
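The claim 4 mapping combines Li's min-max normalization with an area bounded by the curve, the time axis, and the line time index = 1. A minimal sketch of that normalization (the values and function name are hypothetical, not from Li):

```python
def min_max_normalize(values):
    """Rescale values so the minimum maps to 0 and the maximum to 1."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Hypothetical evaluation-time index values (e.g. milliseconds)
times = [50, 125, 200]
print(min_max_normalize(times))  # [0.0, 0.5, 1.0]
```

Once both index values are rescaled this way, the vertical line time = 1 bounds the coordinate plane on the right, which is how the examiner reads the 1.0 line in Preuveneers' Figure 4 as the edge of the normalized range.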

Prosecution Timeline

Dec 09, 2021
Application Filed
May 16, 2025
Non-Final Rejection — §101, §103, §112
Aug 19, 2025
Applicant Interview (Telephonic)
Aug 20, 2025
Response Filed
Aug 23, 2025
Examiner Interview Summary
Nov 26, 2025
Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12582959
DATA GENERATION DEVICE AND METHOD, AND LEARNING DEVICE AND METHOD
2y 5m to grant · Granted Mar 24, 2026
Patent 12380333
METHOD OF CONSTRUCTING NETWORK MODEL FOR DEEP LEARNING, DEVICE, AND STORAGE MEDIUM
2y 5m to grant · Granted Aug 05, 2025
Study what changed to get past this examiner. Based on 2 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
27%
Grant Probability
93%
With Interview (+65.9%)
4y 2m
Median Time to Grant
Moderate
PTA Risk
Based on 15 resolved cases by this examiner. Grant probability derived from career allow rate.
