Prosecution Insights
Last updated: April 19, 2026
Application No. 18/186,101

Method and Apparatus for Continuous Learning of Object Anomaly Detection and State Classification Model

Non-Final OA: §101, §103, §112
Filed: Mar 17, 2023
Examiner: JONES, CHARLES JEFFREY
Art Unit: 2122
Tech Center: 2100 — Computer Architecture & Software
Assignee: SK Planet Co. Ltd.
OA Round: 1 (Non-Final)

Grant Probability: 27% (At Risk)
Expected OA Rounds: 1-2
Time to Grant: 4y 2m
Grant Probability With Interview: 93%

Examiner Intelligence

Career Allow Rate: 27% (4 granted / 15 resolved; -28.3% vs TC avg)
Interview Lift: +65.9% across resolved cases with interview
Typical Timeline: 4y 2m average prosecution; 27 applications currently pending
Career History: 42 total applications across all art units

Statute-Specific Performance

§101: 34.5% (-5.5% vs TC avg)
§103: 29.1% (-10.9% vs TC avg)
§102: 17.7% (-22.3% vs TC avg)
§112: 17.7% (-22.3% vs TC avg)

Tech Center averages are estimates • Based on career data from 15 resolved cases
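As a sanity check, the headline figures above are mutually consistent: the 27% allow rate is 4/15 rounded, and the 93% with-interview figure appears to be the 27% baseline plus the 65.9-point interview lift (i.e., the lift reads as percentage points, not a relative multiplier). A quick verification in Python, using only the numbers shown on this page:

```python
# Cross-check the dashboard figures above (all inputs taken from the page).
granted, resolved = 4, 15

allow_rate_pct = round(granted / resolved * 100)   # 26.7% -> displayed as 27%
print(allow_rate_pct)

# Reading "+65.9% interview lift" as percentage points:
# 27% baseline + 65.9 points = 92.9%, matching the 93% "With Interview" card.
with_interview_pct = allow_rate_pct + 65.9
print(with_interview_pct)
```

(A relative lift would instead give 27% × 1.659 ≈ 45%, which does not match the 93% card, so the percentage-point reading is the consistent one.)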

Office Action

§101 §103 §112
DETAILED ACTION

This action is responsive to the Application filed on 03/17/2023. Claims 1-17 have been examined and are pending.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55, and foreign priority of 03/26/2021 is acknowledged.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(d):

(d) REFERENCE IN DEPENDENT FORMS.—Subject to subsection (e), a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.

The following is a quotation of pre-AIA 35 U.S.C. 112, fourth paragraph:

Subject to the following paragraph [i.e., the fifth paragraph of pre-AIA 35 U.S.C. 112], a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.

Claim 17 is rejected under 35 U.S.C. 112(d) or pre-AIA 35 U.S.C. 112, 4th paragraph, as being of improper dependent form for failing to further limit the subject matter of the claim upon which it depends, or for failing to include all the limitations of the claim upon which it depends. The contingent limitation in claim 16 includes, under the broadest reasonable interpretation (BRI), the embodiment in which a discriminator is not used; claim 17 therefore does not further limit claim 16, because claim 17 only provides more information about the discriminator, which is not required under that BRI embodiment. 
Applicant may cancel the claim(s), amend the claim(s) to place the claim(s) in proper dependent form, rewrite the claim(s) in independent form, or present a sufficient showing that the dependent claim(s) complies with the statutory requirements. Claim Interpretation The following is a quotation of 35 U.S.C. 112(f): (f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof. The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph: An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof. The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 
112, sixth paragraph: (A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function; (B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and (C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function. Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function. Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. 
Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: specifically data processing unit and detection unit that are configured to and modified with functional language without sufficient structure in claims 9-17. This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: specifically learning unit that is configured to and modified with functional language without sufficient structure in claims 12-17. Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 
112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-17 are rejected under 35 U.S.C. 101 as the claims are directed towards an abstract idea without significantly more.

Regarding Claim 1:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites generating…an input value, which is a feature vector matrix including a plurality of feature vectors, from the medium information, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user making a vector from a set. See MPEP 2106.04(a)(2)(III)(C). The claim recites deriving…a restored value imitating the input value, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user choosing a value that imitates the input value. See MPEP 2106.04(a)(2)(III)(C). The claim recites determining…whether a restoration error indicating a difference between the input value and the restored value is greater than or equal to a previously calculated reference value, which is an abstract idea (Mathematical Relationships, see MPEP 2106.04(a)(2)(I)(A)). The claim recites storing…the input value as normal data, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user choosing a category for data based on a comparison of values. 
See MPEP 2106.04(a)(2)(III)(C). The claim recites determining that the restoration error is less than the reference value, which is an abstract idea (Mathematical Relationships, see MPEP 2106.04(a)(2)(I)(A)).

Subject Matter Eligibility Analysis Step 2A Prong 2:
(a) acquiring…information, which amounts to mere extra-solution activity of obtaining and/or gathering data over a network (see MPEP 2106.05(g));
(b) about a medium of anomaly detection from an inspection target, which merely specifies a particular technological environment in which the abstract idea is to take place, i.e., a field of use (see MPEP 2106.05(h));
(c) a detection and classification apparatus, which merely recites a generic computer on which to perform the abstract idea, e.g., "apply it on a computer" (see MPEP 2106.05(f));
(d) through a detection network learned to generate the restored value for the input value, which merely recites a generic computer on which to perform the abstract idea, e.g., "apply it on a computer" (see MPEP 2106.05(f)).

Subject Matter Eligibility Analysis Step 2B: Additional element (a), obtaining a network input and sending data to a network, is well-understood, routine, and conventional activity of "transmitting or receiving data over a network" (see MPEP 2106.05(d)(II)(i), using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362). Additional element (b) does not integrate the abstract idea into a practical application, nor does it provide significantly more than the abstract idea, because the limitation merely specifies a field of use in which the abstract idea is to take place (see MPEP 2106.05(h)). Additional elements (c) and (d) do not integrate the abstract idea into a practical application, nor do they provide significantly more than the abstract idea, because the limitations amount to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP §2106.05(f). 
The additional element(s) (a) (b) (c) and (d) in the claim do/does not include any additional elements , when considered separately and in combination, that amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception for the reasons set forth in step 2A prong 2 analysis above. The claim is not patent eligible. Regarding Claim 2: The rejection of claim 1 is incorporated and further claim recites further additional elements/limitations: Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites calculating…a classification value indicating a probability which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))). Subject Matter Eligibility Analysis Step 2A Prong 2: …that the input value belongs to a category of an anomaly state through a classification network learned to calculate the probability for the input value(merely specifies a particular technological environment in which the abstract idea is to take place, i.e. a field of use (see MPEP 2106.05(h))) Subject Matter Eligibility Analysis Step 2B: Additional elements (a) do not integrate the abstract idea into a practical application nor do the additional limitation provide significantly more than the abstract idea because the limitation merely specifies a field of use in which the abstract idea is to take place, i.e. a field of use (see MPEP 2106.05(h)). The additional element(s) (a) in the claim do/does not include any additional elements , when considered separately and in combination, that amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception for the reasons set forth in step 2A prong 2 analysis above. The claim is not patent eligible. 
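Although the rejection treats them as mental steps and mathematical concepts, the limitations recited in claims 1 and 2 describe an autoencoder-style anomaly-detection flow: reconstruct the input, measure the restoration error, and compare it to a previously calculated reference value. A minimal sketch of that flow in plain Python, where the function names and the identity stand-in "network" are purely hypothetical illustrations, not the applicant's implementation:

```python
def restoration_error(x, restored):
    # Mean squared error between the input value and the restored value
    # (the "restoration error" compared against the reference value).
    diffs = [(a - b) ** 2 for a, b in zip(x, restored)]
    return sum(diffs) / len(diffs)

def detect(x, detection_net, reference_value):
    """Claim 1 flow (sketch): derive a restored value imitating the input,
    then compare the restoration error to the reference value."""
    restored = detection_net(x)
    err = restoration_error(x, restored)
    if err < reference_value:
        return "store as normal data", err   # below threshold: normal
    return "anomaly candidate", err          # at/above: classify per claim 2

# Hypothetical stand-in for a learned detection network: identity mapping.
identity_net = lambda x: x
print(detect([0.2, 0.4, 0.6], identity_net, reference_value=0.1))
```

A perfect reconstruction yields zero error, so the toy input is stored as normal data; a network whose output diverges from the input would route the sample to the classification step instead.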
Regarding Claim 3: The rejection of claim 2 is incorporated and further claim recites further additional elements/limitations: Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites …determining…whether the classification value is greater than or equal to a predetermined threshold upon determining that the restoration error is greater than or equal to the reference value which is an abstract idea (Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A)))). The claim recites storing…the input value as category data upon determining that the classification value is greater than or equal to the predetermined threshold which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user choosing a category for data based on a comparison of values. See 2106.04.(a)(2).III.C. Subject Matter Eligibility Analysis Step 2A Prong 2: The claim does not contain elements that would warrant a Step 2A Prong 2 analysis. Subject Matter Eligibility Analysis Step 2B: The claim does not include any additional element, when considered separately and in combination, that amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible. Regarding Claim 4: The rejection of claim 3 is incorporated and further claim recites further additional elements/limitations: Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites detecting…occurrence of an event requiring a model update which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user making a judgement that an event has occurred and an action should take place. See 2106.04.(a)(2).III.C. 
Subject Matter Eligibility Analysis Step 2A Prong 2: learning the detection network using the stored normal data when normal data of a first predetermined number or more are stored(merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))) learning the classification network using the stored category data when category data of a second predetermined number or more are stored(merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))) Subject Matter Eligibility Analysis Step 2B: Additional elements (a) and (b) do not integrate the abstract idea into a practical application nor do the additional limitation provide significantly more than the abstract idea because the limitation amount to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP §2106.05(f). The additional element(s) (a) and (b) in the claim do/does not include any additional elements , when considered separately and in combination, that amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception for the reasons set forth in step 2A prong 2 analysis above. The claim is not patent eligible. Regarding Claim 5: The rejection of claim 4 is incorporated and further claim recites further additional elements/limitations: Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites calculating…an uncompressed latent value from the training input value which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))). The claim recites calculating…the restored value from the latent value which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))). 
The claim recites calculating…a loss that is a difference between the restored value and the training input value which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))). The claim recites performing…optimization of updating a parameter of the detection network to minimize the loss which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user choosing parameters to update and judging values to use. See 2106.04.(a)(2).III.C. Alternatively the limitation falls under Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A))). Subject Matter Eligibility Analysis Step 2A Prong 2: Initializing…the detection network(merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))) Inputting…the stored normal data as a training input value to the initialized detection network(which amount to mere extra solution activity of receiving or transmitting data over a network, see MPEP §2106.05(g)) Subject Matter Eligibility Analysis Step 2B: Additional elements (a) do not integrate the abstract idea into a practical application nor do the additional limitation provide significantly more than the abstract idea because the limitation amount to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP §2106.05(f). 
Additional element (b) of obtaining a network input and sending data to a network is well-understood, routine, and conventional activity of "transmitting or receiving data over a network" (see MPEP 2106.05(d)(II)(i), using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362). The additional elements (a) and (b) in the claim, when considered separately and in combination, do not amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible.

Regarding Claim 6: The rejection of claim 5 is incorporated, and the claim further recites additional elements/limitations:

Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites after the learning, calculating, by the detection and classification apparatus, the reference value in accordance with the equation θ = μ + (k × σ), which is an abstract idea (Mathematical Calculations, see MPEP 2106.04(a)(2)(I)(C)). The claim recites wherein μ denotes an average of a mean squared error (MSE) between a plurality of training input values and a plurality of restored values corresponding to the plurality of training input values used for learning on the detection network, wherein σ denotes a standard deviation of the MSE between the plurality of training input values and the plurality of restored values corresponding to the plurality of training input values, and wherein k is a weight for the standard deviation, which is an abstract idea (Mathematical Relationships, see MPEP 2106.04(a)(2)(I)(A)).

Subject Matter Eligibility Analysis Step 2A Prong 2: The claim does not contain elements that would warrant a Step 2A Prong 2 analysis. 
Subject Matter Eligibility Analysis Step 2B: The claim does not include any additional element, when considered separately and in combination, that amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible. Regarding Claim 7: The rejection of claim 4 is incorporated and further claim recites further additional elements/limitations: Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites preparing…a training input value by setting a label corresponding to a category of the stored category data which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user choosing a matching label for a category in a set. See 2106.04.(a)(2).III.C. The claim recites calculating…a classification value from the training input value by performing an operation in which a plurality of inter-layer weights are applied which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))). The claim recites calculating…a classification loss that is a difference between the classification value and the label which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))). The claim recites performing…optimization of updating a parameter of the classification network to minimize the classification loss which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user choosing parameters to update and judging values to use. See 2106.04.(a)(2).III.C. Alternatively the limitation falls under Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A))). Subject Matter Eligibility Analysis Step 2A Prong 2: initializing…the classification network(merely recites a generic computer on which to perform the abstract idea, e.g. 
"apply it on a computer" (see MPEP 2106.05(f))) inputting…the training input value to the initialized classification network(which amount to mere extra solution activity of receiving or transmitting data over a network, see MPEP §2106.05(g)) Subject Matter Eligibility Analysis Step 2B: Additional elements (a) do not integrate the abstract idea into a practical application nor do the additional limitation provide significantly more than the abstract idea because the limitation amount to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP §2106.05(f). Additional element (b) of obtaining a network input and sending data to a network is well understood, routine, and conventional activity of “transmitting or receiving data over a network" (see MPEP 2106.05(d)(II)(i) using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362) The additional element(s) (a) and (b) in the claim do/does not include any additional elements , when considered separately and in combination, that amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception for the reasons set forth in step 2A prong 2 analysis above. The claim is not patent eligible. Regarding Claim 8: The rejection of claim 1 is incorporated and further claim recites further additional elements/limitations: Subject Matter Eligibility Analysis Step 2A Prong 1: The claim does not contain elements that would warrant a Step 2A Prong 1 analysis. Subject Matter Eligibility Analysis Step 2A Prong 2: A non-transitory computer-readable recording medium that records a program for executing the method for continuous learning(merely recites a generic computer on which to perform the abstract idea, e.g. 
"apply it on a computer" (see MPEP 2106.05(f))) Subject Matter Eligibility Analysis Step 2B: Additional elements (a) do not integrate the abstract idea into a practical application nor do the additional limitation provide significantly more than the abstract idea because the limitation amount to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP §2106.05(f). The additional element(s) (a) in the claim do/does not include any additional elements , when considered separately and in combination, that amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception for the reasons set forth in step 2A prong 2 analysis above. The claim is not patent eligible. Regarding Claim 9: Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites generate an input value, which is a feature vector matrix including a plurality of feature vectors, from information about a medium of anomaly detection from an inspection target which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user making a vector from a set. See 2106.04.(a)(2).III.C. The claim recites derive a restored value imitating the input value which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user choosing a value that . See 2106.04.(a)(2).III.C. The claim recites determine whether a restoration error indicating a difference between the input value and the restored value is greater than or equal to a previously calculated reference value which is an abstract idea (Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A)))). The claim recites store the input value as normal data upon which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. 
The limitations encompass a user choosing a category for data based on a comparison of values. See 2106.04.(a)(2).III.C. The claim recites determining that the restoration error is less than the reference value which is an abstract idea (Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A)))). Subject Matter Eligibility Analysis Step 2A Prong 2: a data processing unit configured to (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))) a detection unit configured to (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))) through a detection network learned to generates the restored value for the input value(merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))) Subject Matter Eligibility Analysis Step 2B: Additional elements (a) (b) and (c) do not integrate the abstract idea into a practical application nor do the additional limitation provide significantly more than the abstract idea because the limitation amount to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP §2106.05(f). The additional element(s) (a) (b) and (c) in the claim do/does not include any additional elements , when considered separately and in combination, that amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception for the reasons set forth in step 2A prong 2 analysis above. The claim is not patent eligible. Regarding Claim 10: The rejection of claim 9 is incorporated and further claim recites further additional elements/limitations: Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites calculate a classification value indicating a probability which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))). 
Subject Matter Eligibility Analysis Step 2A Prong 2: that the input value belongs to a category of an anomaly state through a classification network learned to calculate the probability for the input value(merely specifies a particular technological environment in which the abstract idea is to take place, i.e. a field of use (see MPEP 2106.05(h))) Subject Matter Eligibility Analysis Step 2B: Additional elements (a) do not integrate the abstract idea into a practical application nor do the additional limitation provide significantly more than the abstract idea because the limitation merely specifies a field of use in which the abstract idea is to take place, i.e. a field of use (see MPEP 2106.05(h)). The additional element(s) (a) in the claim do/does not include any additional elements , when considered separately and in combination, that amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception for the reasons set forth in step 2A prong 2 analysis above. The claim is not patent eligible. Regarding Claim 11: The rejection of claim 10 is incorporated and further claim recites further additional elements/limitations: Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites determine whether the classification value is greater than or equal to a predetermined threshold which is an abstract idea (Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A)))). The claim recites determining that the restoration error is greater than or equal to the reference value which is an abstract idea (Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A)))). The claim recites store the input value as category data which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user choosing a category for data based on a comparison of values. See 2106.04.(a)(2).III.C. 
The claim recites determining that the classification value is greater than or equal to the predetermined threshold which is an abstract idea (Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A)))). Subject Matter Eligibility Analysis Step 2A Prong 2: The claim does not contain elements that would warrant a Step 2A Prong 2 analysis. Subject Matter Eligibility Analysis Step 2B: The claim does not include any additional element, when considered separately and in combination, that amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible. Regarding Claim 12: The rejection of claim 11 is incorporated and further claim recites further additional elements/limitations: Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites detect occurrence of an event requiring a model update which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user making a judgement that an event has occurred and an action should take place. See 2106.04.(a)(2).III.C. Subject Matter Eligibility Analysis Step 2A Prong 2: learn the detection network using the stored normal data when normal data of a first predetermined number or more are stored(merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))) learn the classification network using the stored category data when category data of a second predetermined number or more are stored(merely recites a generic computer on which to perform the abstract idea, e.g. 
"apply it on a computer" (see MPEP 2106.05(f))) Subject Matter Eligibility Analysis Step 2B: Additional elements (a) and (b) do not integrate the abstract idea into a practical application nor do the additional limitation provide significantly more than the abstract idea because the limitation amount to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP §2106.05(f). The additional element(s) (a) and (b) in the claim do/does not include any additional elements , when considered separately and in combination, that amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception for the reasons set forth in step 2A prong 2 analysis above. The claim is not patent eligible. Regarding Claim 13: The rejection of claim 12 is incorporated and further claim recites further additional elements/limitations: Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites calculates an uncompressed latent value from the training input value which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))). The claim recites calculates the restored value from the latent value which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))). The claim recites calculate a loss that is a difference between the restored value and the training input value which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))). The claim recites, perform optimization of updating a parameter of the detection network to minimize the loss which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user choosing parameters to update and judging values to use. See 2106.04.(a)(2).III.C. Alternatively the limitation falls under Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A))). 
Subject Matter Eligibility Analysis Step 2A Prong 2: initialize the detection network (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))) input the stored normal data as a training input value to the initialized detection network (which amounts to mere extra-solution activity of receiving or transmitting data over a network, see MPEP §2106.05(g)) when an encoder of the detection network (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))) Subject Matter Eligibility Analysis Step 2B: Additional elements (a) and (c) do not integrate the abstract idea into a practical application, nor do they provide significantly more than the abstract idea, because the limitations amount to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP §2106.05(f). Additional element (b), obtaining a network input and sending data to a network, is well-understood, routine, and conventional activity of "transmitting or receiving data over a network" (see MPEP 2106.05(d)(II)(i), using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362). The additional elements (a), (b), and (c), considered separately and in combination, do not amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible. Regarding Claim 14: The rejection of claim 13 is incorporated, and the claim recites the following further elements/limitations: Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites calculate the reference value in accordance with the equation θ = μ + (k × σ), which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))).
The claim recites wherein μ denotes an average of a mean squared error (MSE) between a plurality of training input values and a plurality of restored values corresponding to the plurality of training input values used for learning on the detection network, wherein σ denotes a standard deviation of the MSE between the plurality of training input values and the plurality of restored values corresponding to the plurality of training input values, and wherein k is a weight for the standard deviation, which is an abstract idea (Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A))). Subject Matter Eligibility Analysis Step 2A Prong 2: The claim does not contain elements that would warrant a Step 2A Prong 2 analysis. Subject Matter Eligibility Analysis Step 2B: The claim does not include any additional element, considered separately and in combination, that amounts to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible. Regarding Claim 15: The rejection of claim 13 is incorporated, and the claim recites the following further elements/limitations: Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites prepare a training input value by setting a label corresponding to a category of the stored category data, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user choosing a matching label for a category in a set. See MPEP 2106.04(a)(2), Section III.C. The claim recites calculates a classification value from the training input value by performing an operation in which a plurality of inter-layer weights are applied, which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))).
The claim recites calculate a classification loss that is a difference between the classification value and the label, which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))). The claim recites perform optimization of updating a parameter of the classification network to minimize the classification loss, which, under the broadest reasonable interpretation, covers performance of the limitation in the mind. The limitations encompass a user choosing parameters to update and judging values to use. See MPEP 2106.04(a)(2), Section III.C. Alternatively, the limitation falls under Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A)). Subject Matter Eligibility Analysis Step 2A Prong 2: initialize the classification network (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))) input the training input value to the initialized classification network (which amounts to mere extra-solution activity of receiving or transmitting data over a network, see MPEP §2106.05(g)) when the classification network (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))) Subject Matter Eligibility Analysis Step 2B: Additional elements (a) and (c) do not integrate the abstract idea into a practical application, nor do they provide significantly more than the abstract idea, because the limitations amount to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP §2106.05(f).
Additional element (b), obtaining a network input and sending data to a network, is well-understood, routine, and conventional activity of "transmitting or receiving data over a network" (see MPEP 2106.05(d)(II)(i), using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362). The additional elements (a), (b), and (c), considered separately and in combination, do not amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception, for the reasons set forth in the Step 2A Prong 2 analysis above. The claim is not patent eligible. Regarding Claim 16: The rejection of claim 13 is incorporated, and the claim recites the following further elements/limitations: Subject Matter Eligibility Analysis Step 2A Prong 1: The claim does not contain elements that would warrant a Step 2A Prong 1 analysis. Subject Matter Eligibility Analysis Step 2A Prong 2: an autoencoder model including an encoder and a decoder (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))) a generative adversarial network (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))) including a single encoder, a single decoder, and a single discriminator (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))) a generative artificial neural network (merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))) selectively including a single or a plurality of encoders, decoders, and discriminators (merely recites a generic computer on which to perform the abstract idea, e.g.
"apply it on a computer" (see MPEP 2106.05(f))) Subject Matter Eligibility Analysis Step 2B: Additional elements (a) (b) (c) (d) and (e) do not integrate the abstract idea into a practical application nor do the additional limitation provide significantly more than the abstract idea because the limitation amount to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP §2106.05(f). The additional element(s) (a) (b) (c) (d) and (e) in the claim do/does not include any additional elements , when considered separately and in combination, that amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception for the reasons set forth in step 2A prong 2 analysis above. The claim is not patent eligible. Regarding Claim 17: The rejection of claim 16 is incorporated and further claim recites further additional elements/limitations: Subject Matter Eligibility Analysis Step 2A Prong 1: The claim recites generate a mean square loss of an input value and a restored value or use a restoration error which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))). The claim recites use a mean square loss of a discriminator output for an actual input and a generated input as a discrimination error which is an abstract idea (Mathematical Calculations (see MPEP 2106.04(a)(2)(I)(C))). The claim recites set the restoration error and the discrimination error according to the user input which is an abstract idea (Mathematical Relationships (see MPEP 2106.04(a)(2)(I)(A)))). Subject Matter Eligibility Analysis Step 2A Prong 2: when using the discriminator(merely recites a generic computer on which to perform the abstract idea, e.g. "apply it on a computer" (see MPEP 2106.05(f))) when a user input is entered(merely recites a generic computer on which to perform the abstract idea, e.g. 
"apply it on a computer" (see MPEP 2106.05(f))) Subject Matter Eligibility Analysis Step 2B: Additional elements (a) and (b) do not integrate the abstract idea into a practical application nor do the additional limitation provide significantly more than the abstract idea because the limitation amount to no more than mere instructions to apply the exception using a generic computer component. Please see MPEP §2106.05(f). The additional element(s) (a) and (b) in the claim do/does not include any additional elements , when considered separately and in combination, that amount to an integration of the judicial exception into a practical application, nor significantly more than the judicial exception for the reasons set forth in step 2A prong 2 analysis above. The claim is not patent eligible. Claim Rejections - 35 USC § 103 Claim(s) 1-17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Sokhandan et al.(US 20220108163A1, henceforth known as Sokhandan) and further in view of Clark et al.(“ Adaptive Threshold for Outlier Detection on Data Streams” henceforth known as Clark) Regarding Claim 1: Sokhandan discloses acquiring, by a detection and classification apparatus, information about a medium of anomaly detection from an inspection target(Sokhandan, [0109], “The sequence 500 may begin at operation 510 by supplying, to the image encoder 105, an input image 150 of the object, the input image 150 of the object containing zero or more anomalies” where the input image of an object is considered information about a medium of anomaly detection from an inspection target.) Sokhandan discloses generating, by the detection and classification apparatus, an input value, which is a feature vector matrix including a plurality of feature vectors, from the medium information(Sokhandan, [0082], “…The image models are placed in a latent space 115. 
In more details, a neural network is used to extract a compact set of image features, smaller than the size of the original images, to form the image models placed in the latent space 115. In a non-limiting embodiment, the neural network may be based on a normalizing flow structure", where extracting a set of features corresponds to generating an input value which is a feature vector matrix from the medium information (see also Sokhandan, [0019], "supplying, to the latent space, a statistically sufficient sample of information contained in the vectors containing the means and standard deviations").) Sokhandan discloses deriving, by the detection and classification apparatus, a restored value imitating the input value through a detection network learned to generate the restored value for the input value (Sokhandan, [0078], "Having learned a rich representation of the non-anomalous object, the system is able to receive an input image of a particular object that may contain anomalies, generate an image model and regenerate a substitute non-anomalous image of the object", where generating an image model and regenerating a substitute image of the object corresponds to deriving a restored value imitating the input value through a detection network learned to generate the restored value for the input value.) Sokhandan does not explicitly teach, however Clark does disclose, determining, by the detection and classification apparatus, whether a restoration error indicating a difference between the input value and the restored value is greater than or equal to a previously calculated reference value (Algorithm 1, lines 6-10, where the score_t > threshold if/else statement and score_t ← f(x_t) correspond to determining whether a restoration error indicating a difference between the input value and the restored value is greater than or equal to a previously calculated reference value) and storing, by the detection and classification apparatus, the input value as normal data upon determining that the restoration error is less than the
reference value (Algorithm 1, lines 10-12, where the score being stored in the sliding windows z_w2/t_w corresponds to storing the input value as normal data upon determining that the restoration error is less than the reference value). References Sokhandan and Clark are analogous art because they are from the same field of endeavor of using machine-learning methods to determine anomalies/outliers in detection systems. Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Sokhandan and Clark before him or her, to modify the anomaly scoring and thresholding of Sokhandan to include the inference step (computing an anomaly score, score_t), the threshold decision classifying an input as outlier or normal, and the selective memory updating of Clark, to reduce false alarms and ensure that data identified as normal has influence in the network. The suggestion/motivation for doing so would have been "we describe the method we have designed which outfits a one-class learning anomaly detection system with an adaptive threshold setting method whose aim is to decrease the false alarm rate" (Clark, Page 3, Col. 1, Paragraph 2). Regarding Claim 2: The rejection of claim 1 is incorporated and further: Sokhandan discloses when deriving the restored value, calculating, by the detection and classification apparatus, a classification value indicating a probability that the input value belongs to a category of an anomaly state (Sokhandan, [0096], "Additionally, the system 200 may comprise a classifier 260 that is supplied with the labels from the supplier 240 and with at least some of the content of the latent space 115. The classifier 260 may use the content of the latent space 115 to generate classification information for each anomaly type. The latent space 115 contains a set of extracted features at the output of the encoder 105.
The classifier 260 may take these features as input and pass them through another neural network (not shown) that classifies each anomaly type. This neural network is also trained end-to-end with the rest of the system 200 at the training time.") through a classification network learned to calculate the probability for the input value (Sokhandan, [0097], "The classifier 260 may further use the labels identifying the one or more anomaly types to calculate a classification loss 265 for each of the anomaly types. The system 200 may further be trained using the one or more classification losses 265 calculated for the one or more anomaly types. The classification loss 265 may, for example and without limitation, be calculated as expressed in https://en.wikipedia.org/wiki/Cross entropy, the disclosure of which is incorporated by reference herein", where cross-entropy loss being defined over predicted class probabilities corresponds to a classification network learned to calculate the probability for the input value.) Regarding Claim 3: The rejection of claim 2 is incorporated and further: Sokhandan discloses after determining whether the restoration error is greater than or equal to the reference value, determining, by the detection and classification apparatus, whether the classification value is greater than or equal to a predetermined threshold, upon determining that the restoration error is greater than or equal to the reference value (Sokhandan, [0107], "…As such, an anomaly may be detected…values that are higher than a detection threshold", where an anomaly being detected when values are higher than a detection threshold corresponds to determining whether a classification value is greater than or equal to a predetermined threshold) and storing, by the detection and classification apparatus, the input value as category data upon determining that the classification value is greater than or equal to the predetermined threshold (Sokhandan, [0095], "The system 200 may be trained
in the same manner as expressed in relation to the system 100 and may further be trained using the one or more sets of anomalous images supplied to the image encoder 105… The system 200 may define one of more flow-based models for the one of more anomaly types. Hence, the anomalous images are mapped to the latent space 115 and the labels are mapped to the vectors 250 and 255.", where using anomalous images supplied to the image encoder requires the images to be stored in order to use them for training, which corresponds to storing the input value as category data (see also Sokhandan, [0093], "To this end, one or more sets of anomalous images of the object are supplied to the image encoder 105. These images contain anomalies corresponding to one or more known anomaly types for the object…In some embodiments, a small number of anomalous images may be supplied to the image encoder…The system 200 also includes a supplier 240 of anomaly type labels. The supplier 240 may provide labels to an anomaly encoder 245, which is another neural network that gets trained end-to-end with the rest of the system 100.
Each label provided to the anomaly encoder 245 corresponds to a given one of the anomalous images of the object and identifies a related anomaly type.").) Regarding Claim 4: The rejection of claim 3 is incorporated and further: Sokhandan discloses detecting, by the detection and classification apparatus, occurrence of an event requiring a model update (Sokhandan, [0099], "In the industrial context where the object is produced or tested, new anomaly types may be detected after a few weeks or a few months of production.", where new anomaly types being detected, triggering retraining, corresponds to an event requiring a model update) and upon detecting the occurrence of the event, by the detection and classification apparatus, learning the detection network using the stored normal data when normal data of a first predetermined number or more are stored, or learning the classification network using the stored category data when category data of a second predetermined number or more are stored (Sokhandan, [0099], "When new anomaly types are identified for the object, one or more new sets of anomalous images of the object are supplied to the image encoder. These images contain anomalies corresponding to one or more new anomaly types for the object…The supplier 240 provides new labels to the anomaly encoder 245, each new label corresponding to a given one of the new anomalous images of the object and identifying a related new anomaly type", where supplying a new set of anomalous images to train the network, and the supplier providing new labels to the new anomalous images and identifying the new anomaly type, corresponds to learning the classification network using stored category data) Regarding Claim 5: The rejection of claim 4 is incorporated and further: Sokhandan discloses initializing, by the detection and classification apparatus, the detection network (Sokhandan, [0082], "FIG.
1 is a block diagram of an anomaly detection system") inputting, by the detection and classification apparatus, the stored normal data as a training input value to the initialized detection network (Sokhandan, [0082], "FIG. 1 is a block diagram of an anomaly detection system 100 adapted to be trained in unsupervised mode in accordance with an embodiment of the present technology. The system 100 includes an image encoder 105 that receives input images 110 and forms an image model for each input image 110") calculating, by the detection and classification apparatus, an uncompressed latent value from the training input value (Sokhandan, [0082], "The image models are placed in a latent space 115. In more details, a neural network is used to extract a compact set of image features, smaller than the size of the original images, to form the image models placed in the latent space") calculating, by the detection and classification apparatus, the restored value from the latent value (Sokhandan, [0082], "An image decoder 120 produces regenerated images 125 based on the image models placed in the latent space 115.") calculating, by the detection and classification apparatus, a loss that is a difference between the restored value and the training input value (Sokhandan, [0086], "The system 100 calculates a reconstruction loss 130 using equation (1)") and performing, by the detection and classification apparatus, optimization of updating a parameter of the detection network to minimize the loss (Sokhandan, [0091], "The system 100 is trained using the reconstruction loss 130 and may further be trained using the log-likelihood loss 135 and the regularization loss, following which the system 100 is ready to identify anomalies in a particular object similar to the anomaly-free object. This training process is sometimes called "optimization through backpropagation", a technique that has been used for training various types of neural networks.
In this process, the gradient of the loss with respect to each layer in the neural network is computed and is used to update the corresponding weights in that layer.") Regarding Claim 6: The rejection of claim 5 is incorporated and further: Clark discloses after the learning, calculating, by the detection and classification apparatus, the reference value in accordance with the equation θ = μ + (k × σ), wherein μ denotes an average of a mean squared error (MSE) between a plurality of training input values and a plurality of restored values corresponding to the plurality of training input values used for learning on the detection network, wherein σ denotes a standard deviation of the MSE between the plurality of training input values and the plurality of restored values corresponding to the plurality of training input values, and wherein k is a weight for the standard deviation (Clark, Page 4, Paragraph 1, "The first part of the algorithm consists of obtaining the initial threshold (lines 1-3). S is divided into a training data set and a validation data set. The training set is used to train a model (in this work, either an autoencoder or LOF), and the validation set is used to obtain the initial threshold. This threshold is set to the mean + 2 standard deviations of the outlier scores output by the model on the validation data", where the threshold being determined as the mean + 2 × standard deviation corresponds to having a reference value in accordance with the equation θ = μ + (k × σ).) References Sokhandan and Clark are analogous art because they are from the same field of endeavor of using machine-learning methods to determine anomalies/outliers in detection systems.
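Examiner's note: the reference-value equation θ = μ + (k × σ) recited in claim 6, as mapped to Clark's "mean + 2 standard deviations" initial threshold, can be sketched as follows. This is an illustrative sketch only; the error values, the choice k = 2, and the use of the population standard deviation are hypothetical assumptions, not taken from either reference:

```python
import statistics

def reference_value(restoration_errors, k=2.0):
    """θ = μ + (k × σ): μ and σ are the mean and standard deviation of the
    per-sample restoration errors (MSE values) observed during training or
    validation; k = 2 mirrors Clark's "mean + 2 standard deviations"."""
    mu = statistics.mean(restoration_errors)
    sigma = statistics.pstdev(restoration_errors)  # population std, for simplicity
    return mu + k * sigma

# Hypothetical per-sample restoration errors -- illustrative values only.
errors = [0.10, 0.12, 0.09, 0.11, 0.13, 0.10]
theta = reference_value(errors)

def is_anomalous(restoration_error, theta):
    # An input whose restoration error meets or exceeds the reference value
    # is treated as anomalous; otherwise it is stored as normal data.
    return restoration_error >= theta
```

Because θ sits k standard deviations above the mean training error, typical inputs fall below it and only unusually large restoration errors trigger the anomaly branch, which is the false-alarm-reduction rationale quoted from Clark below.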
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Sokhandan and Clark before him or her, to modify the thresholding of Sokhandan to include the threshold rule of Clark, to reduce false alarms and ensure that data identified as normal has influence in the network. The suggestion/motivation for doing so would have been "we describe the method we have designed which outfits a one-class learning anomaly detection system with an adaptive threshold setting method whose aim is to decrease the false alarm rate" (Clark, Page 3, Col. 1, Paragraph 2). Regarding Claim 7: The rejection of claim 4 is incorporated and further: Sokhandan discloses initializing, by the detection and classification apparatus, the classification network (Sokhandan, [0096], "Additionally, the system 200 may comprise a classifier") preparing, by the detection and classification apparatus, a training input value (Sokhandan, [0096], "…that is supplied with the labels from the supplier 240 and with at least some of the content of the latent space 115.") by setting a label corresponding to a category of the stored category data (Sokhandan, [0097], "The classifier 260 may further use the labels identifying the one or more anomaly types to calculate a classification loss 265 for each of the anomaly types.") inputting, by the detection and classification apparatus, the training input value to the initialized classification network (Sokhandan, [0096], "…The latent space 115 contains a set of extracted features at the output of the encoder 105.
The classifier 260 may take these features as input”) calculating, by the detection and classification apparatus, a classification value from the training input value by performing an operation in which a plurality of inter-layer weights are applied(Sokhandan, [0097], “…The system 200 may further be trained using the one or more classification losses 265 calculated for the one or more anomaly types.”) calculating, by the detection and classification apparatus, a classification loss that is a difference between the classification value and the label(Sokhandan, [0097], “…The classification loss 265 may, for example and without limitation, be calculated as expressed in https://en.wikipedia.org/wiki/Cross entropy, the disclosure of which is incorporated by reference herein” where the cross entropy quantifying the difference between the classification value produced by the classification network and the ground-truth label corresponds to the calculating a log loss that is the difference between the classification value and the label) and performing, by the detection and classification apparatus, optimization of updating a parameter of the classification network(Sokhandan, [0104], “The training engine 400 may also obtain values for the classification loss 265 from the systems 200 or 300” where the training engine explicitly consuming the classification loss to input into the training when updating the parameters corresponds optimization of the classification network ) to minimize the classification loss(Sokhandan, [0091], “The system 100 is trained using the reconstruction loss 130 and may further be trained…This training process is sometimes called “optimization through backpropagation”…the gradient of the loss with respect to each layer in the neural network is computed and is used to update the corresponding weights in that layer” where the use of backpropagation is specifically designed to minimize the loss function by and the classification loss being used in the 
training corresponds to updating parameters of the classification network to minimize the classification loss.) Regarding Claim 8: The rejection of claim 1 is incorporated and further: Sokhandan discloses a non-transitory computer-readable recording medium that records a program for executing the method for continuous learning according to claim 1 (Sokhandan, [0075], "Similarly, it will be appreciated that any flowcharts, flow diagrams, state transition diagrams, pseudo-code, and the like represent various processes that may be substantially represented in non-transitory computer-readable media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.") Regarding Claim 9: Sokhandan discloses a data processing unit configured to (Sokhandan, [0076], "…In some embodiments of the present technology, the processor may be a general-purpose processor, such as a central processing unit (CPU)", where a CPU corresponds to a data processing unit) generate an input value, which is a feature vector matrix including a plurality of feature vectors, from information about a medium of anomaly detection from an inspection target; (Sokhandan, [0082], "…The image models are placed in a latent space 115.
In a non-limiting embodiment, the neural network may be based on a normalizing flow structure", where extracting a set of features corresponds to generating an input value which is a feature vector matrix from the medium information (see also Sokhandan, [0019], "supplying, to the latent space, a statistically sufficient sample of information contained in the vectors containing the means and standard deviations").) Sokhandan discloses a detection unit configured to (Sokhandan, [0076], "…In some embodiments of the present technology, the processor may be a general-purpose processor, such as a central processing unit (CPU)", where a CPU corresponds to a detection unit) derive a restored value imitating the input value through a detection network learned to generate the restored value for the input value (Sokhandan, [0078], "Having learned a rich representation of the non-anomalous object, the system is able to receive an input image of a particular object that may contain anomalies, generate an image model and regenerate a substitute non-anomalous image of the object", where generating an image model and regenerating a substitute image of the object corresponds to deriving a restored value imitating the input value through a detection network learned to generate the restored value for the input value), Sokhandan does not explicitly teach, however Clark does disclose, to determine whether a restoration error indicating a difference between the input value and the restored value is greater than or equal to a previously calculated reference value (Algorithm 1, lines 6-10, where the score_t > threshold if/else statement and score_t ← f(x_t) correspond to determining whether a restoration error indicating a difference between the input value and the restored value is greater than or equal to a previously calculated reference value), and to store the input value as normal data upon determining that the restoration error is less than the reference value (Algorithm 1, lines 10-12, where the score being stored in sliding windows
z_w2/t_w corresponds to storing the input value as normal data upon determining that the restoration error is less than the reference value). References Sokhandan and Clark are analogous art because they are from the same field of endeavor of using machine-learning methods for anomaly/outlier determination in detection systems. Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Sokhandan and Clark before him or her, to modify the anomaly scoring and thresholding of Sokhandan to include the inference step (computing the anomaly score score_t), the threshold decision classifying an input as outlier or normal, and the selective memory updating of Clark, to reduce false alarms and to ensure that normal-identified data has influence in the network. The suggestion/motivation for doing so would have been “we describe the method we have designed which outfits a one-class learning anomaly detection system with an adaptive threshold setting method whose aim is to decrease the false alarm rate” (Clark, Page 3, Col. 1, Paragraph 2).

Regarding Claim 10: The rejection of claim 9 is incorporated and further: Sokhandan discloses calculate a classification value indicating a probability that the input value belongs to a category of an anomaly state (Sokhandan, [0096], “Additionally, the system 200 may comprise a classifier 260 that is supplied with the labels from the supplier 240 and with at least some of the content of the latent space 115. The classifier 260 may use the content of the latent space 115 to generate classification information for each anomaly type. The latent space 115 contains a set of extracted features at the output of the encoder 105. The classifier 260 may take these features as input and pass them through another neural network (not shown) that classifies each anomaly type.
This neural network is also trained end-to-end with the rest of the system 200 at the training time.”) through a classification network learned to calculate the probability for the input value (Sokhandan, [0097], “The classifier 260 may further use the labels identifying the one or more anomaly types to calculate a classification loss 265 for each of the anomaly types. The system 200 may further be trained using the one or more classification losses 265 calculated for the one or more anomaly types. The classification loss 265 may, for example and without limitation, be calculated as expressed in https://en.wikipedia.org/wiki/Cross entropy, the disclosure of which is incorporated by reference herein” where cross-entropy loss being defined over predicted class probabilities corresponds to a classification network learned to calculate the probability for the input value)

Regarding Claim 11: The rejection of claim 10 is incorporated and further: Sokhandan discloses wherein the detection unit is configured to determine whether the classification value is greater than or equal to a predetermined threshold, upon determining that the restoration error is greater than or equal to the reference value (Sokhandan, [0107], “…As such, an anomaly may be detected…values that are higher than a detection threshold” where an anomaly being detected when values are higher than a detection threshold corresponds to determining whether a classification value is greater than or equal to a predetermined threshold), and to store the input value as category data upon determining that the classification value is greater than or equal to the predetermined threshold (Sokhandan, [0095], “The system 200 may be trained in the same manner as expressed in relation to the system 100 and may further be trained using the one or more sets of anomalous images supplied to the image encoder 105… The system 200 may define one of more flow-based models for the one of more anomaly types.
Hence, the anomalous images are mapped to the latent space 115 and the labels are mapped to the vectors 250 and 255.” where using anomalous images supplied to the image encoder requires the images to be stored in order to use them for training, which corresponds to storing the input value as category data (See also Sokhandan, [0093] “To this end, one or more sets of anomalous images of the object are supplied to the image encoder 105. These images contain anomalies corresponding to one or more known anomaly types for the object…In some embodiments, a small number of anomalous images may be supplied to the image encoder…The system 200 also includes a supplier 240 of anomaly type labels. The supplier 240 may provide labels to an anomaly encoder 245, which is another neural network that gets trained end-to-end with the rest of the system 100. Each label provided to the anomaly encoder 245 corresponds to a given one of the anomalous images of the object and identifies a related anomaly type.”))

Regarding Claim 12: The rejection of claim 11 is incorporated and further: Sokhandan discloses a learning unit configured to (Sokhandan, [0076] “…In some embodiments of the present technology, the processor may be a general-purpose processor, such as a central processing unit (CPU)” where a CPU corresponds to a learning unit) detect occurrence of an event requiring a model update (Sokhandan, [0099], “[73] In the industrial context where the object is produced or tested, new anomaly types may be detected after a few weeks or a few months of production.” where new anomaly types being detected, triggering retraining, corresponds to an event requiring a model update), and upon detecting the occurrence of the event, learn the detection network using the stored normal data when normal data of a first predetermined number or more are stored, or learn the classification network using the stored category data when category data of a second predetermined number or more are
stored (Sokhandan, [0099], “When new anomaly types are identified for the object, one or more new sets of anomalous images of the object are supplied to the image encoder. These images contain anomalies corresponding to one or more new anomaly types for the object…The supplier 240 provides new labels to the anomaly encoder 245, each new label corresponding to a given one of the new anomalous images of the object and identifying a related new anomaly type” where supplying a new set of anomalous images to train the network and the supplier providing new labels to the new anomalous images and identifying the new anomaly type correspond to learning the classification network using stored category data)

Regarding Claim 13: The rejection of claim 12 is incorporated and further: Sokhandan discloses initialize the detection network (Sokhandan, [0082], “FIG. 1 is a block diagram of an anomaly detection system”), and input the stored normal data as a training input value to the initialized detection network (Sokhandan, [0082], FIG. 1 is a block diagram of an anomaly detection system 100 adapted to be trained in unsupervised mode in accordance with an embodiment of the present technology.
The system 100 includes an image encoder 105 that receives input images 110 and forms an image model for each input image 110), when an encoder of the detection network calculates an uncompressed latent value from the training input value, and calculates the restored value from the latent value (Sokhandan, [0082], “An image decoder 120 produces regenerated images 125 based on the image models placed in the latent space 115.”), calculate a loss that is a difference between the restored value and the training input value (Sokhandan, [0086], “The system 100 calculates a reconstruction loss 130 using equation (1)”), and perform optimization of updating a parameter of the detection network to minimize the loss (Sokhandan, [0091], “The system 100 is trained using the reconstruction loss 130 and may further be trained using the log-likelihood loss 135 and the regularization loss, following which the system 100 is ready to identify anomalies in a particular object similar to the anomaly-free object. This training process is sometimes called “optimization through backpropagation”, a technique that has been used for training various types of neural networks.
In this process, the gradient of the loss with respect to each layer in the neural network is computed and is used to update the corresponding weights in that layer.”)

Regarding Claim 14: The rejection of claim 13 is incorporated and further: Sokhandan discloses wherein the learning unit is configured to: calculate the reference value in accordance with Equation θ = μ + (k × σ), wherein μ denotes an average of a mean squared error (MSE) between a plurality of training input values and a plurality of restored values corresponding to the plurality of training input values used for learning on the detection network, wherein σ denotes a standard deviation of the MSE between the plurality of training input values and the plurality of restored values corresponding to the plurality of training input values, and wherein k is a weight for the standard deviation (Clark, Page 4, Paragraph 1, “The first part of the algorithm consists of obtaining the initial threshold (lines1-3). S is divided into a training data set and a validation data set. The training set is used to train a model (in this work, either an autoencoder or LOF), and the validation set is used to obtain the initial threshold. This threshold is set to the mean + 2 standard deviations of the outlier scores output by the model on the validation data” where the threshold being determined as the mean + 2 × standard deviations corresponds to having a reference value in accordance with the Equation θ = μ + (k × σ) with k = 2.) References Sokhandan and Clark are analogous art because they are from the same field of endeavor of using machine-learning methods for anomaly/outlier determination in detection systems.
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art, having the teachings of Sokhandan and Clark before him or her, to modify the thresholding of Sokhandan to include the threshold rule of Clark, to reduce false alarms and to ensure that normal-identified data has influence in the network. The suggestion/motivation for doing so would have been “we describe the method we have designed which outfits a one-class learning anomaly detection system with an adaptive threshold setting method whose aim is to decrease the false alarm rate” (Clark, Page 3, Col. 1, Paragraph 2).

Regarding Claim 15: The rejection of claim 12 is incorporated and further: Sokhandan discloses initialize the classification network (Sokhandan, [0096], “Additionally, the system 200 may comprise a classifier”), prepare a training input value (Sokhandan, [0096], “…that is supplied with the labels from the supplier 240 and with at least some of the content of the latent space 115.”) by setting a label corresponding to a category of the stored category data (Sokhandan, [0097], “The classifier 260 may further use the labels identifying the one or more anomaly types to calculate a classification loss 265 for each of the anomaly types.”), and input the training input value to the initialized classification network (Sokhandan, [0096], “…The latent space 115 contains a set of extracted features at the output of the encoder 105.
The classifier 260 may take these features as input”), when the classification network calculates a classification value from the training input value by performing an operation in which a plurality of inter-layer weights are applied (Sokhandan, [0097], “…The classification loss 265 may, for example and without limitation, be calculated as expressed in https://en.wikipedia.org/wiki/Cross entropy, the disclosure of which is incorporated by reference herein” where the cross entropy quantifying the difference between the classification value produced by the classification network and the ground-truth label corresponds to calculating a log loss that is the difference between the classification value and the label), calculate a classification loss that is a difference between the classification value and the label, and perform optimization of updating a parameter of the classification network (Sokhandan, [0104], “The training engine 400 may also obtain values for the classification loss 265 from the systems 200 or 300” where the training engine explicitly consuming the classification loss as an input to training when updating the parameters corresponds to optimization of the classification network) to minimize the classification loss (Sokhandan, [0091], “The system 100 is trained using the reconstruction loss 130 and may further be trained…This training process is sometimes called “optimization through backpropagation”…the gradient of the loss with respect to each layer in the neural network is computed and is used to update the corresponding weights in that layer” where the use of backpropagation is specifically designed to minimize the loss function, and the classification loss being used in the training corresponds to updating parameters of the classification network to minimize the classification loss.)
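As background for the autoencoder and classifier training mapped in the claim 13 and claim 15 rejections, a minimal numpy sketch may help: an encoder/decoder pair trained to minimize reconstruction MSE, plus a classification head trained to minimize cross-entropy over the latent features. Every name, size, and the single-sample update rule here is an illustrative assumption, not taken from Sokhandan or the application.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes only; nothing here comes from the references.
d_in, d_lat, n_cls = 8, 3, 2

W_enc = rng.normal(scale=0.1, size=(d_lat, d_in))   # "encoder" (claim 13)
W_dec = rng.normal(scale=0.1, size=(d_in, d_lat))   # "decoder" (claim 13)
W_cls = rng.normal(scale=0.1, size=(n_cls, d_lat))  # classification head (claim 15)

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

x = rng.normal(size=d_in)   # a stored training input value
label = 1                   # its category label
lr = 0.05
losses = []                 # (reconstruction MSE, cross-entropy) per step

for _ in range(300):
    z = W_enc @ x                      # latent value from the encoder
    x_hat = W_dec @ z                  # restored value from the decoder
    err = x_hat - x
    recon = float((err ** 2).mean())   # reconstruction (MSE) loss
    p = softmax(W_cls @ z)             # classification value (probabilities)
    ce = float(-np.log(p[label]))      # cross-entropy classification loss
    losses.append((recon, ce))

    # Gradient steps ("optimization through backpropagation"). For brevity the
    # encoder is updated from the reconstruction loss only, and the classifier
    # treats the latent features as fixed inputs.
    g = 2 * err / d_in
    g_dec = np.outer(g, z)
    g_enc = np.outer(W_dec.T @ g, x)
    g_p = p.copy()
    g_p[label] -= 1.0
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc
    W_cls -= lr * np.outer(g_p, z)
```

Tracking both losses over the iterations mirrors the examiner's mapping: each gradient step updates layer weights so as to reduce the corresponding loss.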
Regarding Claim 16: The rejection of claim 13 is incorporated and further: Sokhandan discloses wherein the learning unit includes any one of: an autoencoder model including an encoder and a decoder (Sokhandan, [0022], “In some implementations of the present technology, the model of the object is a variational autoencoder model” and Sokhandan, [0048] “In some implementations of the present technology, the system further comprises an image decoder, the image encoder implementing a first function”)

Regarding Claim 17: The rejection of claim 16 is incorporated and further: Under the BRI, the claims do not require to generate a mean square loss of an input value and a restored value or use a restoration error, when using the discriminator, to use a mean square loss of a discriminator output for an actual input and a generated input as a discrimination error, and when a user input is entered, to set the restoration error and the discrimination error according to the user input (this claim is rejected under the rationale set forth under 112(d) for claim 17, as the BRI of the claim does not require a discriminator, and the discriminator is a contingent limitation as defined in claim 16.)

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHARLES JEFFREY JONES JR whose telephone number is (703)756-1414. The examiner can normally be reached Monday - Friday 8:00 - 5:00 EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kakali Chaki, can be reached at 571-272-3719. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /C.J.J./Examiner, Art Unit 2122 /KAKALI CHAKI/Supervisory Patent Examiner, Art Unit 2122
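To make the disputed logic in the office action concrete, the reference-value rule mapped in the claim 14 rejection (θ = μ + (k × σ) over training reconstruction errors, with Clark's initial threshold corresponding to k = 2) and the triage flow of claims 9, 11, and 12 can be sketched as follows. All numbers, names, and the synthetic error distribution are hypothetical assumptions, not drawn from either reference.

```python
import numpy as np

rng = np.random.default_rng(1)

def reference_value(train_mse, k=2.0):
    # Claim 14's rule: theta = mu + (k * sigma) over training reconstruction
    # MSEs; Clark's initial threshold corresponds to k = 2.
    return train_mse.mean() + k * train_mse.std()

# Hypothetical reconstruction errors from an already-trained detection network.
train_mse = np.clip(rng.normal(loc=0.10, scale=0.02, size=500), 0, None)
theta = reference_value(train_mse)

normal_buffer, category_buffer = [], []

def triage(x, restoration_error, classification_value, class_threshold=0.5):
    # Claims 9 and 11: an error below theta stores the input as normal data;
    # an error at/above theta with a high enough classification value stores
    # the input as category (anomaly) data.
    if restoration_error < theta:
        normal_buffer.append(x)
        return "normal"
    if classification_value >= class_threshold:
        category_buffer.append(x)
        return "anomaly"
    return "unlabeled"

triage("sample-a", restoration_error=0.08, classification_value=0.2)  # -> "normal"
triage("sample-b", restoration_error=0.30, classification_value=0.9)  # -> "anomaly"

# Claim 12's update event: retrain a network once enough examples accumulate.
FIRST_N, SECOND_N = 100, 50   # illustrative "predetermined numbers"
retrain_detection = len(normal_buffer) >= FIRST_N
retrain_classifier = len(category_buffer) >= SECOND_N
```

The separate buffers and count thresholds illustrate why the examiner reads Clark's selective memory updating onto the "store as normal data / store as category data" limitations.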

Prosecution Timeline

Mar 17, 2023
Application Filed
Mar 02, 2026
Non-Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12582959
DATA GENERATION DEVICE AND METHOD, AND LEARNING DEVICE AND METHOD
2y 5m to grant Granted Mar 24, 2026
Patent 12380333
METHOD OF CONSTRUCTING NETWORK MODEL FOR DEEP LEARNING, DEVICE, AND STORAGE MEDIUM
2y 5m to grant Granted Aug 05, 2025
Based on this examiner's 2 most recent grants with similar technology.


Prosecution Projections

1-2
Expected OA Rounds
27%
Grant Probability
93%
With Interview (+65.9%)
4y 2m
Median Time to Grant
Low
PTA Risk
Based on 15 resolved cases by this examiner. Grant probability derived from career allow rate.
