Prosecution Insights
Last updated: April 19, 2026
Application No. 18/265,346

LEARNING APPARATUS, LEARNING METHOD, ANOMALY DETECTION APPARATUS, ANOMALY DETECTION METHOD, AND COMPUTER-READABLE RECORDING MEDIUM

Non-Final OA: §101, §102, §103, §112
Filed
Jun 05, 2023
Examiner
SUSSMAN MOSS, JACOB ZACHARY
Art Unit
2122
Tech Center
2100 — Computer Architecture & Software
Assignee
NEC Corporation
OA Round
1 (Non-Final)
Grant Probability: 14% (At Risk)
OA Rounds: 1-2
To Grant: 3y 3m
With Interview: -6%

Examiner Intelligence

Career Allow Rate: 14% (grants only 14% of cases; 1 granted / 7 resolved; -40.7% vs TC avg)
Interview Lift: -20.0% in resolved cases with interview
Avg Prosecution: 3y 3m (typical timeline)
Total Applications: 33 across all art units (26 currently pending)

Statute-Specific Performance

§101: 37.3% (-2.7% vs TC avg)
§103: 35.2% (-4.8% vs TC avg)
§102: 11.9% (-28.1% vs TC avg)
§112: 15.5% (-24.5% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 7 resolved cases

Office Action

§101, §102, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This action is responsive to the application filed on June 6th, 2023. Claims 1-14 are pending in the case. Claims 1, 4, 8 and 11 are independent claims. The preliminary amendment filed June 6th, 2023, has been accepted and the amendments to the claims and specification have been entered. Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55. The information disclosure statements (IDS) submitted on June 5th, 2023 and August 29th, 2023 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.

Specification

A substitute specification in proper idiomatic English and in compliance with 37 CFR 1.52(a) and (b) is required. The substitute specification filed must be accompanied by a statement that it contains no new matter.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b): (b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention. The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph: The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-7 and 10-14 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claim 1 recites the limitation “one or more processors configured to execute” in line 3 of the claim.
It is unclear whether these are the same one or more processors introduced in line 2 of the claim, or a second new one or more processors. For examination purposes this limitation has been interpreted as “the one or more processors are configured to execute”.

Claims 2 and 3 recite the limitation “one or more processors”. It is unclear whether these are the same one or more processors introduced in claim 1 or a second new one or more processors. For examination purposes this limitation has been interpreted as “the one or more processors”.

Claim 3 recites the limitation “a feature vector” in line 4 of the claim. It is unclear whether this is the same as the feature vector generated based on normal data from line 6 of claim 1, or whether this is a second new feature vector composed of normal data. For examination purposes this limitation has been interpreted as “the feature vector”.

Claim 4 recites the limitation “one or more processors configured to execute” in line 3 of the claim. It is unclear whether these are the same one or more processors introduced in line 2 of the claim, or a second new one or more processors. For examination purposes this limitation has been interpreted as “the one or more processors are configured to execute”.

Claim 4 recites the limitation “determine that a feature vector is anomalous” in lines 7-8 of the claim. It is unclear whether the feature vector determined to be anomalous is the same feature vector generated from the input data in line 5 of the claim, or a second new feature vector. For examination purposes this limitation has been interpreted as “determine that the feature vector is anomalous”.

Claim 5 recites the limitation “determine that a feature vector mapped outside the region is anomalous” in lines 3-4 of the claim. It is unclear whether this is the same feature vector determined to be anomalous in claim 4, or a second new feature vector.
For examination purposes this limitation has been interpreted as “determine that the feature vector mapped outside the region is anomalous”.

Claims 5 and 6 recite the limitation “one or more processors”. It is unclear whether these are the same one or more processors introduced in claim 4, or a second new one or more processors. For examination purposes this limitation has been interpreted as “the one or more processors”.

Claim 6 recites the limitation “the autoencoder” in line 8 of the claim. There is insufficient antecedent basis for this limitation in the claim. For examination purposes this limitation has been interpreted as “an autoencoder”.

Claim 6 recites the limitation “receive input of a feature vector of normal data” in line 4 of the claim. It is unclear whether this is the same feature vector determined to be anomalous in claim 4, or a second new feature vector. For examination purposes this limitation has been interpreted as “receive input of the feature vector of normal data”.

Claim 7 recites the limitation “the system” in line 3 of the claim. It is unclear whether this is the “target system” of claim 4, or a second separate system. For examination purposes this limitation has been interpreted as “the target system”.

Claim 10 recites the limitation “a feature vector” in line 3 of the claim. It is unclear whether this is the same as the feature vector generated based on normal data from line 4 of claim 8, or whether this is a second new feature vector composed of normal data. For examination purposes this limitation has been interpreted as “the feature vector”.

Claim 11 recites the limitation “determine that a feature vector is anomalous” in line 5 of the claim. It is unclear whether the feature vector determined to be anomalous is the same feature vector generated from the input data in line 3 of the claim, or a second new feature vector. For examination purposes this limitation has been interpreted as “determine that the feature vector is anomalous”.
Claim 12 recites the limitation “a feature vector mapped outside the region is determined to be anomalous” in lines 3-4 of the claim. It is unclear whether this is the same feature vector determined to be anomalous in claim 4, or a second new feature vector. For examination purposes this limitation has been interpreted as “the feature vector mapped outside the region is determined to be anomalous”.

Claim 13 recites the limitation “receiving input of a feature vector of normal data” in line 4 of the claim. It is unclear whether this is the same feature vector determined to be anomalous in claim 4, or a second new feature vector. For examination purposes this limitation has been interpreted as “receiving input of the feature vector of normal data”.

Claim 13 recites the limitation “the reconstruction error due to the reconstruction” in line 8 of the claim. There is insufficient antecedent basis for this limitation in the claim. For examination purposes this limitation has been interpreted as “the reconstruction error”.

Claims 1, 8 and 11 recite the limitation “a region set based on a subspace”. It is unclear whether this is a “region set” of data, or whether this is a region which has been set, based on a subspace. For examination purposes this limitation has been interpreted as “a region which has been set based on a subspace”. Appropriate correction is required.

Claim 14 recites the limitation “the system” in line 3 of the claim. It is unclear whether this is the “target system” of claim 11, or a second separate system. For examination purposes this limitation has been interpreted as “the target system”.

Claim Objections

Claims 4 and 11 are objected to because of the following informalities: “input input data” is not idiomatic English; an alternative such as “provide input data” would make the claims clearer. Appropriate correction is required.

Claim Rejections - 35 USC § 101

35 U.S.C.
101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title. Claims 1-14 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Regarding claim 1: Step 1: Claim 1 is directed to [a] learning apparatus, therefore it falls under the statutory category of a manufacture. Step 2A Prong 1: The claim recites, in part: “learn a first parameter and a second parameter that are included…for mapping, to a region set based on a subspace set in advance and a distance from the subspace, a feature vector generated based on normal data…, the first parameter being for generating the feature vector and the second parameter being for adjusting the distance” this encompasses the mental learning of observed parameters for mapping to an observed subspace based on a mentally created feature vector. Step 2A Prong 2: The judicial exception is not integrated into a practical application; the remaining limitations of the claim are as follows: “one or more memories storing instructions”, “a mapping model” these limitations are an additional element that generally links the use of the judicial exception to a particular technological environment or field of use. See MPEP § 2106.05(h). “one or more processors configured to execute the instructions to” the limitation is an additional element that amounts to adding the words “apply it” (or an equivalent) with the judicial exception, or merely uses a computer in its ordinary capacity as a tool to perform an existing process. See MPEP § 2106.05(f)(2). “input as training data” the limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g).
Step 2B: The additional elements, taken individually and in combination, do not provide an inventive concept of significantly more than the abstract idea itself for the reasons set forth in step 2A prong 2 above. Further, “input as training data” the limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g). Furthermore the additional element is directed to storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015). See MPEP § 2106.05(d)/(II). Therefore, the claim is ineligible. Regarding claim 2, the rejection of claim 1 is incorporated and further: Step 2A Prong 1: The claim recites, in part: “select, as the subspace, at least one of a hypersphere, a hyperellipsoid, a hyper hyperboloid, a torus, a hyperplane, part thereof, and a union or intersection thereof” this encompasses the mental selection of a subspace. Step 2A Prong 2: The judicial exception is not integrated into a practical application; the remaining limitations of the claim are as follows: “one or more processors is further configured to execute the instructions to” the limitation is an additional element that amounts to adding the words “apply it” (or an equivalent) with the judicial exception, or merely uses a computer in its ordinary capacity as a tool to perform an existing process. See MPEP § 2106.05(f)(2). Step 2B: The additional elements, taken individually and in combination, do not provide an inventive concept of significantly more than the abstract idea itself for the reasons set forth in step 2A prong 2 above. Therefore, the claim is ineligible. 
Regarding claim 3, the rejection of claim 1 is incorporated and further: Step 2A Prong 1: The claim recites, in part: “reconstructing input data corresponding to the feature vector” this encompasses the mental reconstruction of data corresponding to an observed feature vector. Step 2A Prong 2: The judicial exception is not integrated into a practical application; the remaining limitations of the claim are as follows: “one or more processors is further configured to execute the instructions to” the limitation is an additional element that amounts to adding the words “apply it” (or an equivalent) with the judicial exception, or merely uses a computer in its ordinary capacity as a tool to perform an existing process. See MPEP § 2106.05(f)(2). “receive input of a feature vector of the normal data” the limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g). Step 2B: The additional elements, taken individually and in combination, do not provide an inventive concept of significantly more than the abstract idea itself for the reasons set forth in step 2A prong 2 above. Further, “receive input of a feature vector of the normal data” the limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g). Furthermore the additional element is directed to storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015). See MPEP § 2106.05(d)/(II). Therefore, the claim is ineligible. Regarding claim 4: Step 1: Claim 4 is directed to [a]n anomaly detection apparatus, therefore it falls under the statutory category of a manufacture.
Step 2A Prong 1: The claim recites, in part: “mapping a feature vector generated based on the input data to a region set based on a subspace set in advance and a distance from the subspace” this encompasses the mental mapping of an observed feature vector to a subspace. “determine that a feature vector is anomalous based on a result of the mapping” this encompasses the mental determination that an observed feature vector is anomalous based on a result of observed mapping. Step 2A Prong 2: The judicial exception is not integrated into a practical application; the remaining limitations of the claim are as follows: “one or more memories storing instructions”, “a mapping model” these limitations are an additional element that generally links the use of the judicial exception to a particular technological environment or field of use. See MPEP § 2106.05(h). “one or more processors configured to execute the instructions to” the limitation is an additional element that amounts to adding the words “apply it” (or an equivalent) with the judicial exception, or merely uses a computer in its ordinary capacity as a tool to perform an existing process. See MPEP § 2106.05(f)(2). “input input data acquired from a target system” the limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g). Step 2B: The additional elements, taken individually and in combination, do not provide an inventive concept of significantly more than the abstract idea itself for the reasons set forth in step 2A prong 2 above. Further, “input input data acquired from a target system” the limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g). Furthermore the additional element is directed to storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir.
2015). See MPEP § 2106.05(d)/(II). Therefore, the claim is ineligible. Regarding claim 5, the rejection of claim 4 is incorporated and further: Step 2A Prong 1: The claim recites, in part: “determine that a feature vector mapped outside the region is anomalous” this encompasses the mental determination that an observed feature vector is anomalous. Step 2A Prong 2: The judicial exception is not integrated into a practical application; the remaining limitations of the claim are as follows: “wherein one or more processors is further configured to execute the instructions to” the limitation is an additional element that amounts to adding the words “apply it” (or an equivalent) with the judicial exception, or merely uses a computer in its ordinary capacity as a tool to perform an existing process. See MPEP § 2106.05(f)(2). Step 2B: The additional elements, taken individually and in combination, do not provide an inventive concept of significantly more than the abstract idea itself for the reasons set forth in step 2A prong 2 above. Therefore, the claim is ineligible. Regarding claim 6, the rejection of claim 4 is incorporated and further: Step 2A Prong 1: The claim recites, in part: “reconstructing input data corresponding to the feature vector” this encompasses the mental reconstruction of data corresponding to an observed feature vector. “calculate a reconstruction error representing a difference between the input data and reconstructed data” this limitation is a mathematical concept. “determine an anomaly of the feature vector, based on a result of the mapping and the reconstruction error” this encompasses the mental determination of an anomaly of an observed feature vector, based on a result of the mapping and an observed reconstruction error.
Step 2A Prong 2: The judicial exception is not integrated into a practical application; the remaining limitations of the claim are as follows: “one or more processors is further configured to execute the instructions to”, “obtained by inputting the feature vector of the input data to the autoencoder” the limitations are an additional element that amounts to adding the words “apply it” (or an equivalent) with the judicial exception, or merely uses a computer in its ordinary capacity as a tool to perform an existing process. See MPEP § 2106.05(f)(2). “receive input of a feature vector of normal data” the limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g). Step 2B: The additional elements, taken individually and in combination, do not provide an inventive concept of significantly more than the abstract idea itself for the reasons set forth in step 2A prong 2 above. Therefore, the claim is ineligible. Regarding claim 7, the rejection of claim 4 is incorporated and further: Step 2A Prong 1: a continuation of the abstract idea identified in the parent claim. Step 2A Prong 2: The judicial exception is not integrated into a practical application; the remaining limitations of the claim are as follows: “the input data includes one of traffic data of a network in the system and sensor data output from a sensor” the limitation is an additional element that generally links the use of the judicial exception to a particular technological environment or field of use. See MPEP § 2106.05(h). Step 2B: The additional elements, taken individually and in combination, do not provide an inventive concept of significantly more than the abstract idea itself for the reasons set forth in step 2A prong 2 above. Therefore, the claim is ineligible. Regarding claim 8: Step 1: Claim 8 is directed to [a] learning method, therefore it falls under the statutory category of a process.
Step 2A Prong 1: The claim recites, in part: “learning a first parameter and a second parameter that are included…for mapping, to a region set based on a subspace set in advance and a distance from the subspace, a feature vector generated based on normal data…, the first parameter being for generating the feature vector and the second parameter being for adjusting the distance” this encompasses the mental learning of observed parameters for mapping to an observed subspace based on a mentally created feature vector. Step 2A Prong 2: The judicial exception is not integrated into a practical application; the remaining limitations of the claim are as follows: “a mapping model” the limitation is an additional element that generally links the use of the judicial exception to a particular technological environment or field of use. See MPEP § 2106.05(h). “input as training data” the limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g). Step 2B: The additional elements, taken individually and in combination, do not provide an inventive concept of significantly more than the abstract idea itself for the reasons set forth in step 2A prong 2 above. Further, “input as training data” the limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g). Furthermore the additional element is directed to storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015). See MPEP § 2106.05(d)/(II). Therefore, the claim is ineligible. Regarding claims 9-10: The rejection of claim 8 is further incorporated, and the rejections of claims 2-3 are applicable to claims 9-10, respectively. Regarding claim 11: Step 1: Claim 11 is directed to [a]n anomaly detection method, therefore it falls under the statutory category of a process.
Step 2A Prong 1: The claim recites, in part: “mapping a feature vector generated based on the input data to a region set based on a subspace set in advance and a distance from the subspace” this encompasses the mental mapping of an observed feature vector to a subspace. “determining that a feature vector is anomalous based on a result of the mapping” this encompasses the mental determination that an observed feature vector is anomalous based on a result of observed mapping. Step 2A Prong 2: The judicial exception is not integrated into a practical application; the remaining limitations of the claim are as follows: “a mapping model” the limitation is an additional element that generally links the use of the judicial exception to a particular technological environment or field of use. See MPEP § 2106.05(h). “inputting input data acquired from a target system” the limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g). Step 2B: The additional elements, taken individually and in combination, do not provide an inventive concept of significantly more than the abstract idea itself for the reasons set forth in step 2A prong 2 above. Further, “inputting input data acquired from a target system” the limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g). Furthermore the additional element is directed to storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015). See MPEP § 2106.05(d)/(II). Therefore, the claim is ineligible. Regarding claims 12-14: The rejection of claim 11 is further incorporated, and the rejections of claims 5-7 are applicable to claims 12-14, respectively. Claim Rejections - 35 USC § 102 The following is a quotation of the appropriate paragraphs of 35 U.S.C.
102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention. Claims 1-5 and 8-12 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ruff et al. (“Deep One-Class Classification”, Ruff et al., 2018) (as cited in the IDS, hereinafter “Ruff”). Regarding claim 1: Ruff teaches [a] learning apparatus comprising: one or more memories storing instructions (Ruff, page 6, footnote 2 “We provide our code at https://github.com/lukasruff/Deep-SVDD.” Here, the code used for Deep SVDD discloses a memory storing instructions); and one or more processors configured to execute the instructions (Ruff, page 4, col 2, section 3.2, ¶1 “Deep SVDD to scale well with large datasets as its computational complexity scales linearly in the number of training batches and each batch can be processed in parallel (e.g.
by processing on multiple GPUs)” here, the use of GPUs discloses processors executing the instructions) to: learn a first parameter and a second parameter that are included in a mapping model for mapping (Ruff, page 4, col 1, ¶1 “To do this we employ a neural network that is jointly trained to map the data into a hypersphere of minimum volume.”), to a region set based on a subspace set in advance and a distance from the subspace, a feature vector generated based on normal data input as training data (Ruff, page 4, col 1, ¶2 “Given some training data Dₙ = {x₁,...,xₙ} on X, we define the soft-boundary Deep SVDD objective as min_{R,W} R² + (1/(νn)) Σᵢ max{0, ‖φ(xᵢ;W) − c‖² − R²} + (λ/2) Σₗ ‖Wˡ‖²_F. As in kernel SVDD, minimizing R² minimizes the volume of the hypersphere.” Here, W can be considered the first parameter, R can be considered the second parameter, the hypersphere can be considered the subspace with R (the radius) being the distance from the subspace, and Dₙ can be considered the feature vector generated based on the training data), the first parameter being for generating the feature vector (Ruff, page 4, col 1, ¶2 “and set of weights W = {W¹,...,Wᴸ} where Wˡ are the weights of layer l ∈ {1,...,L}. That is, φ(x;W) ∈ F is the feature representation of x ∈ X given by network φ with parameters W.” here, the first parameter W, is used to generate the feature vector) and the second parameter being for adjusting the distance (Ruff, page 4, col 1, ¶2 “As in kernel SVDD, minimizing R² minimizes the volume of the hypersphere.” Here, the second parameter, radius R can be considered the distance).
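For orientation only, the soft-boundary objective quoted from Ruff can be sketched in a few lines of plain Python. This is a hand-written illustration of the mapped-distance bookkeeping, not Ruff's implementation: the function and variable names are hypothetical, and the optional network weight-decay term (λ/2) Σₗ ‖Wˡ‖²_F is omitted for brevity.

```python
def soft_boundary_deep_svdd_loss(phi_x, c, R, nu):
    """Sketch of the soft-boundary Deep SVDD objective (weight-decay term omitted):
    R^2 + 1/(nu*n) * sum_i max(0, ||phi(x_i; W) - c||^2 - R^2).

    phi_x: list of mapped points phi(x_i; W), each a list of floats
    c:     hypersphere center
    R:     radius (the "second parameter" adjusting the distance)
    nu:    trade-off in (0, 1] between sphere volume and boundary violations
    """
    n = len(phi_x)
    # squared distance of each mapped feature vector from the center c
    sq_dists = [sum((p - ci) ** 2 for p, ci in zip(x, c)) for x in phi_x]
    # hinge penalty for points mapped outside the radius R
    slack = sum(max(0.0, d - R ** 2) for d in sq_dists)
    return R ** 2 + slack / (nu * n)
```

Minimizing this over R and the network weights W shrinks the sphere while penalizing points mapped outside it, which is the behavior the rejection reads onto the claimed second parameter "for adjusting the distance".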
Regarding claim 2: Ruff teaches [t]he learning apparatus according to claim 1, comprising: one or more processors is further configured to execute the instructions to select, as the subspace, at least one of a hypersphere (Ruff, page 4, col 1, ¶1 “To do this we employ a neural network that is jointly trained to map the data into a hypersphere of minimum volume.” It is noted the claim recites alternative language, and Ruff teaches at least one of the alternatives), a hyperellipsoid, a hyper hyperboloid, a torus, a hyperplane, part thereof, and a union or intersection thereof. Regarding claim 3: Ruff teaches [t]he learning apparatus according to claim 1, comprising: one or more processors is further configured to execute the instructions to receive input of a feature vector of the normal data and reconstructing input data corresponding to the feature vector (Ruff, page 2, figure 1 “Deep SVDD learns a neural network transformation φ(·;W) with weights W from input space X ⊆ Rd to output space F ⊆Rp that attempts to map most of the data network representations into a hypersphere characterized by center c and radius R of minimum volume.”). 
Regarding claim 4: Ruff teaches [a]n anomaly detection apparatus (Ruff, page 1, abstract “In this paper we introduce a new anomaly detection method—Deep Support Vector Data Description—, which is trained on an anomaly detection based objective.”) comprising: input input data acquired from a target system to a mapping model (Ruff, page 4, col 1, ¶1 “To do this we employ a neural network that is jointly trained to map the data into a hypersphere of minimum volume.”), and mapping a feature vector generated based on the input data to a region set based on a subspace set in advance and a distance from the subspace (Ruff, page 4, col 1, ¶2 “Given some training data Dₙ = {x₁,...,xₙ} on X, we define the soft-boundary Deep SVDD objective as min_{R,W} R² + (1/(νn)) Σᵢ max{0, ‖φ(xᵢ;W) − c‖² − R²} + (λ/2) Σₗ ‖Wˡ‖²_F. As in kernel SVDD, minimizing R² minimizes the volume of the hypersphere.” Here, W can be considered the first parameter, R can be considered the second parameter, the hypersphere can be considered the subspace with R (the radius) being the distance from the subspace, and Dₙ can be considered the feature vector generated based on the training data); and determine that a feature vector is anomalous based on a result of the mapping (Ruff, page 4, col 1, ¶4 “As a result, normal examples of the data are closely mapped to center c, whereas anomalous examples are mapped further away from the center or outside of the hypersphere.”). Regarding claim 5: Ruff teaches [t]he anomaly detection apparatus according to claim 4, wherein one or more processors is further configured to execute the instructions to determine that a feature vector mapped outside the region is anomalous (Ruff, page 4, col 1, ¶4 “As a result, normal examples of the data are closely mapped to center c, whereas anomalous examples are mapped further away from the center or outside of the hypersphere.”).
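The determination mapped to claims 4 and 5, flagging a feature vector mapped outside the hypersphere as anomalous, reduces to a sign test on the squared distance from the center. A minimal sketch under that reading (hypothetical names, pure Python):

```python
def deep_svdd_anomaly_score(phi_x, c, R):
    """s(x) = ||phi(x; W) - c||^2 - R^2; positive means mapped outside the sphere."""
    sq_dist = sum((p - ci) ** 2 for p, ci in zip(phi_x, c))
    return sq_dist - R ** 2

def is_outside_region(phi_x, c, R):
    # Per Ruff: anomalous examples are mapped further from c or outside the hypersphere
    return deep_svdd_anomaly_score(phi_x, c, R) > 0.0
```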
Regarding claim 8: Ruff teaches [a] learning method comprising: learning a first parameter and a second parameter that are included in a mapping model for mapping, to a region set based on a subspace set in advance and a distance from the subspace, a feature vector generated based on normal data input as training data (Ruff, page 4, col 1, ¶2 “Given some training data Dₙ = {x₁,...,xₙ} on X, we define the soft-boundary Deep SVDD objective as min_{R,W} R² + (1/(νn)) Σᵢ max{0, ‖φ(xᵢ;W) − c‖² − R²} + (λ/2) Σₗ ‖Wˡ‖²_F. As in kernel SVDD, minimizing R² minimizes the volume of the hypersphere.” Here, W can be considered the first parameter, R can be considered the second parameter, the hypersphere can be considered the subspace with R (the radius) being the distance from the subspace, and Dₙ can be considered the feature vector generated based on the training data), the first parameter being for generating the feature vector (Ruff, page 4, col 1, ¶2 “and set of weights W = {W¹,...,Wᴸ} where Wˡ are the weights of layer l ∈ {1,...,L}. That is, φ(x;W) ∈ F is the feature representation of x ∈ X given by network φ with parameters W.” here, the first parameter W, is used to generate the feature vector) and the second parameter being for adjusting the distance (Ruff, page 4, col 1, ¶2 “As in kernel SVDD, minimizing R² minimizes the volume of the hypersphere.” Here, the second parameter, radius R can be considered the distance). Regarding claims 9-10: The rejections of claims 2-3 are applicable to claims 9-10, respectively.
Regarding claim 11: Ruff teaches [a]n anomaly detection method comprising: inputting input data acquired from a target system to a mapping model (Ruff, page 4, col 1, ¶1 “To do this we employ a neural network that is jointly trained to map the data into a hypersphere of minimum volume.”), and mapping a feature vector generated based on the input data to a region set based on a subspace set in advance and a distance from the subspace (Ruff, page 4, col 1, ¶2 “Given some training data Dₙ = {x₁,...,xₙ} on X, we define the soft-boundary Deep SVDD objective as min_{R,W} R² + (1/(νn)) Σᵢ max{0, ‖φ(xᵢ;W) − c‖² − R²} + (λ/2) Σₗ ‖Wˡ‖²_F. As in kernel SVDD, minimizing R² minimizes the volume of the hypersphere.” Here, W can be considered the first parameter, R can be considered the second parameter, the hypersphere can be considered the subspace with R (the radius) being the distance from the subspace, and Dₙ can be considered the feature vector generated based on the training data); and determining that a feature vector is anomalous based on a result of the mapping (Ruff, page 4, col 1, ¶4 “As a result, normal examples of the data are closely mapped to center c, whereas anomalous examples are mapped further away from the center or outside of the hypersphere.”). Regarding claim 12: The rejection of claim 5 is applicable to claim 12. Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains.
Patentability shall not be negated by the manner in which the invention was made.

Claims 6-7 and 13-14 are rejected under 35 U.S.C. 103 as being unpatentable over Ruff in view of Erfani et al. (“High-dimensional and large-scale anomaly detection using a linear one-class SVM with deep learning”, 2016) (hereinafter “Erfani”).

Regarding claim 6: Ruff teaches [t]he anomaly detection apparatus according to claim 4, comprising: one or more processors further configured to execute the instructions to receive input of a feature vector of normal data and reconstruct input data corresponding to the feature vector. Ruff does not teach “wherein in the determination, calculate a reconstruction error representing a difference between the input data and reconstructed data obtained by inputting the feature vector of the input data to the autoencoder, and determine an anomaly of the feature vector, based on a result of the mapping and the reconstruction error.”

However, Erfani teaches wherein in the determination, calculate a reconstruction error representing a difference between the input data and reconstructed data obtained by inputting the feature vector of the input data to the autoencoder, and determine an anomaly of the feature vector, based on a result of the mapping and the reconstruction error (Erfani, page 6, col 1, section 4.1, ¶1 “The whole process of pre-training and fine-tuning was performed in an unsupervised manner so far. When the autoencoder is used for anomaly detection, anomalies can be identified based on the history of the squared error between the inputs and outputs for the training records. Let e be the set of reconstruction error values of the xi ∈ X, where i = 1,…,m.
If the reconstruction error for a test sample is larger than the threshold τ = μ_e + 3σ_e, where μ_e and σ_e are the mean and standard deviation of the values in the set e, respectively, then the record is identified as anomalous, otherwise it is identified as normal.”).

Ruff and Erfani are analogous art because both references concern methods for deep one-class learning for anomaly detection. Accordingly, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Ruff’s anomaly detection system to incorporate the reconstruction error taught by Erfani. The motivation for doing so would have been to be scalable and computationally efficient, as stated in Erfani, page 1, Abstract: “Since a linear kernel can be substituted for nonlinear ones in our hybrid model without loss of accuracy, our model is scalable and computationally efficient.”

Regarding claim 7: Ruff teaches [t]he anomaly detection apparatus according to claim 4. Ruff does not teach “wherein the input data includes one of traffic data of a network in the system and sensor data output from a sensor.”

However, Erfani teaches wherein the input data includes one of traffic data of a network in the system and sensor data output from a sensor (Erfani, page 6, col 2, ¶2 “The real-life datasets are from the UCI Machine Learning Repository: (i) Forest, Adult, Gas Sensor Array Drift (Gas),…” It is noted the claim recites alternative language, and Erfani teaches at least one of the alternatives.). Ruff and Erfani are analogous art because both references concern methods for deep one-class learning for anomaly detection. Accordingly, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Ruff’s anomaly detection system to incorporate the sensor data taught by Erfani.
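As an aside on the Erfani passage quoted in the rejection of claim 6 above, the three-sigma reconstruction-error test (threshold equal to the mean of the training reconstruction errors plus three standard deviations) can be sketched as follows. The function names and toy error values are hypothetical; a real system would obtain the errors from an autoencoder's inputs and outputs.

```python
import numpy as np

def fit_threshold(errors):
    """Erfani-style threshold: tau = mean + 3 * std of the reconstruction
    errors observed on the training records."""
    return errors.mean() + 3.0 * errors.std()

def is_anomalous(error, tau):
    """A test record is anomalous iff its reconstruction error exceeds tau."""
    return error > tau

# Stand-in reconstruction errors for the training set (squared differences
# between autoencoder inputs and outputs in a real pipeline).
rng = np.random.default_rng(1)
train_errors = np.abs(rng.normal(1.0, 0.1, size=1000))

tau = fit_threshold(train_errors)
print(is_anomalous(1.05, tau))  # error near the training mean: normal
print(is_anomalous(5.00, tau))  # error far above the threshold: anomalous
```

The rejection combines this thresholding with Ruff's hypersphere mapping, so both the mapping result and the reconstruction error contribute to the anomaly decision.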
The motivation for doing so would have been to be scalable and computationally efficient, as stated in Erfani, page 1, Abstract: “Since a linear kernel can be substituted for nonlinear ones in our hybrid model without loss of accuracy, our model is scalable and computationally efficient.”

Regarding claims 13-14: The rejections of claims 6-7 are applicable to claims 13-14, respectively.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Tran et al. (“Anomaly Detection using a Convolutional Winner-Take-All Autoencoder”, 2017) discloses (1) using the motion-feature encoding extracted from a convolutional autoencoder as input to a one-class SVM rather than exploiting the reconstruction error of the convolutional autoencoder, and (2) introducing a spatial winner-take-all step after the final encoding layer during training to introduce a high degree of sparsity.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JACOB Z SUSSMAN MOSS, whose telephone number is (571) 272-1579. The examiner can normally be reached Monday - Friday, 9 a.m. - 5 p.m. ET.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kakali Chaki, can be reached at (571) 272-3719. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/J.S.M./
Examiner, Art Unit 2122

/KAKALI CHAKI/
Supervisory Patent Examiner, Art Unit 2122

Prosecution Timeline

Jun 05, 2023
Application Filed
Feb 17, 2026
Non-Final Rejection: §101, §102, §103, §112 (current)

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 14%
With Interview: -6% (-20.0% lift)
Median Time to Grant: 3y 3m
PTA Risk: Low

Based on 7 resolved cases by this examiner. Grant probability derived from career allow rate.
