Prosecution Insights
Last updated: April 19, 2026
Application No. 18/078,504

PRUNING HARDWARE UNIT FOR TRAINING NEURAL NETWORK

Non-Final OA: §101, §102, §103, §112
Filed: Dec 09, 2022
Examiner: COULSON, JESSE CHEN
Art Unit: 2122
Tech Center: 2100 — Computer Architecture & Software
Assignee: Alibaba Group Holding Limited
OA Round: 1 (Non-Final)
Grant Probability: 25% (At Risk)
OA Rounds: 1-2
To Grant: 3y 3m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 25% (grants only 25% of cases; 1 granted / 4 resolved; -30.0% vs TC avg)
Interview Lift: +100.0% for resolved cases with interview
Avg Prosecution: 3y 3m (typical timeline)
Total Applications: 37 across all art units (33 currently pending)

Statute-Specific Performance

§101: 30.6% (-9.4% vs TC avg)
§103: 29.8% (-10.2% vs TC avg)
§102: 22.6% (-17.4% vs TC avg)
§112: 17.1% (-22.9% vs TC avg)
Baseline: Tech Center average estimate • Based on career data from 4 resolved cases

Office Action

Rejections: §101, §102, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. The action is in response to the application filed on 12/09/2022. Claims 1-20 are pending and have been examined.

Information Disclosure Statement

The information disclosure statements (IDS) submitted on 12/20/2022 and 5/12/2025 are in compliance with the provisions of 37 CFR 1.97, 1.98, and MPEP § 609. They have been placed in the application file, and the information referred to therein has been considered as to the merits.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation is: “a neural network training engine” in claim 8.
Because this claim limitation is being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it is being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.

If applicant does not intend to have this limitation interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation to avoid it being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation recites sufficient structure to perform the claimed function so as to avoid it being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 8-15 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Regarding Claim 8: Claim limitation “a neural network training engine configured to generate outputs comprising values of weights for nodes of a neural network” invokes 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
However, the written description fails to disclose the corresponding structure, material, or acts for performing the entire claimed function and to clearly link the structure, material, or acts to the function. There is not sufficient structure for the neural network training engine. Therefore, the claim is indefinite and is rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph.

Applicant may:

(a) Amend the claim so that the claim limitation will no longer be interpreted as a limitation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph;

(b) Amend the written description of the specification such that it expressly recites what structure, material, or acts perform the entire claimed function, without introducing any new matter (35 U.S.C. 132(a)); or

(c) Amend the written description of the specification such that it clearly links the structure, material, or acts disclosed therein to the function recited in the claim, without introducing any new matter (35 U.S.C. 132(a)).

If applicant is of the opinion that the written description of the specification already implicitly or inherently discloses the corresponding structure, material, or acts and clearly links them to the function so that one of ordinary skill in the art would recognize what structure, material, or acts perform the claimed function, applicant should clarify the record by either:

(a) Amending the written description of the specification such that it expressly recites the corresponding structure, material, or acts for performing the claimed function and clearly links or associates the structure, material, or acts to the claimed function, without introducing any new matter (35 U.S.C. 132(a)); or

(b) Stating on the record what the corresponding structure, material, or acts, which are implicitly or inherently set forth in the written description of the specification, perform the claimed function.

For more information, see 37 CFR 1.75(d) and MPEP §§ 608.01(o) and 2181.
Regarding Claims 9-15: Claims 9-15 are rejected as being dependent on a rejected base claim without curing any of the deficiencies.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding Claim 1:

Step 1: The claim recites an apparatus, which is one of the four statutory categories of patentable subject matter.

Step 2A prong 1: The claim recites an abstract idea. Specifically, the limitation “selecting, from the weights, unpruned weights for pruning” amounts to a mental process as it can be performed in a human mind. The claim recites an additional abstract idea. Specifically, the limitation “pruning the selected unpruned weights” amounts to a mental process as it can be performed in a human mind. The claim recites an additional abstract idea. Specifically, the limitation “updating the value of the indicator of each of the weights according to the pruned weights” amounts to a mental process as it can be performed in a human mind.

Step 2A prong 2: The additional element of using an apparatus for training neural networks is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not integrate the abstract idea into practical application (MPEP 2106.05(f)). The additional element of using a controller is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not integrate the abstract idea into practical application (MPEP 2106.05(f)). The additional element of using a plurality of registers coupled to the controller is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not integrate the abstract idea into practical application (MPEP 2106.05(f)). The additional element of receiving inputs comprising (i) values of weights for nodes of a neural network and (ii) a value of an indicator of each of the weights does not integrate the abstract idea into practical application because receiving graph data is considered an insignificant extra solution activity of “mere data gathering” (MPEP 2106.05(g)). The additional element of providing the updated value of the indicator of each of the weights to a neural network training engine is generally linked to the abstract idea, therefore does not integrate the abstract idea into practical application (MPEP 2106.05(h)).

Step 2B: The additional element of using an apparatus for training neural networks is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not amount to significantly more (MPEP 2106.05(f)). The additional element of using a controller is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not amount to significantly more (MPEP 2106.05(f)). The additional element of using a plurality of registers coupled to the controller is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not amount to significantly more (MPEP 2106.05(f)). The additional element of receiving inputs comprising (i) values of weights for nodes of a neural network and (ii) a value of an indicator of each of the weights does not amount to significantly more because the additional element is an insignificant extra solution activity and further is a well understood routine and conventional activity. See MPEP 2106.05(d)(II)(i), (buySAFE, Inc. v.
Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network)). The additional element of providing the updated value of the indicator of each of the weights to a neural network training engine is generally linked to the abstract idea, therefore does not amount to significantly more (MPEP 2106.05(h)). Therefore, the claim is ineligible.

Regarding Claim 2: Claim 2 incorporates the rejection of Claim 1. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more. Specifically, the claim recites a further additional element, “a register that indicates a pruning mode of a plurality of pruning modes; the plurality of pruning modes include an incremental pruning mode; and a fraction of the pruned weights increases with neural network training,” which is generally linked to the abstract idea (MPEP 2106.05(h)). The claim is ineligible.

Regarding Claim 3: Claim 3 incorporates the rejection of Claim 1. This claim further recites a description of the abstract idea of the “pruning the selected unpruned weights” step of Claim 1. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more. The claim is ineligible.

Regarding Claim 4: Claim 4 incorporates the rejection of Claim 1. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more. Specifically, the claim recites a further additional element, “a register that stores a value of a number of unpruned weights and a register that stores a value for selecting the weights for pruning,” which is an insignificant extra solution activity (MPEP 2106.05(g)). The additional element is further a well understood routine and conventional activity. See MPEP 2106.05(d)(II)(iv), (Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015)). The claim is ineligible.

Regarding Claim 5: Claim 5, which incorporates the rejection of Claim 4, recites a further abstract idea, “compute the value for selecting the weights for pruning by multiplying the value of the number of unpruned weights and a value based on a target sparsity value,” which is a mathematical concept. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more. The claim is ineligible.

Regarding Claim 6: Claim 6 incorporates the rejection of Claim 1. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more. Specifically, the claim recites a further additional element, “registers comprise a register that stores a value for selecting the inputs,” which is an insignificant extra solution activity (MPEP 2106.05(g)). The additional element is further a well understood routine and conventional activity. See MPEP 2106.05(d)(II)(iv), (Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015)). The claim is ineligible.

Regarding Claim 7: Claim 7 incorporates the rejection of Claim 1. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more. Specifically, the claim recites a further additional element, “registers comprise a register that stores criteria for pruning the weights,” which is an insignificant extra solution activity (MPEP 2106.05(g)). The additional element is further a well understood routine and conventional activity. See MPEP 2106.05(d)(II)(iv), (Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015)). The claim is ineligible.
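For orientation, the operations recited in claim 1 (receive weights plus a per-weight pruned/unpruned indicator, select unpruned weights, prune them to zero, update the indicator) together with claim 5's selection arithmetic (number to select = number of unpruned weights × a value based on a target sparsity) can be sketched in software. This is a hypothetical illustration of the recited behavior, not the claimed hardware unit or the applicant's implementation; all names are invented:

```python
import numpy as np

def prune_step(weights, mask, target_sparsity):
    """One mask-based pruning step over a flat weight array.

    weights:         current weight values
    mask:            per-weight indicator (1 = unpruned, 0 = pruned)
    target_sparsity: fraction used to size the selection (claim 5 arithmetic)
    """
    # Claim 5 (as recited): multiply the number of unpruned weights
    # by a value based on the target sparsity.
    n_unpruned = int(mask.sum())
    n_to_prune = int(n_unpruned * target_sparsity)

    # Select the smallest-magnitude unpruned weights for pruning.
    unpruned_idx = np.flatnonzero(mask)
    order = np.argsort(np.abs(weights[unpruned_idx]))
    selected = unpruned_idx[order[:n_to_prune]]

    # Prune (claim 3: set selected weights to zero) and update the
    # indicator, which would then go back to the training engine.
    new_weights = weights.copy()
    new_mask = mask.copy()
    new_weights[selected] = 0.0
    new_mask[selected] = 0
    return new_weights, new_mask
```

In a training loop this step would run between weight updates, with the returned mask fed back so the training engine skips (or zeroes gradients for) pruned weights.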
Regarding Claim 8:

Step 1: The claim recites a system, which is one of the four statutory categories of patentable subject matter.

Step 2A prong 1: The claim recites an abstract idea. Specifically, the limitation “select, from the weights, unpruned weights for pruning” amounts to a mental process as it can be performed in a human mind. The claim recites an additional abstract idea. Specifically, the limitation “prune the selected unpruned weights” amounts to a mental process as it can be performed in a human mind. The claim recites an additional abstract idea. Specifically, the limitation “update the value of the indicator of each of the weights according to the pruned weights” amounts to a mental process as it can be performed in a human mind.

Step 2A prong 2: The additional element of using a neural network training engine is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not integrate the abstract idea into practical application (MPEP 2106.05(f)). The additional element of “generate outputs comprising values of weights for nodes of a neural network” is generally linked to the abstract idea, therefore does not integrate the abstract idea into practical application (MPEP 2106.05(h)). The additional element of using an apparatus is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not integrate the abstract idea into practical application (MPEP 2106.05(f)). The additional element of using an application programming interface (API) is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not integrate the abstract idea into practical application (MPEP 2106.05(f)). The additional element of using a controller is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not integrate the abstract idea into practical application (MPEP 2106.05(f)). The additional element of using a plurality of registers coupled to the controller is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not integrate the abstract idea into practical application (MPEP 2106.05(f)). The additional element of “receive the outputs from the training engine” does not integrate the abstract idea into practical application because receiving graph data is considered an insignificant extra solution activity of “mere data gathering” (MPEP 2106.05(g)). The additional element “receive a value of an indicator of each of the weights” does not integrate the abstract idea into practical application because receiving graph data is considered an insignificant extra solution activity of “mere data gathering” (MPEP 2106.05(g)). The additional element of “provide the updated value of the indicator of each of the weights to a neural network training engine” is generally linked to the abstract idea, therefore does not integrate the abstract idea into practical application (MPEP 2106.05(h)).

Step 2B: The additional element of using a neural network training engine is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not amount to significantly more (MPEP 2106.05(f)). The additional element of “generate outputs comprising values of weights for nodes of a neural network” is generally linked to the abstract idea, therefore does not amount to significantly more (MPEP 2106.05(h)). The additional element of using an apparatus is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not amount to significantly more (MPEP 2106.05(f)). The additional element of using an application programming interface (API) is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not amount to significantly more (MPEP 2106.05(f)). The additional element of using a controller is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not amount to significantly more (MPEP 2106.05(f)). The additional element of using a plurality of registers coupled to the controller is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not amount to significantly more (MPEP 2106.05(f)). The additional element of “receive the outputs from the training engine” does not amount to significantly more because the additional element is an insignificant extra solution activity and further is a well understood routine and conventional activity. See MPEP 2106.05(d)(II)(i), (buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network)). The additional element of “receive a value of an indicator of each of the weights” does not amount to significantly more because the additional element is an insignificant extra solution activity and further is a well understood routine and conventional activity. See MPEP 2106.05(d)(II)(i), (buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network)). The additional element of “provide the updated value of the indicator of each of the weights to a neural network training engine” is generally linked to the abstract idea, therefore does not amount to significantly more (MPEP 2106.05(h)). Therefore, the claim is ineligible.

Regarding Claim 9: Claim 9 incorporates the rejection of Claim 8. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more.
Specifically, the claim recites a further additional element, “indicates a pruning mode of a plurality of pruning modes; the plurality of pruning modes include an incremental pruning mode; and a fraction of the pruned weights increases with neural network training,” which is generally linked to the abstract idea (MPEP 2106.05(h)). The claim is ineligible.

Regarding Claim 10: Claim 10 incorporates the rejection of Claim 8. This claim further recites a description of the abstract idea of the “pruning the selected unpruned weights” step of Claim 8. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more. The claim is ineligible.

Regarding Claim 11: Claim 11 incorporates the rejection of Claim 8. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more. Specifically, the claim recites a further additional element, “a register that stores a value of a number of unpruned weights and a register that stores a value for selecting the weights for pruning,” which is an insignificant extra solution activity (MPEP 2106.05(g)). The additional element is further a well understood routine and conventional activity. See MPEP 2106.05(d)(II)(iv), (Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015)). The claim is ineligible.

Regarding Claim 12: Claim 12, which incorporates the rejection of Claim 11, recites a further abstract idea, “determine the value for selecting the weights for pruning by multiplying the value of the number of unpruned weights and a value based on a target sparsity value,” which is a mathematical concept. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more. The claim is ineligible.

Regarding Claim 13: Claim 13 incorporates the rejection of Claim 8. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more. Specifically, the claim recites a further additional element, “registers comprise a register that stores a value for selecting inputs,” which is an insignificant extra solution activity (MPEP 2106.05(g)). The additional element is further a well understood routine and conventional activity. See MPEP 2106.05(d)(II)(iv), (Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015)). The claim is ineligible.

Regarding Claim 14: Claim 14 incorporates the rejection of Claim 8. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more. Specifically, the claim recites a further additional element, “registers comprise a register that stores criteria for pruning the weights,” which is an insignificant extra solution activity (MPEP 2106.05(g)). The additional element is further a well understood routine and conventional activity. See MPEP 2106.05(d)(II)(iv), (Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015)). The claim is ineligible.

Regarding Claim 15: Claim 15 incorporates the rejection of Claim 8. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more. Specifically, the claim recites a further additional element, “write values to the plurality of registers,” which is an insignificant extra solution activity (MPEP 2106.05(g)). The additional element is further a well understood routine and conventional activity. See MPEP 2106.05(d)(II)(iv), (Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015)). The claim is ineligible.

Regarding Claim 16:

Step 1: The claim recites an apparatus, which is one of the four statutory categories of patentable subject matter.

Step 2A prong 1: The claim recites an abstract idea. Specifically, the limitation “outputting criteria of weights for unpruned weights” amounts to a mental process as it can be performed in a human mind. The claim recites an additional abstract idea. Specifically, the limitation “computing a value of a pruning threshold based on the outputted criteria” amounts to a mental process as it can be performed in a human mind. The claim recites an additional abstract idea. Specifically, the limitation “updating the values of the indicator of each of the weights” amounts to a mental process as it can be performed in a human mind.

Step 2A prong 2: The additional element of using an apparatus for training neural networks is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not integrate the abstract idea into practical application (MPEP 2106.05(f)). The additional element of using a controller is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not integrate the abstract idea into practical application (MPEP 2106.05(f)). The additional element of using a plurality of registers coupled to the controller is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not integrate the abstract idea into practical application (MPEP 2106.05(f)). The additional element of using a neural network training engine is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not integrate the abstract idea into practical application (MPEP 2106.05(f)).
The additional element of receiving inputs from… the inputs comprising (i) values of weights for nodes of a neural network and (ii) values of an indicator of each of the weights does not integrate the abstract idea into practical application because receiving graph data is considered an insignificant extra solution activity of “mere data gathering” (MPEP 2106.05(g)). The additional element of updating values used by the neural network training engine based on the updated values of the indicator of each of the weights is generally linked to the abstract idea, therefore does not integrate the abstract idea into practical application (MPEP 2106.05(h)).

Step 2B: The additional element of using an apparatus for training neural networks is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not amount to significantly more (MPEP 2106.05(f)). The additional element of using a controller is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not amount to significantly more (MPEP 2106.05(f)). The additional element of using a plurality of registers coupled to the controller is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not amount to significantly more (MPEP 2106.05(f)). The additional element of using a neural network training engine is a generic computer component amounting to mere instructions to apply the abstract idea, therefore does not amount to significantly more (MPEP 2106.05(f)). The additional element of receiving inputs from… the inputs comprising (i) values of weights for nodes of a neural network and (ii) values of an indicator of each of the weights does not amount to significantly more because the additional element is an insignificant extra solution activity and further is a well understood routine and conventional activity. See MPEP 2106.05(d)(II)(i), (buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network)). The additional element of updating values used by the neural network training engine based on the updated values of the indicator of each of the weights is generally linked to the abstract idea, therefore does not amount to significantly more (MPEP 2106.05(h)). Therefore, the claim is ineligible.

Regarding Claim 17: Claim 17 incorporates the rejection of Claim 16. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more. Specifically, the claim recites a further additional element, “indicates a pruning mode of a plurality of pruning modes; the plurality of pruning modes include an incremental pruning mode; and a fraction of the pruned weights increases with neural network training,” which is generally linked to the abstract idea (MPEP 2106.05(h)). The claim is ineligible.

Regarding Claim 18: Claim 18, which incorporates the rejection of Claim 16, recites a further abstract idea, “compare the criteria of weights for unpruned weights and the pruning threshold to select weights for pruning and update the values of the indicator of each of the weights accordingly,” which amounts to a mental process as it can be performed in a human mind. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more. Specifically, the claim recites a further additional element, “a register that stores a value of a number of unpruned weights and a register that stores a value for selecting weights for pruning,” which is an insignificant extra solution activity (MPEP 2106.05(g)). The additional element is further a well understood routine and conventional activity. See MPEP 2106.05(d)(II)(iv), (Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015)). The claim is ineligible.

Regarding Claim 19: Claim 19 incorporates the rejection of Claim 16. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more. Specifically, the claim recites a further additional element, “a register that stores a value for selecting the inputs,” which is an insignificant extra solution activity (MPEP 2106.05(g)). The additional element is further a well understood routine and conventional activity. See MPEP 2106.05(d)(II)(iv), (Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015)). The claim is ineligible.

Regarding Claim 20: Claim 20 incorporates the rejection of Claim 16. The claim does not recite any additional elements that integrate the abstract idea into practical application or amount to significantly more. Specifically, the claim recites a further additional element, “a register that stores criteria for pruning the weights,” which is an insignificant extra solution activity (MPEP 2106.05(g)). The additional element is further a well understood routine and conventional activity. See MPEP 2106.05(d)(II)(iv), (Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015)). The claim is ineligible.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-7 and 16-20 are rejected under 35 U.S.C.
102(a)(1) as being anticipated by Xu et al. (US Patent Application Publication No. US 20190362235 A1), from applicant IDS, hereinafter “Xu”.

Regarding Claim 1, Xu teaches: An apparatus for training neural networks, the apparatus comprising:

a controller (p. 13, ¶113, “integrated memory controller”); and

a plurality of registers coupled to the controller (p. 13, ¶113, “integrated memory controller… to communicate with memory elements”, p. 11, ¶90, “storage 1158 may be… registers”);

wherein the apparatus is configured to perform operations comprising:

receiving inputs comprising (i) values of weights for nodes of a neural network (p. 1, Abstract, “weight values of each channel”) and (ii) a value of an indicator of each of the weights, wherein the value of the indicator indicates whether the weight is a pruned weight or an unpruned weight (p. 7, ¶56, “maskln may be defined to represent the binary mask governing which weights to prune”);

selecting, from the weights, unpruned weights for pruning (p. 5, ¶46, “For instance, in an initial prune, 30% of the lowest ranked channels (e.g., those with the lowest aggregate weights) may be selected for pruning and a mask may be generated”);

pruning the selected unpruned weights (p. 5, ¶46, “The channels may then be pruned 530 according to the mask to generate a pruned version”);

updating the value of the indicator of each of the weights according to the pruned weights (New mask is generated on updated weights in next iteration, p. 5, ¶46, “then the pruning steps for the particular layer are repeated”, p. 5, ¶47, “pruning percentage may be increased… and a new mask may be created (at 525) to prune an additional number of channels”); and

providing the updated value of the indicator of each of the weights to a neural network training engine (Fig. 5A, Step 525 creating mask leads to further training steps, Step 530 Prune channels and Step 535 Forward network with pruned layer).
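The Xu passages cited above (¶46–¶47) describe ranking channels by aggregate weight magnitude, masking the lowest-ranked percentage, and incrementing that percentage across pruning rounds. A rough sketch of that scheme follows; it is a simplified reading of the citation rather than Xu's actual implementation, and all function and variable names are invented:

```python
import numpy as np

def channel_mask(layer_weights, prune_pct):
    """Binary mask over channels: 0 for the lowest-ranked channels.

    layer_weights: (channels, fan_in) weight matrix
    prune_pct:     percentage of channels to prune (e.g. 30 for 30%)
    """
    scores = np.abs(layer_weights).sum(axis=1)  # aggregate weight per channel
    n_prune = int(len(scores) * prune_pct / 100)
    mask = np.ones(len(scores), dtype=int)
    mask[np.argsort(scores)[:n_prune]] = 0      # mask the lowest-ranked channels
    return mask

def incremental_prune(layer_weights, start_pct=30, step_pct=10, rounds=3):
    """Repeat masking with the percentage incremented each round (cf. ¶47)."""
    masks, pct = [], start_pct
    for _ in range(rounds):
        masks.append(channel_mask(layer_weights, pct))
        pct += step_pct
    return masks
```

Each round's mask would gate which channels participate in the subsequent forward/training steps, so the fraction of pruned channels grows as training proceeds.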
Regarding Claim 2, Xu teaches the apparatus of Claim 1 as referenced above. Xu further teaches: wherein: the plurality of registers comprise a register (p. 11, ¶90, “storage 1158 may be… registers”) that indicates a pruning mode of a plurality of pruning modes (Using coarse-grained pruning module or fine-grained pruning module indicates which mode, p. 2, ¶28, “perform both coarse-grained neural network pruning… (using coarse-grained pruning module 220), as well as more surgical, fine-grained neural network pruning… (using fine-grained pruning module 225)”); the plurality of pruning modes include an incremental pruning mode; and a fraction of the pruned weights increases with neural network training (p. 5, ¶47, “the initial pruning percentage may be increased (e.g., incremented (e.g., by 5%, 10%, etc.)) by the pruner tool and a new mask may be created (at 525) to prune an additional number of channels from the layer according to the incremented percentage”). Regarding Claim 3, Xu teaches the apparatus of Claim 1 as referenced above. Xu further teaches: wherein pruning the selected unpruned weights comprises setting values of the weights selected for pruning to zero (p. 4, ¶41, “pruned such that their weight values are artificially changed to zero”). Regarding Claim 4, Xu teaches the apparatus of Claim 1 as referenced above. Xu further teaches: wherein: the plurality of registers comprise a register that stores a value (p. 11, ¶90, “storage 1158 may be… registers”) of a number of unpruned weights (p. 3, ¶30, “inputs, such values as a neural network model (e.g., 230b) to be pruned”) and a register that stores a value (p. 11, ¶90, “storage 1158 may be… registers”) for selecting the weights for pruning (p. 
7, ¶56, “mask_l^n may be defined to represent the binary mask governing which weights to prune”); and the value for selecting the weights for pruning corresponds to a fraction of the number of unpruned weights (Binary mask corresponds to fraction of pruned and unpruned weights, p. 5, ¶46, “mask to generate a pruned version of the layer”). Regarding Claim 5, Xu teaches the apparatus of Claim 4 as referenced above. Xu further teaches: wherein the controller is configured to compute the value for selecting the weights for pruning by multiplying the value of the number of unpruned weights and a value based on a target sparsity value (Controlling factor σ is value based on target sparsity level, p. 7, ¶46, “This binary mask is… based on the threshold that is computed from the mean and standard deviation of the weights in each layer with sparsity level controlling factor σ”, Equation 1, t_l^n). Regarding Claim 6, Xu teaches the apparatus of Claim 1 as referenced above. Xu further teaches: wherein the plurality of registers comprise a register that stores a value (p. 11, ¶90, “storage 1158 may be… registers”) for selecting the inputs (Input weights are selected for pruning with binary mask values 0 or 1, p. 7, Equation 1, p. 5, ¶46, “The channels may then be pruned 530 according to the mask to generate a pruned version of the layer”). Regarding Claim 7, Xu teaches the apparatus of Claim 1 as referenced above. Xu further teaches: wherein the plurality of registers comprise a register that stores (p. 11, ¶90, “storage 1158 may be… registers”) criteria for pruning the weights (Criteria is percentage to prune, p. 5, ¶46, “Any non-zero starting percentage may be set during configuration… For instance, in an initial prune, 30% of the lowest ranked channels (e.g., those with the lowest aggregate weights) may be selected for pruning”). Regarding Claim 16, Xu teaches: An apparatus for training neural networks, the apparatus comprising: a controller (p. 
13, ¶113, “integrated memory controller”); and a plurality of registers coupled to the controller (p. 13, ¶113, “integrated memory controller… to communicate with memory elements”, p. 11, ¶90, “storage 1158 may be… registers”); wherein the apparatus is configured to perform operations comprising: receiving inputs from a neural network training engine, the inputs comprising (i) values of weights for nodes of a neural network (p. 1, Abstract, “weight values of each channel”) and (ii) values of an indicator of each of the weights, wherein the value of the indicator indicates whether the weight is a pruned weight or an unpruned weight (p. 7, ¶56, “mask_l^n may be defined to represent the binary mask governing which weights to prune”); outputting criteria of weights for unpruned weights and computing a value of a pruning threshold based on the outputted criteria (Criteria is percentage to prune, computed value threshold is values of weights in % of lowest ranked channels, p. 5, ¶46, “Any non-zero starting percentage may be set during configuration… For instance, in an initial prune, 30% of the lowest ranked channels (e.g., those with the lowest aggregate weights) may be selected for pruning”); and updating the values of the indicator of each of the weights (New mask is generated on updated weights in next iteration, p. 5, ¶46, “then the pruning steps for the particular layer are repeated”, p. 5, ¶47, “pruning percentage may be increased… and a new mask may be created (at 525) to prune an additional number of channels”); updating values used by the neural network training engine based on the updated values of the indicator of each of the weights (Fig. 5A, Step 525 creating updated mask during network training with Step 530 Prune channels and Step 535 Forward network with pruned layer, and repeating the process Step 545). Regarding Claim 17, the rejection of Claim 16 is incorporated and further, the claim is rejected for the same reasons as set forth in Claim 2. 
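The Claim 5 and Claim 18 mappings rest on Xu's layer-wise pruning threshold, which the action quotes as "computed from the mean and standard deviation of the weights in each layer with sparsity level controlling factor σ." The Office Action does not reproduce Xu's Equation 1, so the sketch below assumes the common form t = mean(|w|) + σ·std(|w|); the function names and the exact formula are assumptions for illustration, not Xu's actual equation.

```python
import numpy as np

def layer_threshold(weights, sigma):
    """Assumed layer-wise threshold in the style the action describes:
    derived from the mean and standard deviation of the layer's weight
    magnitudes, scaled by a sparsity-controlling factor sigma."""
    mags = np.abs(weights)
    return mags.mean() + sigma * mags.std()

def binary_mask(weights, sigma):
    """1 where a weight survives, 0 where |w| falls below the threshold."""
    t = layer_threshold(weights, sigma)
    return (np.abs(weights) >= t).astype(int)

w = np.array([0.9, -0.1, 0.5, 0.05, -0.7, 0.2])
m = binary_mask(w, sigma=0.0)  # sigma = 0: threshold is just the mean magnitude
```

Raising σ raises the threshold, so more weights fall below it and the mask prunes more aggressively, which is how a single scalar factor can steer the layer toward a target sparsity.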
Regarding Claim 18, Xu teaches the apparatus of Claim 16 as referenced above. Xu further teaches: wherein: the plurality of registers comprise a register that stores a value (p. 11, ¶90, “storage 1158 may be… registers”) of a number of unpruned weights (p. 3, ¶30, “inputs, such values as a neural network model (e.g., 230b) to be pruned”) and a register that stores a value (p. 11, ¶90, “storage 1158 may be… registers”) for selecting weights for pruning (p. 7, ¶56, “mask_l^n may be defined to represent the binary mask governing which weights to prune”); the value for selecting weights for pruning corresponds to a fraction of the number of unpruned weights (Binary mask corresponds to fraction of unpruned weights, p. 5, ¶46, “mask to generate a pruned version of the layer”); and the apparatus is configured to compare the criteria of weights for unpruned weights and the pruning threshold to select weights for pruning and update the values of the indicator of each of the weights accordingly (Weight values are compared with pruning threshold to generate current mask indicator, p. 5, ¶46, “initial prune, 30% of the lowest ranked channels (e.g., those with the lowest aggregate weights) may be selected for pruning and a mask may be generated 525 based on this pruning percentage”). Regarding Claim 19, the rejection of Claim 16 is incorporated and further, the claim is rejected for the same reasons as set forth in Claim 6. Regarding Claim 20, the rejection of Claim 16 is incorporated and further, the claim is rejected for the same reasons as set forth in Claim 7. Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C. 
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness. Claims 8-15 are rejected under 35 U.S.C. 103 as being unpatentable over Xu in view of Byun et al. (US Patent Application Publication No. US 20190087729 A1), hereinafter “Byun”. Regarding Claim 8, Xu teaches: A system for training neural networks, the system comprising: a neural network training engine configured to generate outputs comprising values of weights for nodes of a neural network (After pruning, outputs of weight values for the pruned neural network are received, Xu, p. 5, ¶46, “The channels may then be pruned 530 according to the mask to generate a pruned version of the layer”); and an apparatus coupled to the neural network training engine via… wherein: the apparatus comprises a controller and a plurality of registers coupled to the controller (Xu, p. 13, ¶113, “integrated memory controller… to communicate with memory elements”, p. 
11, ¶90, “storage 1158 may be… registers”); and the apparatus is configured to: receive the outputs from the training engine (Xu, Fig. 5A, steps 530 and 535 show pruned model data received and used in the training process); receive a value of an indicator of each of the weights, wherein the value of the indicator indicates whether the weight is a pruned weight or an unpruned weight (Xu, p. 7, ¶56, “mask_l^n may be defined to represent the binary mask governing which weights to prune”); select, from the weights, unpruned weights for pruning (Xu, p. 5, ¶46, “For instance, in an initial prune, 30% of the lowest ranked channels (e.g., those with the lowest aggregate weights) may be selected for pruning and a mask may be generated”); prune the selected unpruned weights (Xu, p. 5, ¶46, “The channels may then be pruned 530 according to the mask to generate a pruned version”); update the value of the indicator of each of the weights according to the pruned weights (New mask is generated on updated weights in next iteration, Xu, p. 5, ¶46, “then the pruning steps for the particular layer are repeated”, p. 5, ¶47, “pruning percentage may be increased… and a new mask may be created (at 525) to prune an additional number of channels”); and provide the updated value of the indicator of each of the weights to a neural network training engine (Xu, Fig. 5A, Step 525 provides the updated mask to Step 530 to continue the training process). Xu does not expressly teach: …an application programming interface (API)… However, Byun teaches: …an application programming interface (API)… (Byun, p. 6, ¶56, “examples may be implemented using… software elements… software may include… application program interfaces (API)”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use an application programming interface as does Byun, in the invention of Xu. The motivation to do so would be to allow interfacing with software and data. 
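Several of the dependent claims (2, 9, 17) are mapped to Xu's incremental pruning mode, in which a non-zero starting percentage is incremented each round (¶47: "e.g., by 5%, 10%, etc.") and a new mask is created at the higher percentage. A minimal sketch of such a schedule follows; the function name and parameters are illustrative only, not drawn from Xu or the application.

```python
def incremental_schedule(start, step, target):
    """Yield the pruning percentage for each round of an incremental
    pruning mode: begin at a non-zero starting percentage and increment
    it each round until a target sparsity is reached (capped at target)."""
    pct = start
    while pct < target:
        yield pct
        pct = min(pct + step, target)
    yield target

# e.g., start at 30% and increment by 10% per round toward 60% sparsity;
# each round would regenerate the mask at the new percentage and retrain
rounds = list(incremental_schedule(start=30, step=10, target=60))
```

Under this mode, the fraction of pruned weights grows monotonically across training rounds, which is the behavior the examiner reads onto the "incremental pruning mode" limitation.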
Regarding Claim 9, the rejection of Claim 8 is incorporated and further, the claim is rejected for the same reasons as set forth in Claim 2. Regarding Claim 10, the rejection of Claim 8 is incorporated and further, the claim is rejected for the same reasons as set forth in Claim 3. Regarding Claim 11, the rejection of Claim 8 is incorporated and further, the claim is rejected for the same reasons as set forth in Claim 4. Regarding Claim 12, the rejection of Claim 11 is incorporated and further, the claim is rejected for the same reasons as set forth in Claim 5. Regarding Claim 13, the rejection of Claim 8 is incorporated and further, the claim is rejected for the same reasons as set forth in Claim 6. Regarding Claim 14, the rejection of Claim 8 is incorporated and further, the claim is rejected for the same reasons as set forth in Claim 7. Regarding Claim 15, Xu in view of Byun teaches the system of Claim 8 as referenced above. In the combination as set forth above in Claim 8, Xu further teaches: wherein the… write values to the plurality of registers (Xu, p. 13, ¶113, “Memory elements 1332 and/or 1334 may store various data”). In the combination as set forth above in Claim 8, Byun teaches: …API is configured to… (Byun, p. 6, ¶56, “examples may be implemented using… software elements… software may include… application program interfaces (API)”). Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to JESSE CHEN COULSON whose telephone number is (571)272-4716. The examiner can normally be reached Monday-Friday 8:30-5:30. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. 
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kakali Chaki can be reached at (571) 272-3719. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /JESSE C COULSON/ Examiner, Art Unit 2122 /KAKALI CHAKI/Supervisory Patent Examiner, Art Unit 2122

Prosecution Timeline

Dec 09, 2022
Application Filed
Aug 27, 2025
Non-Final Rejection — §101, §102, §103 (current)


Prosecution Projections

1-2
Expected OA Rounds
25%
Grant Probability
99%
With Interview (+100.0%)
3y 3m
Median Time to Grant
Low
PTA Risk
Based on 4 resolved cases by this examiner. Grant probability derived from career allow rate.
