Prosecution Insights
Last updated: April 19, 2026
Application No. 17/431,012

APPARATUS AND A METHOD FOR NEURAL NETWORK COMPRESSION

Status: Final Rejection (§101)
Filed: Aug 13, 2021
Examiner: TRUJILLO, JAMES K
Art Unit: 2151
Tech Center: 2100 — Computer Architecture & Software
Assignee: Nokia Technologies Oy
OA Round: 4 (Final)
Grant Probability: 16% (At Risk)
Projected OA Rounds: 5-6
Projected Time to Grant: 4y 3m
Grant Probability With Interview: 28%

Examiner Intelligence

This examiner grants only 16% of cases.

Career Allow Rate: 16% (4 granted / 25 resolved; -39.0% vs TC avg)
Interview Lift: +12.0% (moderate; measured over resolved cases with interview)
Avg Prosecution: 4y 3m
Currently Pending: 4
Total Applications: 29 (across all art units)

Statute-Specific Performance

§101: 18.3% (-21.7% vs TC avg)
§103: 51.0% (+11.0% vs TC avg)
§102: 17.8% (-22.2% vs TC avg)
§112: 9.1% (-30.9% vs TC avg)

TC averages are estimates. Based on career data from 25 resolved cases.

Office Action

§101
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 04/28/2025 has been entered.

Response to Amendment

Applicant's submission filed on October 8, 2025 has been entered. Claims 18, 20-32, 34 and 36-37 are pending. Claims 18, 21, 32 and 37 were amended. Claims 19, 33, and 35 have been canceled.

Response to Arguments

Applicant's arguments filed on October 8, 2025 with regard to 35 USC 112 for claims 33 and 35 have been fully considered but are not found persuasive.

Response to 112(a) Rejection

Applicant has canceled claims 33 and 35. Therefore, the rejections under 35 USC 112(a) of those claims are no longer applicable.

Response to 101 Rejection

On page 11 of the Remarks, Applicant argues: "The elements of the claims are integrated into the practical application of increasing a sparsity of a neural network using a compression loss and a compression loss weight, which is useful because, as indicated in the specification at page 8, line 34 to page 9, line 6, the resulting compressed neural network requires less memory for storage and fewer computation resources, and, when transmitting the compressed neural network, the required bandwidth is much less than when transmitting the original uncompressed model."

On page 13 of the Remarks, Applicant argues: "The claimed method addresses the problems discussed within Applicant's filed specification.
As discussed within Applicant's filed specification at page 1, lines 15-19, 'Running neural network(s) require large memory and computational resources. Requirements for large memory and computational resources prohibits efficient use of neural networks and deployment of neural network(s) to devices having limited memory and computational resources, such as mobile phones and IoT devices.'"

Examiner's Response: Regarding the first two arguments, Applicant identifies the improvement as the loss function. The loss function is a mathematical concept, an equation, which is an abstract idea used to calculate sparsity. In this case, the loss function is configured to increase sparsity. The abstract idea cannot be grounds for the improvement; therefore, the training of a neural network with a specific loss function cannot be the grounds for improvement. See MPEP 2106.05(a): "It is important to note, the judicial exception alone cannot provide the improvement. The improvement can be provided by one or more additional elements. See the discussion of Diamond v. Diehr, 450 U.S. 175, 187 and 191-92, 209 USPQ 1, 10 (1981) in subsection II, below." Notably, "the novelty of the mathematical algorithm is not a determining factor at all[.]" MPEP 2106.04 (quoting Flook, 437 U.S. at 591-92, 198 USPQ2d at 198).

As discussed above, the amended claims are directed to mathematical relationships and mathematical operations. Applicant attempts to shift the improvement to using the compressed neural network; however, the alleged efficiency of using a compressed neural network would not be realized without training the neural network on the specific loss function. The improvement is therefore the loss function (and its constituent components, such as the compression loss). That is, the improvement is the specific use of a loss function, a mathematical operation, which cannot be the grounds for the improvement.
Therefore, at this time the claims do not demonstrate an integration into a practical application.

On page 14 of the Remarks, Applicant argues: "The claims of the instant case are similar to the claims in U.S. Patent Application Number 15/859,448 (now U.S. Patent No. 10,735,346, hereinafter the '448 application), which claims a method of prioritizing a data payload in an Internet of Things (IoT) network. During the September 24, 2019 final rejection of the '448 application, the claims were rejected under 35 U.S.C. § 101. In Applicant's March 17, 2019 response, the claims were amended to include specific limitations describing the device and the prioritization process, and Applicant argued on page 13 of the March 17, 2019 response: 'The present claims are directed to IoT devices and techniques for transmitting data in a network of IoT devices. Many IoT devices are battery powered and tend to have limited processing resources. Additionally, a typical IoT network may include a large number of IoT devices. Accordingly, energy efficiency and network capacity usage are of greater concern in IoT networks compared to other types of computer networks. The present claims recite techniques that improve energy efficiency and network capacity usage in IoT networks.' The Examiner allowed the claims after reviewing the amendments and arguments provided within Applicant's March 17, 2019 response. Refer to the April 2, 2020 Notice of Allowance."

Examiner's Response: Applicant cites no support that another examiner's decision is binding upon other examiners within the Office, and the Examiner can find no support within the MPEP that another examiner's decision is binding upon other examiners within the Office. Further, those claims are not similar to the claims in the instant application; the claims in the other mentioned application are directed to a different technology.
Further, and more importantly, the claims in the other mentioned application do not recite a mathematical equation in prose, as do the claims in the instant application. For these reasons, and the reasons under the 35 U.S.C. 101 rejection header, the Examiner maintains the 35 U.S.C. 101 rejection.

Response to 103 Rejection

Applicant's arguments, see pages 15-22, filed October 8, 2025, with respect to claims 18, 20-32, 34 and 36-37 have been fully considered and are persuasive. The references do not teach the combination of limitations as in the claimed invention. The 35 USC 103 rejections of claims 18, 20-32, 34 and 36-37 have been withdrawn.

Rejection under 35 U.S.C. § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 18, 20-32, 34 and 36-37 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 of the subject matter eligibility test (see MPEP 2106.03): Claims 18, 20-31, and 38 are directed to an "apparatus", which describes one of the four statutory categories of patentable subject matter, i.e., a machine. Claims 32, 34 and 36 are directed to a "method", which describes one of the four statutory categories of patentable subject matter, i.e., a process. 35 U.S.C. 100(b). Claim 37 is directed to a "non-transitory computer readable medium", which describes one of the four statutory categories of patentable subject matter, i.e., a manufacture.

Regarding Claim 18: Step 2A of the subject matter eligibility test (see MPEP 2106.04).
Prong One: Claim 18 recites ("sets forth" or "describes") the abstract idea, substantially as follows:

"train a neural network, wherein to train the neural network, the apparatus is further caused to apply a loss function configured to increase sparsity of a weight tensor of the neural network;" – This limitation is directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as applying a loss function during training is a mathematical process, and increasing sparsity of a weight tensor refers to driving numerical values in the tensor to zero or similar values.

"wherein the loss function comprises a task loss added to: a compression loss weight multiplied with a compression loss;" – This limitation is directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as it recites a loss function in prose and details mathematical operations, including division, addition, multiplication, and squaring of values, comprising a loss function.

"wherein the compression loss is defined by: an L1 norm of the weight tensor of the neural network divided by an L2 norm of the weight tensor of the neural network, added to a transformation factor that is multiplied with a square of the L2 norm of the weight tensor of the neural network divided by the L1 norm of the weight tensor of the neural network." – This limitation is directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as it recites a loss function in prose and details mathematical operations, including division and multiplication, comprising a loss function.
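For intuition only (not part of the record), the loss function recited in prose above can be sketched numerically. This is a minimal sketch under our own assumptions; the function and variable names are illustrative, not taken from the application.

```python
import math

def compression_loss(weights, transformation_factor):
    """Compression loss as recited in prose in claim 18:
    L1/L2 + t * (L2**2 / L1), where L1 and L2 are norms of the
    (flattened) weight tensor and t is the transformation factor."""
    l1 = sum(abs(w) for w in weights)
    l2 = math.sqrt(sum(w * w for w in weights))
    return l1 / l2 + transformation_factor * (l2 ** 2) / l1

def total_loss(task_loss, compression_loss_weight, weights, transformation_factor):
    """Loss function of claim 18: task loss added to the compression
    loss weight multiplied with the compression loss."""
    return task_loss + compression_loss_weight * compression_loss(
        weights, transformation_factor)

# The L1/L2 ratio is smaller for a sparser tensor of equal L2 energy,
# which is why minimizing it promotes sparsity (cf. Yin et al., cited
# in the prior-art discussion below).
dense = [0.5, 0.5, 0.5, 0.5]   # L1/L2 = 2.0
sparse = [1.0, 0.0, 0.0, 0.0]  # L1/L2 = 1.0
```

The dense and sparse example tensors have the same L2 norm, so the difference in loss comes entirely from the sparsity-sensitive L1/L2 term.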
"entropy encode the weight tensor of the neural network to obtain a compressed neural network wherein the sparsity of the weight tensor of the neural network is increased following the training of the neural network by applying the loss function configured to increase the sparsity of the weight tensor of the neural network;" – This limitation is directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as entropy encoding, which optimizes the mapping of bits by assigning shorter codes to more frequent symbols based on their probability distribution, is a mathematical operation.

Prong Two: Claim 18 does not include additional elements that integrate the abstract idea into a practical application.

"comprising at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the apparatus at least to:" – This limitation recites a processor with memory, including computer program code, that is configured to implement the method with the at least one processor. A processor with memory is considered part of a generic computer. A general purpose computer that applies a judicial exception, such as an abstract idea, by use of conventional computer functions does not qualify as a particular machine (see MPEP 2106.05(f)).

"transmit the compressed neural network for at least one task, wherein the at least one task comprises at least one of: image processing, image analysis, video processing, video analysis, multimedia content description, or multimedia content analysis." – A general purpose computer that applies a judicial exception, such as an abstract idea, by use of conventional computer functions does not qualify as a particular machine (see MPEP 2106.05(f)), as this limitation merely invokes a computer or other machinery as a tool to perform an existing process to transmit images, videos or multimedia content.
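For intuition only, the characterization of entropy encoding as a mathematical operation over a symbol probability distribution can be sketched with a Shannon-entropy estimate, which bounds the bits per symbol an entropy coder (e.g., Huffman or arithmetic coding) can approach. The names below are our own illustrative choices.

```python
import math
from collections import Counter

def entropy_bits_per_symbol(symbols):
    """Shannon entropy of the empirical symbol distribution: the
    theoretical lower bound, in bits per symbol, achievable by an
    entropy coder on this sequence."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A quantized weight tensor whose sparsity has been increased is
# dominated by zeros, so its entropy (and hence its coded size) drops.
dense_tensor = [3, -1, 2, -2, 1, 3, -3, 2]
sparse_tensor = [0, 0, 2, 0, 0, 0, -2, 0]
```

On these examples the sparse tensor's per-symbol entropy is well under half that of the dense one, which is the mechanism by which increased sparsity reduces storage and transmission bandwidth.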
Therefore, the additional elements, alone or in combination, do not integrate the abstract idea into a practical application. See MPEP 2106.04(d).

Step 2B of the subject matter eligibility test (see MPEP 2106.05): When considered individually or in combination, the additional limitations and elements of claim 18 do not amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements recited in claim 18 amount to no more than generic computer components in a generic computer environment that do not link a judicial exception to a particular technological environment. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under step 2B.

"comprising at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the apparatus at least to:" – This limitation recites a processor with memory, including computer program code, that is configured to implement the method with the at least one processor. A processor with memory is considered part of a generic computer. A general purpose computer that applies a judicial exception, such as an abstract idea, by use of conventional computer functions does not qualify as a particular machine (see MPEP 2106.05(f)).

"use the compressed neural network for at least one task, wherein the at least one task comprises at least one of: image processing, image analysis, video processing, video analysis, multimedia content description, or multimedia content analysis." – This is a recitation of a field of use, see MPEP 2106 (citing Intellectual Ventures I LLC v. Capital One Bank (USA), N.A., 792 F.3d 1363, 1366, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015) ("An abstract idea does not become nonabstract by limiting the invention to a particular field of use or technological environment, such as the Internet [or] a computer")), and therefore does not integrate the abstract idea into a practical application nor amount to significantly more than the judicial exception.

Claim 20 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claim is dependent on claim 18, which included an abstract idea (see rejection for claim 18). This claim merely recites a further limitation on the loss function limitation from claim 18, which was directed to a mathematical process. The additional limitation: "wherein the loss function comprises at least one critical point, and wherein the loss function at the critical point corresponds to the sparse weight tensor of the neural network, and wherein a plurality of non-zero elements of the sparse weight tensor of the neural network have the same absolute value." – This recitation is directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as it involves calculating a critical point. Thus, the judicial exception is not integrated into a practical application (see MPEP 2106.04(d) I.), failing step 2A Prong 2. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under step 2B.

Claim 21 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claim is dependent on claim 18, which included an abstract idea (see rejection for claim 18). This claim merely recites a further limitation on the loss function limitation from claim 18, which was directed to a mathematical process.
The additional limitation: "wherein the loss function comprises a compression loss defined by an L1 norm of the weight tensor of the neural network divided by an L2 norm of the weight tensor of the neural network." – This recitation is directed to clarifying the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as it defines the loss function by an L1 norm of the weight tensor divided by an L2 norm of the weight tensor. Thus, the judicial exception is not integrated into a practical application (see MPEP 2106.04(d) I.), failing step 2A Prong 2. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under step 2B.

Claim 22 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claim is dependent on claim 18, which included an abstract idea (see rejection for claim 18). This claim merely recites a further limitation on the sparse weight tensor limitation from claim 18, which was directed to the abstract idea of a mathematical process. The additional limitation: "wherein at least a portion of elements of the sparse weight tensor of the neural network are substantially equal to zero such that an absolute value of a difference between a value of any element of the portion of elements of the sparse weight tensor of the neural network and zero is less than or equal to a threshold." – This limitation is directed to the abstract idea of a mental process (including an observation, evaluation, judgment, or opinion), namely evaluating whether the portion of elements of the sparse weight tensor are substantially equal to zero, which can be performed in the human mind or by a human using pen and paper (see MPEP 2106.04(a)(2) III. C.).
Additionally, this recitation is directed to performing the mental process using a mathematical calculation and therefore is also directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as it defines evaluating "substantially equal" by comparing an absolute value of a difference against a threshold. Thus, the judicial exception is not integrated into a practical application (see MPEP 2106.04(d) I.), failing step 2A Prong 2. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under step 2B.

Claim 23 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claim is dependent on claim 18, which included an abstract idea (see rejection for claim 18). This claim merely recites a further limitation on the loss function from claim 18, which was directed to the abstract idea of a mathematical process. The additional limitation: "wherein the loss function comprises a plurality of critical points comprising a first critical point and a second critical point, and wherein a first weight tensor of the neural network corresponding to a first value of the loss function at the first critical point comprises a first number of elements that are substantially equal to zero such that an absolute value of a difference between a value of any element of the portions of elements of the sparse weight tensor of the neural network and zero is less than or equal to a threshold; a second weight tensor of the neural network corresponding to a second value of the loss function at the second critical point comprises a second number of elements that are substantially equal to zero such that an absolute value of a difference between a value of any element of the portions of elements of the sparse weight tensor of the neural network and zero is less than or equal to a threshold; and wherein the first number is higher than the second number; and wherein the first value of the loss function is lower than the second value of the loss function." – This limitation is directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as it involves calculating a first critical point and a second critical point; additionally, it recites evaluating the values at the first critical point and second critical point, i.e., the first weight tensor and second weight tensor, with a mathematical operation, i.e., comparing the absolute difference of values against a threshold. Thus, the judicial exception is not integrated into a practical application (see MPEP 2106.04(d) I.), failing step 2A Prong 2. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under step 2B.

Claim 24 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claim is dependent on claim 18, which included an abstract idea (see rejection for claim 18). The additional limitation: "wherein during training, the apparatus causes a plurality of non-zero elements of the weight tensor of the neural network to have the same absolute value." – In light of the specification, because the loss function results in the plurality of non-zero elements of the weight tensor of the neural network having the same absolute value, this limitation is directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)). Thus, the judicial exception is not integrated into a practical application (see MPEP 2106.04(d) I.), failing step 2A Prong 2. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under step 2B.

Claim 25 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claim is dependent on claim 18, which included an abstract idea (see rejection for claim 18).
This claim merely recites a further limitation on the apparatus from claim 18, which was directed to the abstract idea. "wherein the apparatus is further caused to: quantize the weight tensor of the neural network." – This limitation is directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as quantizing the weight tensor is a mathematical operation. Thus, the judicial exception is not integrated into a practical application (see MPEP 2106.04(d) I.), failing step 2A Prong 2. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under step 2B.

Claim 26 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claim is dependent on claim 25, which included an abstract idea (see rejection for claim 25). This claim merely recites a further limitation on the quantize-the-weight-tensor limitation from claim 25, which was directed to the abstract idea of a mathematical process. The additional limitation: "wherein to quantize the weight tensor of the neural network, the apparatus is further caused to: approximate quantization by introducing additive noise to the weight tensor of the neural network during training, wherein the additive noise level is defined by a first hyperparameter." – This limitation is directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as it involves including a hyperparameter for additive noise to quantize the weight tensor. Thus, the judicial exception is not integrated into a practical application (see MPEP 2106.04(d) I.), failing step 2A Prong 2. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under step 2B.

Claim 27 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
The claim is dependent on claim 25, which included an abstract idea (see rejection for claim 25). This claim merely recites a further limitation on the quantizing-the-weight-tensor limitation from claim 25, which was directed to the abstract idea of a mathematical process. The additional limitation: "wherein the quantizing is performed after training according to a set of hyperparameters comprising a first hyperparameter defining the additive noise level; a second hyperparameter defining a lower limit of a weight range; and a third hyperparameter defining an upper limit of the weight range." – This limitation is directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as it involves including three hyperparameters for additive noise to quantize the weight tensor. Thus, the judicial exception is not integrated into a practical application (see MPEP 2106.04(d) I.), failing step 2A Prong 2. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under step 2B.

Claim 28 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claim is dependent on claim 18, which included an abstract idea (see rejection for claim 18). This claim merely recites a further limitation on the neural network from claim 18, which was directed to the abstract idea of a mathematical process. The additional limitation: "initialize the neural network randomly by applying a mapping function arranged such that the initialization falls into non-saturated region of the mapping function." – This limitation is directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as it involves applying a mapping function. Thus, the judicial exception is not integrated into a practical application (see MPEP 2106.04(d) I.), failing step 2A Prong 2.
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under step 2B.

Claim 29 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claim is dependent on claim 28, which included an abstract idea (see rejection for claim 28). This claim merely recites a further limitation on the mapping function from claim 28, which was directed to the abstract idea of a mathematical process. The additional limitation: "adaptively change weight initialization given the mapping function." – This limitation is directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as it involves adaptively changing the weight initialization given the mapping function. Thus, the judicial exception is not integrated into a practical application (see MPEP 2106.04(d) I.), failing step 2A Prong 2. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under step 2B.

Claim 30 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claim is dependent on claim 18, which included an abstract idea (see rejection for claim 18). This claim merely recites a further limitation on the neural network from claim 18, which was directed to the abstract idea of a mathematical process. The additional limitation: "initialize the neural network from a given seed by applying a mapping function arranged such that the seed falls into non-saturated region of the mapping function." – This limitation is directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as it involves applying a mapping function. Thus, the judicial exception is not integrated into a practical application (see MPEP 2106.04(d) I.), failing step 2A Prong 2.
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under step 2B.

Claim 31 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claim is dependent on claim 28, which included an abstract idea (see rejection for claim 28). This claim merely recites a further limitation on the mapping function from claim 28, which was directed to the abstract idea of a mathematical process. The additional limitation: "adaptively change the mapping function according to a given weight initialization." – This limitation is directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as it involves adaptively changing the mapping function according to the given weight initialization. Thus, the judicial exception is not integrated into a practical application (see MPEP 2106.04(d) I.), failing step 2A Prong 2. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under step 2B.

Claim 33, which is dependent on claim 18, is rejected. The limitation "at least one memory storing instructions that, when executed by the at least one processor, cause the apparatus at least to" recites a processor with memory, including computer program code, that is configured to implement the method with the at least one processor. A processor with memory is considered part of a generic computer. A general purpose computer that applies a judicial exception, such as an abstract idea, by use of conventional computer functions does not qualify as a particular machine (see MPEP 2106.05(f)). And "arrange the compression loss weight that is multiplied with the compression loss such that the compression loss is a multiplicative factor times the task loss."
– This limitation is directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as it recites further terms, i.e., the task loss, of the loss function in prose and details mathematical operations, including addition and multiplication, comprising a loss function. Thus, the judicial exception is not integrated into a practical application (see MPEP 2106.04(d) I.), failing step 2A Prong 2. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under step 2B.

Claim 35, which is dependent on claim 19, is rejected. The limitation "at least one memory storing instructions that, when executed by the at least one processor, cause the apparatus at least to" recites a processor with memory, including computer program code, that is configured to implement the method with the at least one processor. A processor with memory is considered part of a generic computer. A general purpose computer that applies a judicial exception, such as an abstract idea, by use of conventional computer functions does not qualify as a particular machine (see MPEP 2106.05(f)). And "arrange the transformation factor such that a square of the L2 norm of the weight tensor of the neural network divided by the L1 norm of the weight tensor of the neural network is equal to one-third times the L1 norm of the weight tensor of the neural network divided by the L2 norm of the weight tensor of the neural network." – This limitation is directed to the abstract idea of mathematical concepts (see MPEP 2106.04(a)(2)), as it recites further terms, i.e., the transformation factor, of the loss function in prose and details mathematical operations, including addition and multiplication, comprising a loss function. Thus, the judicial exception is not integrated into a practical application (see MPEP 2106.04(d) I.), failing step 2A Prong 2.
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under step 2B.

Claim 38 is rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claim is dependent on claim 18, which included an abstract idea (see rejection for claim 18). This claim merely recites a further limitation on the memory and processor from claim 18, which were directed to generic computer components being used for their pre-existing purposes. The additional limitation "obtain the weight tensor of the neural network by flattening tensors representing learnable parameters of the neural network" – is an insignificant pre-solution activity that does not impose meaningful limits on the claim, as it is only nominally or tangentially related to the invention. See MPEP 2106.05(g)(2). Additionally, the limitation is well-understood, routine, and conventional. See, e.g., DeepLizard, "Flatten, Reshape, and Squeeze Explained - Tensors for Deep Learning with PyTorch" (last updated 28 Sep. 2018); Johnson, Justin (Sep. 6, 2017), "Derivatives, Backpropagation and Vectorization," Stanford University (vectorization is a form of flattening). Thus, the judicial exception is not integrated into a practical application (see MPEP 2106.04(d) I.), failing step 2A Prong 2. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception under step 2B.

Regarding Claims 32 and 37: They recite substantially similar limitations as claim 18; therefore, they are rejected under 35 USC 101 using the same rationale. Regarding claims 34 and 36: They recite substantially similar limitations as claims 20 and 22, respectively; therefore, they are rejected under 35 USC 101 using the same rationale.

Allowable Subject Matter

Claims 18, 20-32, 34 and 36-37 would be allowable over the prior art, provided the 101 rejections are overcome.
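For intuition only, the tensor-flattening operation that the action treats as well-understood, routine, and conventional under claim 38 can be sketched as follows. The helper name and the nested-list representation of per-layer parameters are our own illustrative assumptions; real frameworks expose this as a reshape or flatten primitive.

```python
def flatten_parameters(param_tensors):
    """Flatten nested per-layer parameter tensors (represented here
    as nested lists) into a single one-dimensional weight vector."""
    flat = []
    for tensor in param_tensors:
        if isinstance(tensor, list):
            flat.extend(flatten_parameters(tensor))  # recurse into nesting
        else:
            flat.append(tensor)  # leaf value: a single learnable weight
    return flat

# Example: a 2x2 weight matrix for one layer plus a length-3 vector
# for another, flattened into one weight vector of 7 elements.
layers = [[[1.0, 0.0], [0.0, 2.0]], [0.5, 0.0, 0.0]]
```

The resulting one-dimensional vector is the form on which the L1 and L2 norms of the claimed compression loss would be computed.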
The closest prior art of record, Teig et al. (US 11537870 B1), discloses obtaining partial derivatives for multiple parameters of the loss function, i.e., a plurality of critical points (Teig at Col. 15, Ln. 6-17). Additionally, Teig recites employing a loss function that promotes sparsity and encourages non-zero weights to be substantially equal to each other (see generally Teig at Col. 1-2, Ln. 59-3). Lastly, Teig recites a weight vector, which is a one-dimensional weight tensor (see Teig at Col. 4, Ln. 47-58).

Yin, P., Esser, E. and Xin, J., 2014. "Ratio and difference of l_1 and l_2 norms and sparse representation with coherent dictionaries." Commun. Inf. Syst., 14(2), pp. 87-109. -- Discusses how the ratio of the l1 and l2 norms has been used empirically to enforce sparsity of scale-invariant solutions. It studies the mathematical theory of the sparsity-promoting properties of the ratio metric in the context of basis pursuit via over-complete dictionaries.

However, while the prior art of record teaches L1 and L2 norms and their use in a loss function, it does not teach using such a loss function in combination with the other features recited in the claims.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action.
In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to James K. Trujillo, whose telephone number is (571) 272-3677. The examiner can normally be reached M-F, 8:00-4:30. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Dede Zecher, can be reached at (571) 272-7771. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

JAMES K. TRUJILLO
Supervisory Patent Examiner
Art Unit 2151
/James Trujillo/Supervisory Patent Examiner, Art Unit 2151
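As background on the Yin et al. reference cited in the office action above: the L1/L2 norm ratio it studies is scale-invariant and is smallest for maximally sparse vectors, which is why it can serve as a sparsity-promoting term in a compression loss. A minimal illustration with hypothetical values (not taken from the application):

```python
import math

def l1_over_l2(w):
    """Ratio of the L1 norm to the L2 norm of a vector."""
    l1 = sum(abs(x) for x in w)
    l2 = math.sqrt(sum(x * x for x in w))
    return l1 / l2

# The ratio is 1 for a vector with a single non-zero entry (maximally
# sparse) and sqrt(n) for n equal entries (maximally dense) ...
sparse = [5.0, 0.0, 0.0, 0.0]
dense = [1.0, 1.0, 1.0, 1.0]
assert abs(l1_over_l2(sparse) - 1.0) < 1e-12
assert abs(l1_over_l2(dense) - 2.0) < 1e-12   # sqrt(4) = 2

# ... and it is scale-invariant, unlike the L1 norm alone.
assert abs(l1_over_l2([10.0] * 4) - l1_over_l2(dense)) < 1e-12
```

Scale invariance matters here because penalizing the plain L1 norm can be "gamed" by shrinking all weights uniformly, whereas minimizing the ratio genuinely pushes weight mass onto fewer entries.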

Prosecution Timeline

Aug 13, 2021
Application Filed
Oct 28, 2024
Non-Final Rejection — §101
Jan 17, 2025
Response Filed
Mar 06, 2025
Final Rejection — §101
May 23, 2025
Response after Non-Final Action
Jun 05, 2025
Request for Continued Examination
Jun 09, 2025
Response after Non-Final Action
Jul 03, 2025
Non-Final Rejection — §101
Oct 08, 2025
Response Filed
Feb 12, 2026
Final Rejection — §101 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12561567
Neural Network Pruning With Cyclical Sparsity
2y 5m to grant Granted Feb 24, 2026
Patent (number unavailable)
SIMILARITY MEASURES FOR SHORT SEGMENTS OF TEXT
Granted
Patent (number unavailable)
METHOD AND APPARATUS FOR PROVIDING INTERACTIVE RESPONSES TO REQUESTS FROM INTELLIGENT MACHINES
Granted
Patent (number unavailable)
Interactive Security Brokerage System
Granted
Patent (number unavailable)
INFORMATION PROCESSING SYSTEM AND METHOD OF ACQUIRING BACKUP IN AN INFORMATION PROCESSING SYSTEM
Granted
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
16%
Grant Probability
28%
With Interview (+12.0%)
4y 3m
Median Time to Grant
High
PTA Risk
Based on 25 resolved cases by this examiner. Grant probability derived from career allow rate.
