Prosecution Insights
Last updated: April 19, 2026
Application No. 17/544,115

DATA PROCESSING METHOD AND APPARATUS BASED ON NEURAL POPULATION CODING, STORAGE MEDIUM, AND PROCESSOR

Status: Non-Final OA (§101)
Filed: Dec 07, 2021
Examiner: HAEFNER, KAITLYN RENEE
Art Unit: 2148
Tech Center: 2100 — Computer Architecture & Software
Assignee: Information Science Academy Of China Electronics Technology Group Corporation
OA Round: 3 (Non-Final)

Forecast: 50% grant probability (Moderate) • 3-4 OA rounds expected • 4y 2m to grant • 99% with interview

Examiner Intelligence

Career allow rate: 50% (2 granted / 4 resolved; -5.0% vs TC avg)
Interview lift: +66.7% allowance among resolved cases with interview
Typical timeline: 4y 2m average prosecution; 32 applications currently pending
Career history: 36 total applications across all art units

Statute-Specific Performance

Allowance rate by statute (vs Tech Center average estimate):

§101: 32.6% (-7.4% vs TC avg)
§103: 31.1% (-8.9% vs TC avg)
§102: 13.8% (-26.2% vs TC avg)
§112: 22.2% (-17.8% vs TC avg)

Based on career data from 4 resolved cases.

Office Action

§101
DETAILED ACTION

This action is in response to the application filed 11/28/2025. Claims 1-7 and 9-20 are pending and have been examined.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 10/27/2025 has been entered.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-7 and 9-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Regarding Claim 1:

Subject Matter Eligibility Analysis Step 1: Claim 1 recites a data processing method based on neural population coding and is thus a process, one of the four statutory categories of patentable subject matter.

Subject Matter Eligibility Analysis Step 2A Prong 1: Claim 1 recites:

- performing a common spatial pattern transformation on the raw data to obtain transformed data (this limitation is a mental process, as it encompasses a human mentally performing a common spatial pattern transformation);
- the expression of the first target function is as follows: [equation image], where Q[C] is the first target function, C is the first matrix, [equation image], [equation image], and β and m are non-negative constants, and m is a margin parameter (this limitation is a mathematical concept, as it encompasses mathematical equations);
- updating the first matrix according to a preset gradient descent update rule, to obtain a second matrix, wherein the second matrix is a weight parameter of the first target function (this limitation is a mental process, as it encompasses a human mentally updating the first matrix according to a preset gradient descent update rule);
- updating the first target function based on the second matrix (this limitation is a mental process, as it encompasses a human mentally updating the first target function).

Therefore, claim 1 recites an abstract idea.

Subject Matter Eligibility Analysis Step 2A Prong 2: Claim 1 further recites additional elements of:

- obtaining raw data (this element does not integrate the abstract idea into a practical application because it recites insignificant extra-solution activity of data gathering (see MPEP 2106.05(g)));
- wherein the raw data is any of image data, voice data, and signal data from any of image recognition, natural language processing, voice recognition, and signal analysis (this element does not integrate the abstract idea into a practical application because it recites a technological environment in which to apply a judicial exception (see MPEP 2106.05(h)));
- obtaining, based on the transformed data, a first target function comprising a first matrix (this element does not integrate the abstract idea into a practical application because it recites insignificant extra-solution activity of data gathering (see MPEP 2106.05(g)));
- the first target function is a target function of a neural population coding network model (this element does not integrate the abstract idea into a practical application because it recites a technological environment in which to apply a judicial exception (see MPEP 2106.05(h)));
- the first matrix is a weight parameter of the target function of the neural population coding network model (this element does not integrate the abstract idea into a practical application because it recites a technological environment in which to apply a judicial exception (see MPEP 2106.05(h)));
- training the neural population coding network model based on the first target function to optimize the weight parameter of the neural population coding network model (this element does not integrate the abstract idea into a practical application because it recites insignificant extra-solution activity of training a model based on a target function (see MPEP 2106.05(g))).

Therefore, claim 1 is not integrated into a practical application.

Subject Matter Eligibility Analysis Step 2B: The additional elements of Claim 1, taken alone and in combination, do not provide significantly more than the abstract idea itself, because:

- obtaining raw data is the well-understood, routine, and conventional activity of "transmitting or receiving data over a network" (see MPEP 2106.05(d)(II); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network));
- wherein the raw data is any of image data, voice data, and signal data from any of image recognition, natural language processing, voice recognition, and signal analysis specifies a particular technological environment to perform the abstract idea and cannot provide significantly more (see MPEP 2106.05(h));
- obtaining, based on the transformed data, a first target function comprising a first matrix is the well-understood, routine, and conventional activity of "transmitting or receiving data over a network" (see MPEP 2106.05(d)(II); OIP Techs., 788 F.3d at 1363 (sending messages over a network));
- the first target function is a target function of a neural population coding network model specifies a particular technological environment to perform the abstract idea and cannot provide significantly more (see MPEP 2106.05(h));
- the first matrix is a weight parameter of the target function of the neural population coding network model specifies a particular technological environment to perform the abstract idea and cannot provide significantly more (see MPEP 2106.05(h));
- training the neural population coding network model based on the first target function to optimize the weight parameter of the neural population coding network model is the well-understood, routine, and conventional activity of training a model based on a target function (Fan et al., page 5, paragraph 0002: "As is well known, machine learning builds a hypothetical model based on sample data for a computer to make a prediction or a decision. The hypothetical model may be implemented as a classifier, which approximates a mapping function from input variables to output variables. The goal of machine learning is to make the hypothetical model as close as possible to a target function which always gives correct answers. This goal may be achieved by training the hypothetical model with more sample data.").

Therefore, claim 1 is subject-matter ineligible.

Regarding Claim 2:

Subject Matter Eligibility Analysis Step 1: Claim 2 recites the same process as claim 1.
Subject Matter Eligibility Analysis Step 2A Prong 1: Claim 2 recites:

- determining an interactive information formula based on the input vector of the raw data and the neuron output vector (this limitation is a mental process, as it could encompass a human mentally determining an interactive information formula);
- determining a second target function comprising a covariance matrix and a transformation matrix (this limitation is a mental process, as it could encompass a human mentally determining a second target function);
- transforming the raw data into the transformed data based on the transformation matrix (this limitation is a mental process, as it could encompass a human transforming the raw data into transformed data).

Therefore, claim 2 recites an abstract idea.

Subject Matter Eligibility Analysis Step 2A Prong 2: Claim 2 further recites additional elements of:

- obtaining an input vector representing the raw data and a neuron output vector (this element does not integrate the abstract idea into a practical application because it recites insignificant extra-solution activity of data gathering (see MPEP 2106.05(g)));
- obtaining the transformation matrix based on the interactive information formula and the second target function (this element does not integrate the abstract idea into a practical application because it recites insignificant extra-solution activity of data gathering (see MPEP 2106.05(g))).

Therefore, claim 2 is not integrated into a practical application.

Subject Matter Eligibility Analysis Step 2B: The additional elements of Claim 2, taken alone and in combination, do not provide significantly more than the abstract idea itself, because:

- obtaining an input vector representing the raw data and a neuron output vector is the well-understood, routine, and conventional activity of "transmitting or receiving data over a network" (see MPEP 2106.05(d)(II); OIP Techs., 788 F.3d at 1363 (sending messages over a network));
- obtaining the transformation matrix based on the interactive information formula and the second target function is the well-understood, routine, and conventional activity of "transmitting or receiving data over a network" (see MPEP 2106.05(d)(II); OIP Techs., 788 F.3d at 1363 (sending messages over a network)).

Therefore, claim 2 is subject-matter ineligible.

Regarding Claim 3:

Subject Matter Eligibility Analysis Step 1: Claim 3 recites the same process as claim 2.

Subject Matter Eligibility Analysis Step 2A Prong 1: Claim 3 recites the same abstract ideas as claim 2 and therefore recites an abstract idea.

Subject Matter Eligibility Analysis Step 2A Prong 2: Claim 3 further recites the additional elements of wherein in response to the number of neuron output vectors is greater than the number of vector dimensions of the raw data, the obtaining the transformation matrix based on the interactive information formula and the second target function comprises:

- obtaining a close approximation formula for the interactive information formula (this element does not integrate the abstract idea into a practical application because it recites insignificant extra-solution activity of data gathering (see MPEP 2106.05(g)));
- obtaining the transformation matrix based on the close approximation formula and the second target function (this element does not integrate the abstract idea into a practical application because it recites insignificant extra-solution activity of data gathering (see MPEP 2106.05(g))).

Therefore, claim 3 is not integrated into a practical application.
Subject Matter Eligibility Analysis Step 2B: The additional elements of Claim 3, taken alone and in combination, do not provide significantly more than the abstract idea itself, because:

- wherein in response to the number of neuron output vectors is greater than the number of vector dimensions of the raw data, the obtaining the transformation matrix based on the interactive information formula and the second target function comprises: obtaining a close approximation formula for the interactive information formula is the well-understood, routine, and conventional activity of "transmitting or receiving data over a network" (see MPEP 2106.05(d)(II); OIP Techs., 788 F.3d at 1363 (sending messages over a network));
- obtaining the transformation matrix based on the close approximation formula and the second target function is the well-understood, routine, and conventional activity of "transmitting or receiving data over a network" (see MPEP 2106.05(d)(II); OIP Techs., 788 F.3d at 1363 (sending messages over a network)).

Therefore, claim 3 is subject-matter ineligible.

Regarding Claim 4:

Subject Matter Eligibility Analysis Step 1: Claim 4 recites the same process as claim 1.

Subject Matter Eligibility Analysis Step 2A Prong 1: Claim 4 recites:

- updating the first matrix according to the preset gradient descent update rule, to obtain a third matrix (this limitation is a mental process, as it could encompass a human mentally updating the first matrix);
- determining the number of iterations (this limitation is a mental process, as it could encompass a human mentally determining the number of iterations);
- determining whether the number of iterations reaches a preset number, and in response to the number of iterations reaches the preset number, outputting the third matrix as the second matrix (this limitation is a mental process, as it could encompass a human mentally determining whether the number of iterations reaches a preset number);
- in response to the number of iterations does not reach the preset number, assigning the third matrix to the first matrix, and returning to the step of updating the first matrix according to the preset gradient descent update rule, to obtain a third matrix (this limitation is a mental process, as it could encompass a human mentally assigning the third matrix to the first matrix).

Therefore, claim 4 recites an abstract idea.

Subject Matter Eligibility Analysis Step 2A Prong 2: Claim 4 further recites the additional element of wherein the number of iterations is used to indicate the number of times of updating the first matrix according to the preset gradient descent update rule (this element does not integrate the abstract idea into a practical application because it recites a technological environment in which to apply a judicial exception (see MPEP 2106.05(h))). Therefore, Claim 4 is not integrated into a practical application.

Subject Matter Eligibility Step 2B: The additional element of Claim 4, taken alone and in combination, does not provide significantly more than the abstract idea itself, because wherein the number of iterations is used to indicate the number of times of updating the first matrix according to the preset gradient descent update rule specifies a particular technological environment to perform the abstract idea and cannot provide significantly more (see MPEP 2106.05(h)). Therefore, claim 4 is subject-matter ineligible.
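For orientation, the limitations analyzed for claims 1 and 4 above describe an iterative weight update. The sketch below is illustrative only: the claimed target function Q[C] appears in the record solely as an image, so a hypothetical margin-style objective (reusing the claimed symbols C, β, and m) stands in for it, and the orthogonalization step of claims 6-7 is included.

```python
import numpy as np

# Illustrative sketch only. The claimed target function Q[C] is not
# reproduced in the record, so q_loss is a hypothetical margin-style
# stand-in reusing the claimed symbols: C (first matrix / weight
# parameter), beta (non-negative constant), m (margin parameter).

def q_loss(C, X, beta=0.1, m=1.0):
    """Hypothetical stand-in for the first target function Q[C]."""
    hinge = np.maximum(0.0, m - X @ C)  # margin penalty on projected data
    return np.mean(hinge ** 2) + beta * np.mean(C ** 2)

def grad_q(C, X, beta=0.1, m=1.0):
    """Derivative of the stand-in objective with respect to C (cf. claim 5)."""
    hinge = np.maximum(0.0, m - X @ C)
    return -2.0 * X.T @ hinge / hinge.size + 2.0 * beta * C / C.size

def gram_schmidt(C):
    """Gram-Schmidt orthogonal transformation of the columns of C (cf. claim 7)."""
    Q = np.zeros_like(C)
    for j in range(C.shape[1]):
        v = C[:, j] - Q[:, :j] @ (Q[:, :j].T @ C[:, j])
        Q[:, j] = v / np.linalg.norm(v)
    return Q

def train(X, n_out, preset_iters=100, lr=0.01):
    """Claim 4's loop: update C until the iteration count reaches a preset number."""
    C = np.random.default_rng(0).normal(size=(X.shape[1], n_out))  # first matrix
    for _ in range(preset_iters):   # number of iterations vs. preset number
        C = C - lr * grad_q(C, X)   # preset gradient descent update rule
    return gram_schmidt(C)          # orthogonal result (cf. claims 6-7)
```

Note that each "mental process" limitation maps to one line of the loop; the Gram-Schmidt step returns a matrix with orthonormal columns, which is what claim 6's "orthogonal result" requires.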
Regarding Claim 5:

Subject Matter Eligibility Step 1: Claim 5 recites the same process as claim 4.

Subject Matter Eligibility Step 2A Prong 1: Claim 5 recites calculating a derivative of the first target function with respect to the first matrix. This limitation is a mental process, as it could encompass a human calculating a derivative. Therefore, claim 5 recites an abstract idea.

Subject Matter Eligibility Step 2A Prong 2: Claim 5 has no additional elements that would integrate the abstract idea into a practical application. Therefore, Claim 5 is not integrated into a practical application.

Subject Matter Eligibility Step 2B: Since there are no additional elements to provide significantly more than the abstract idea itself, taken alone and in combination, claim 5 is subject-matter ineligible.

Regarding Claim 6:

Subject Matter Eligibility Step 1: Claim 6 recites the same process as claim 1.

Subject Matter Eligibility Step 2A Prong 1: Claim 6 recites:

- performing an orthogonal transformation on the second matrix, to obtain an orthogonal result (this limitation is a mental process, as it could encompass a human mentally performing an orthogonal transformation);
- updating a value of the first target function based on the orthogonal result (this limitation is a mental process, as it could encompass a human mentally updating a value of the first target function).

Therefore, claim 6 recites an abstract idea.

Subject Matter Eligibility Step 2A Prong 2: Claim 6 has no additional elements that would integrate the abstract idea into a practical application. Therefore, Claim 6 is not integrated into a practical application.

Subject Matter Eligibility Step 2B: Since there are no additional elements to provide significantly more than the abstract idea itself, taken alone and in combination, claim 6 is subject-matter ineligible.

Regarding Claim 7:

Subject Matter Eligibility Step 1: Claim 7 recites the same process as claim 6.
Subject Matter Eligibility Step 2A Prong 1: Claim 7 recites the same abstract ideas as claim 6. Therefore, claim 7 recites an abstract idea.

Subject Matter Eligibility Step 2A Prong 2: Claim 7 further recites the additional element of wherein the orthogonal transformation is a Gram-Schmidt orthogonal transformation (this element does not integrate the abstract idea into a practical application because it recites a technological environment in which to apply a judicial exception (see MPEP 2106.05(h))). Therefore, Claim 7 is not integrated into a practical application.

Subject Matter Eligibility Step 2B: The additional element of Claim 7 does not provide significantly more than the abstract idea itself, taken alone and in combination, because wherein the orthogonal transformation is a Gram-Schmidt orthogonal transformation specifies a particular technological environment to perform the abstract idea and cannot provide significantly more (see MPEP 2106.05(h)). Therefore, claim 7 is subject-matter ineligible.

Regarding Claim 9:

Subject Matter Eligibility Analysis Step 1: Claim 9 recites a non-transitory computer readable storage medium and is thus an article of manufacture, one of the four statutory categories of patentable subject matter.

Subject Matter Eligibility Analysis Step 2A Prong 1: Claim 9 recites:

- perform a data processing method based on neural population coding (this limitation is a mental process, as it encompasses a human mentally performing a data processing method);
- performing a common spatial pattern transformation on the raw data to obtain transformed data (this limitation is a mental process, as it encompasses a human mentally performing a common spatial pattern transformation);
- updating the first matrix according to a preset gradient descent update rule, to obtain a second matrix, wherein the second matrix is a weight parameter of the first target function (this limitation is a mental process, as it encompasses a human mentally updating the first matrix according to a preset gradient descent update rule);
- updating the first target function based on the second matrix (this limitation is a mental process, as it encompasses a human mentally updating the first target function);
- the expression of the first target function is as follows: [equation image], where Q[C] is the first target function, C is the first matrix, [equation image], [equation image], and β and m are non-negative constants, and m is a margin parameter (this limitation is a mathematical concept, as it encompasses mathematical equations).

Therefore, claim 9 recites an abstract idea.

Subject Matter Eligibility Analysis Step 2A Prong 2: Claim 9 further recites additional elements of:

- a non-transitory computer readable storage medium having stored thereon one or more programs which, when executed by a computing device having one or more processor, cause the computing device to perform a data processing method (this element does not integrate the abstract idea into a practical application because it amounts to mere "apply it on a computer" (see MPEP 2106.05(f)));
- obtaining raw data (this element does not integrate the abstract idea into a practical application because it recites insignificant extra-solution activity of data gathering (see MPEP 2106.05(g)));
- wherein the raw data is any of image data, voice data, and signal data from any of image recognition, natural language processing, voice recognition, and signal analysis (this element does not integrate the abstract idea into a practical application because it recites a technological environment in which to apply a judicial exception (see MPEP 2106.05(h)));
- obtaining, based on the transformed data, a first target function comprising a first matrix (this element does not integrate the abstract idea into a practical application because it recites insignificant extra-solution activity of data gathering (see MPEP 2106.05(g)));
- the first target function is a target function of a neural population coding network model (this element does not integrate the abstract idea into a practical application because it recites a technological environment in which to apply a judicial exception (see MPEP 2106.05(h)));
- the first matrix is a weight parameter of the target function of the neural population coding network model (this element does not integrate the abstract idea into a practical application because it recites a technological environment in which to apply a judicial exception (see MPEP 2106.05(h)));
- training the neural population coding network model based on the first target function to optimize the weight parameter of the neural population coding network model (this element does not integrate the abstract idea into a practical application because it recites insignificant extra-solution activity of training a model based on a target function (see MPEP 2106.05(g))).

Therefore, claim 9 is not integrated into a practical application.
Subject Matter Eligibility Analysis Step 2B: The additional elements of Claim 9, taken alone and in combination, do not provide significantly more than the abstract idea itself, because:

- a non-transitory computer readable storage medium having stored thereon one or more programs which, when executed by a computing device having one or more processor, cause the computing device to perform a data processing method uses a computer as a tool to perform the abstract idea and cannot provide significantly more (see MPEP 2106.05(f));
- obtaining raw data is the well-understood, routine, and conventional activity of "transmitting or receiving data over a network" (see MPEP 2106.05(d)(II); OIP Techs., 788 F.3d at 1363 (sending messages over a network));
- wherein the raw data is any of image data, voice data, and signal data from any of image recognition, natural language processing, voice recognition, and signal analysis specifies a particular technological environment to perform the abstract idea and cannot provide significantly more (see MPEP 2106.05(h));
- obtaining, based on the transformed data, a first target function comprising a first matrix is the well-understood, routine, and conventional activity of "transmitting or receiving data over a network" (see MPEP 2106.05(d)(II); OIP Techs., 788 F.3d at 1363 (sending messages over a network));
- the first target function is a target function of a neural population coding network model specifies a particular technological environment to perform the abstract idea and cannot provide significantly more (see MPEP 2106.05(h));
- the first matrix is a weight parameter of the target function of the neural population coding network model specifies a particular technological environment to perform the abstract idea and cannot provide significantly more (see MPEP 2106.05(h));
- training the neural population coding network model based on the first target function to optimize the weight parameter of the neural population coding network model is the well-understood, routine, and conventional activity of training a model based on a target function (Fan et al., page 5, paragraph 0002, as quoted in the analysis of claim 1 above).

Therefore, claim 9 is subject-matter ineligible.

Regarding Claims 10-14: Claims 10, 11, 12, 13, and 14 recite substantially similar limitations to claims 2, 3, 4, 5, and 6, respectively, and are therefore rejected under the same analysis.

Regarding Claim 15:

Subject Matter Eligibility Analysis Step 1: Claim 15 recites a processor configured to perform a data processing method and is thus a machine, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1: Claim 15 recites:

- performing a common spatial pattern transformation on the raw data to obtain transformed data (this limitation is a mental process, as it encompasses a human mentally performing a common spatial pattern transformation);
- updating the first matrix according to a preset gradient descent update rule, to obtain a second matrix, wherein the second matrix is a weight parameter of the first target function (this limitation is a mental process, as it encompasses a human mentally updating the first matrix according to a preset gradient descent update rule);
- updating the first target function based on the second matrix (this limitation is a mental process, as it encompasses a human mentally updating the first target function);
- the expression of the first target function is as follows: [equation image], where Q[C] is the first target function, C is the first matrix, [equation image], [equation image], and β and m are non-negative constants, and m is a margin parameter (this limitation is a mathematical concept, as it encompasses mathematical equations).

Therefore, claim 15 recites an abstract idea.

Subject Matter Eligibility Analysis Step 2A Prong 2: Claim 15 further recites additional elements of:

- a processor configured to perform a data processing method (this element does not integrate the abstract idea into a practical application because it amounts to mere "apply it on a computer" (see MPEP 2106.05(f)));
- obtaining raw data (this element does not integrate the abstract idea into a practical application because it recites insignificant extra-solution activity of data gathering (see MPEP 2106.05(g)));
- wherein the raw data is any of image data, voice data, and signal data from any of image recognition, natural language processing, voice recognition, and signal analysis (this element does not integrate the abstract idea into a practical application because it recites a technological environment in which to apply a judicial exception (see MPEP 2106.05(h)));
- obtaining, based on the transformed data, a first target function comprising a first matrix (this element does not integrate the abstract idea into a practical application because it recites insignificant extra-solution activity of data gathering (see MPEP 2106.05(g)));
- the first target function is a target function of a neural population coding network model (this element does not integrate the abstract idea into a practical application because it recites a technological environment in which to apply a judicial exception (see MPEP 2106.05(h)));
- the first matrix is a weight parameter of the target function of the neural population coding network model (this element does not integrate the abstract idea into a practical application because it recites a technological environment in which to apply a judicial exception (see MPEP 2106.05(h)));
- training the neural population coding network model based on the first target function to optimize the weight parameter of the neural population coding network model (this element does not integrate the abstract idea into a practical application because it recites insignificant extra-solution activity of training a model based on a target function (see MPEP 2106.05(g))).

Therefore, claim 15 is not integrated into a practical application.
Subject Matter Eligibility Analysis Step 2B: The additional elements of Claim 15, taken alone and in combination, do not provide significantly more than the abstract idea itself, because:

- a processor configured to perform a data processing method uses a computer as a tool to perform the abstract idea and cannot provide significantly more (see MPEP 2106.05(f));
- obtaining raw data is the well-understood, routine, and conventional activity of "transmitting or receiving data over a network" (see MPEP 2106.05(d)(II); OIP Techs., 788 F.3d at 1363 (sending messages over a network));
- wherein the raw data is any of image data, voice data, and signal data from any of image recognition, natural language processing, voice recognition, and signal analysis specifies a particular technological environment to perform the abstract idea and cannot provide significantly more (see MPEP 2106.05(h));
- obtaining, based on the transformed data, a first target function comprising a first matrix is the well-understood, routine, and conventional activity of "transmitting or receiving data over a network" (see MPEP 2106.05(d)(II); OIP Techs., 788 F.3d at 1363 (sending messages over a network));
- the first target function is a target function of a neural population coding network model specifies a particular technological environment to perform the abstract idea and cannot provide significantly more (see MPEP 2106.05(h));
- the first matrix is a weight parameter of the target function of the neural population coding network model specifies a particular technological environment to perform the abstract idea and cannot provide significantly more (see MPEP 2106.05(h));
- training the neural population coding network model based on the first target function to optimize the weight parameter of the neural population coding network model is the well-understood, routine, and conventional activity of training a model based on a target function (Fan et al., page 5, paragraph 0002, as quoted in the analysis of claim 1 above).

Therefore, claim 15 is subject-matter ineligible.

Regarding Claims 16-20: Claims 16, 17, 18, 19, and 20 recite substantially similar limitations to claims 2, 3, 4, 5, and 6, respectively, and are therefore rejected under the same analysis.

Allowable Subject Matter

Claims 1-7 and 9-20 would be allowable over the prior art of record if the §101 rejections are overcome in light of the instant amendments.
Specifically, regarding claim 1, the limitation "the expression of the first target function is as follows: [equation image] where Q[C] is the first target function, C is the first matrix, [equation image], [equation image], β and m are non-negative constants, and m is a margin parameter," in conjunction with the other limitations of the claims, is not taught by the prior art of record.

The closest prior art references are Deouell et al. (WO 2016/193979) ("Deouell"), Zhou ("Understanding the Convolutional Neural Networks with Gradient Descent and Backpropagation") ("Zhou"), and Wu ("Online and Offline Domain Adaptation for Reducing BCI Calibration Effort") ("Wu").

Deouell discloses a data processing method based on neural population coding (Deouell, page 23, lines 1-3 and 7-16); obtaining raw data and performing a common spatial pattern transformation (Deouell, page 23, lines 7-16 and 20-24); obtaining a first target function comprising a first matrix (Deouell, page 44, lines 13-19); and training the neural population coding network model based on the first target function (Deouell, page 44, lines 17-18). Deouell fails to explicitly disclose wherein the first matrix is a weight parameter of the target function, the expression of the first target function as recited in claim 1, updating the first matrix according to a preset gradient descent update rule to obtain a second matrix, updating the first target function, or training the model to optimize the weight parameter.
Zhou discloses that the first matrix is a weight parameter of the target function of the neural population coding network (Zhou, page 3); updating the first matrix according to a preset gradient descent update rule (Zhou, page 3); updating the first target function based on the second matrix (Zhou, page 3); and training the model based on the target function to optimize the weight parameter (Zhou, page 3). Zhou fails to disclose the expression of the first target function as recited in claim 1.

Wu discloses a brain-computer interface environment that minimizes a loss function in mathematical terms (Wu, page 3, first column, last paragraph), but fails to disclose the expression of the first target function as recited in claim 1. Therefore, the prior art of record does not disclose claim 1 as a whole.

Claims 2-7 are allowable at least due to their dependence on claim 1 if the §101 rejections are overcome. Claim 9 recites substantially similar limitations to claim 1 and is therefore allowable under the same rationale if the §101 rejections are overcome. Claims 10-14 are allowable at least due to their dependence on claim 9 if the §101 rejections are overcome. Claim 15 recites substantially similar limitations to claim 1 and is therefore allowable under the same rationale if the §101 rejections are overcome. Claims 16-20 are allowable at least due to their dependence on claim 15 if the §101 rejections are overcome.

Response to Arguments

On page 13, Applicant argues:

Firstly, the amended claims do not only provide the abstract idea. For example, the step of training the neural population coding network model based on the first target function relates to model training and optimization in computer science. Moreover, the raw data is obtained from applications such as image recognition, natural language processing, voice recognition, and signal analysis.
The method based on neural population coding can be used to train models in fields such as image recognition, natural language processing, voice recognition, and signal analysis. After the CSP transformation is performed, differences between different classes of raw data are preliminarily highlighted, so that further learning and training can subsequently be performed for classification to improve learning efficiency. Specifically, the application describes training the model associated with the first target function comprising a matrix (which refers to a weight parameter of the target function of the neural population coding network model), where the matrix is updated according to a gradient descent update rule. The training of the model to optimize the weight parameter is so complex that it should be performed by a computer device rather than the human mind.

Regarding Applicant's argument that the claims recite significantly more than an abstract idea, the Examiner respectfully disagrees. Specifically, the Examiner notes that the claim limitation "wherein the raw data is any of image data, voice data, and signal data from any of image recognition, natural language processing, voice recognition, and signal analysis" does not integrate the abstract idea into a practical application because it recites a technological environment in which to apply a judicial exception (see MPEP 2106.05(h)). Additionally, this limitation specifies a particular technological environment in which to perform the abstract idea and cannot provide significantly more (see MPEP 2106.05(h)).

The Examiner further notes that updating the first matrix according to a gradient descent update rule is an abstract idea, since it encompasses a human mentally updating a matrix according to a gradient descent update rule, and thus cannot provide a specific improvement.
Additionally, the limitation "the first matrix is a parameter of the target function of the neural population coding network model" specifies a particular technological environment in which to perform the abstract idea, where the technological environment is the composition of the target function, and thus cannot provide significantly more (see MPEP 2106.05(h)). Therefore, these limitations, inter alia, do not integrate the abstract idea into a practical application, nor do they provide significantly more.

Regarding Applicant's argument that training the model provides significantly more, the Examiner respectfully disagrees. Specifically, training the model based on the first target function to optimize the weight parameter recites the insignificant extra-solution activity of training a model based on a target function (see MPEP 2106.05(g)). Furthermore, training the model based on the first target function to optimize the weight parameter is the well-understood, routine, and conventional activity of training a model based on a target function (Fan et al., page 5, paragraph 0002). Thus, training the model based on the first target function does not provide significantly more.

On pages 13-14, Applicant argues:

Secondly, the § 101 rejection is rendered moot because the claims integrate the alleged abstract idea into a practical application.
For example, the above additional elements, together with the step of "training the neural population coding network model based on the first target function to optimize the weight parameter of the neural population coding network model," recite a specific manner of automatically and efficiently training the neural population coding network model, and further support the model training so as to implement and simplify the model training process, which provides a specific improvement over prior methods, thereby integrating the mental process asserted by the Examiner into a practical application. For at least the above reasons, Applicant respectfully requests that the rejection of claims 1-20 under 35 U.S.C. 101 be withdrawn.

Regarding Applicant's argument that the §101 rejection of the independent claims should be withdrawn, the Examiner respectfully disagrees. Specifically, the limitation "training the neural population coding network model based on the first target function to optimize the weight parameter of the neural population coding network model" is insignificant extra-solution activity of training a model based on a target function (see MPEP 2106.05(g)) and therefore cannot integrate the abstract idea into a practical application. This limitation is additionally a well-understood, routine, and conventional activity (Fan et al., page 5, paragraph 0002) and therefore cannot provide significantly more.

Regarding Applicant's argument that the dependent claims are allowable at least in part due to their dependency on the independent claims, the Examiner respectfully disagrees and notes the instant rejections and responses to arguments regarding the independent claims above.

Conclusion

The prior art made of record and not relied upon is considered pertinent to Applicant's disclosure. Lotte et al. ("A review of classification algorithms for EEG-based brain-computer interfaces: a 10 year update") also describes methods used with EEGs.
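For readers outside the BCI space, the pipeline the claims recite (a common spatial pattern transformation of raw multichannel trials, a target function over a weight matrix, and gradient-descent updates of that matrix) can be sketched in a few lines. Everything below is an illustrative assumption, not the claimed method: the synthetic two-class data, the log-variance features, and the plain mean-squared-error objective, which stands in for the claimed Q[C] whose actual expression appears in the record only as an image.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative "raw data": two classes of multichannel trials
# (channels x samples, e.g. EEG-like signals). All names are assumptions.
n_ch, n_t = 4, 200
scale = np.array([[3.0], [1.0], [1.0], [1.0]])   # class b: channel 0 is louder
trials_a = [rng.standard_normal((n_ch, n_t)) for _ in range(20)]
trials_b = [scale * rng.standard_normal((n_ch, n_t)) for _ in range(20)]

def avg_cov(trials):
    """Average trace-normalized spatial covariance over trials."""
    covs = [x @ x.T / np.trace(x @ x.T) for x in trials]
    return np.mean(covs, axis=0)

# Common spatial pattern (CSP) transformation: whiten the composite
# covariance, then diagonalize one class's covariance in whitened space.
c_a, c_b = avg_cov(trials_a), avg_cov(trials_b)
d, u = np.linalg.eigh(c_a + c_b)
p = np.diag(d ** -0.5) @ u.T                     # whitening matrix
lam, b = np.linalg.eigh(p @ c_a @ p.T)           # eigenvalues lie in (0, 1)
w_csp = b.T @ p                                  # transform: x -> w_csp @ x

# "Transformed data" reduced to log-variance features, one row per trial.
feats = np.array([np.log(np.var(w_csp @ x, axis=1))
                  for x in trials_a + trials_b])
labels = np.array([[1.0]] * 20 + [[-1.0]] * 20)

# Gradient-descent update of the weight matrix C. A mean-squared-error
# objective Q(C) = mean((feats @ C - labels)^2) stands in for the claimed
# Q[C], whose expression is not reproducible from the record.
def q(C):
    return float(np.mean((feats @ C - labels) ** 2))

C = np.zeros((n_ch, 1))                          # "first matrix"
q_init, eta = q(C), 0.01
for _ in range(500):
    grad = 2.0 / len(feats) * feats.T @ (feats @ C - labels)
    C = C - eta * grad                           # updated ("second") matrix
q_final = q(C)                                   # loss decreases from q_init
```

The CSP step is what the Applicant's "preliminary highlighting" argument describes: after whitening, the two class covariances sum to the identity, so the eigenvectors with extreme eigenvalues give spatial filters whose output variance differs most between the classes.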
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KAITLYN R HAEFNER, whose telephone number is (571) 272-1429. The examiner can normally be reached Monday through Thursday, 7:15 am to 5:15 pm EST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Michelle Bechtold, can be reached at (571) 431-0762. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/K.R.H./ Examiner, Art Unit 2148
/MICHELLE T BECHTOLD/ Supervisory Patent Examiner, Art Unit 2148

Prosecution Timeline

Dec 07, 2021
Application Filed
Apr 21, 2025
Non-Final Rejection — §101
Jul 25, 2025
Response Filed
Aug 20, 2025
Final Rejection — §101
Oct 27, 2025
Response after Non-Final Action
Nov 28, 2025
Request for Continued Examination
Dec 06, 2025
Response after Non-Final Action
Jan 13, 2026
Non-Final Rejection — §101 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602431
METHODS FOR PERFORMING INPUT-OUTPUT OPERATIONS IN A STORAGE SYSTEM USING ARTIFICIAL INTELLIGENCE AND DEVICES THEREOF
2y 5m to grant Granted Apr 14, 2026
Patent 12572828
METHOD FOR INDUSTRY TEXT INCREMENT AND ELECTRONIC DEVICE
2y 5m to grant Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 2 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
50%
Grant Probability
99%
With Interview (+66.7%)
4y 2m
Median Time to Grant
High
PTA Risk
Based on 4 resolved cases by this examiner. Grant probability derived from career allow rate.
