DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
The present application was filed on 10/18/2021. This action is in response to amendments and remarks filed on 11/25/2025. In the current amendments, claims 4 and 13 have been amended. Claims 1-18 are pending and have been examined. Claim 1 is the independent claim.
Response to Amendment
In response to amendments and remarks filed on 11/25/2025, claims 1-2, 6, and 13 are no longer being interpreted under 35 U.S.C. 112(f), and the rejections of claims 1-18 under 35 U.S.C. 112(a) and 35 U.S.C. 112(b), set forth in the previous Office action, have been withdrawn.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-18 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Claim 1:
Step 1: Claim 1 is directed to a controller circuit, which is directed to a machine, one of the statutory categories.
Step 2A Prong 1: The claim recites the limitations:
calculate the controller output signal from the controller input signal by summing at least a first signal depending on a current value of the controller input signal and a second signal generated…estimating an integral over time of the controller input signal - In the context of the claim limitation, this encompasses the mathematical concepts of summation and integration over time.
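For context (illustrative only; the symbols below are not recited claim language): summing a term proportional to the current value of the error signal with a term estimating the integral over time of the error signal corresponds to the classical proportional-integral (PI) control law,

```latex
u(t) = K_p \, e(t) + K_i \int_0^t e(\tau)\, d\tau
```

where e(t) denotes the controller input (error) signal, u(t) the controller output signal, and K_p, K_i are illustrative gain symbols.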
Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “a neural network”, “first neural network” – these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. See MPEP 2106.05(f). Regarding the “neural network”, no details of the neural network or its training are recited and the neural network is recited at a high level of generality and can be constructed by hand with pen and paper. The claimed “neural network”, under the broadest reasonable interpretation (BRI), in light of the specification, could be constructed by hand with pen and paper based on a reasonable amount of observed data (i.e., the “input signal”). The neural network is recited at a high level of generality and therefore is being interpreted as performing an abstract idea (mental process) on a generic computer. See MPEP 2106.04(a)(2) § III.C which states that “a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept” still recite a mental process. The claim also recites “a controller output signal for input to a nonlinear plant”, “a controller input signal representing an error in an output of the nonlinear plant”, which recite insignificant extra-solution activity of mere data gathering and output. MPEP 2106.05(g). Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element is directed to a mere instruction to apply the judicial exception. Mere instruction to apply a judicial exception does not amount to significantly more. See MPEP 2106.05(f). Furthermore, the recitations of “a controller output…”, “a controller input…” are directed to insignificant extra-solution activity that is well-understood, routine, and conventional because the limitations are directed to receiving or transmitting data over a network, e.g., using the Internet to gather data. See MPEP 2106.05(d)(II), OIP Techs., Inc., v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network). Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible.
Claim 2:
Step 1: Claim 2 is directed to a controller circuit, which is directed to a machine, one of the statutory categories.
Step 2A Prong 1: The claim recites the limitations:
calculate the controller output by summing the first signal and the second signal with a third signal generated at least in part by a second neural network estimating a differential of the controller input signal - In the context of the claim limitation, this encompasses a mathematical concept of summing.
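For context (illustrative only; the symbols below are not recited claim language): adding a third signal estimating a differential of the controller input signal extends the classical proportional-integral form to the full proportional-integral-derivative (PID) control law,

```latex
u(t) = K_p \, e(t) + K_i \int_0^t e(\tau)\, d\tau + K_d \frac{de(t)}{dt}
```

where e(t) denotes the controller input (error) signal, u(t) the controller output signal, and K_p, K_i, K_d are illustrative gain symbols.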
Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “neural network”, “by a second neural network” – these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. See MPEP 2106.05(f). Regarding the “neural network” and “by a second neural network”, no details of the neural networks or their training are recited and the neural networks are recited at a high level of generality and can be constructed by hand with pen and paper. The claimed “neural network” and “second neural network”, under the broadest reasonable interpretation (BRI), in light of the specification, could be constructed by hand with pen and paper based on a reasonable amount of observed data (i.e., the “input signal”). The neural networks are recited at a high level of generality and therefore are being interpreted as performing an abstract idea (mental process) on a generic computer. See MPEP 2106.04(a)(2) § III.C which states that “a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept” still recite a mental process. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element is directed to a mere instruction to apply the judicial exception. Mere instruction to apply a judicial exception does not amount to significantly more. See MPEP 2106.05(f). Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible.
Claim 3:
Step 1: Claim 3 is directed to a controller circuit, which is directed to a machine, one of the statutory categories.
Step 2A Prong 1: Please see analysis of claim 2.
Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “wherein at least one of the first and second neural networks is a recurrent neural network” – these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. See MPEP 2106.05(f). Regarding “the first and second neural networks” and “a recurrent neural network”, no details of the neural networks or their training are recited and “the first and second neural networks” and the “recurrent neural network” are recited at a high level of generality and can be constructed by hand with pen and paper. The claimed “first and second neural networks” and the “recurrent neural network”, under the broadest reasonable interpretation (BRI), in light of the specification, could be constructed by hand with pen and paper based on a reasonable amount of observed data (i.e., the “input signal”). The neural networks are recited at a high level of generality and therefore are being interpreted as performing an abstract idea (mental process) on a generic computer. See MPEP 2106.04(a)(2) § III.C which states that “a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept” still recite a mental process. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element is directed to a mere instruction to apply the judicial exception. Mere instruction to apply a judicial exception does not amount to significantly more. See MPEP 2106.05(f). Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible.
Claim 4:
Step 1: Claim 4 is directed to a controller circuit, which is directed to a machine, one of the statutory categories.
Step 2A Prong 1: Please see analysis of claim 3.
Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “wherein the second neural network is a recurrent neural network and a weighted version of the first signal is linked to an input of the second neural network” – these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. See MPEP 2106.05(f). Regarding “the second neural network” and “a recurrent neural network”, no details of the neural network or its training are recited and the neural network is recited at a high level of generality and can be constructed by hand with pen and paper. The claimed “second neural network” and “recurrent neural network”, under the broadest reasonable interpretation (BRI), in light of the specification, could be constructed by hand with pen and paper based on a reasonable amount of observed data (i.e., the “weighted version of the first signal”). The neural network is recited at a high level of generality and therefore is being interpreted as performing an abstract idea (mental process) on a generic computer. See MPEP 2106.04(a)(2) § III.C which states that “a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept” still recite a mental process. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element is directed to a mere instruction to apply the judicial exception. Mere instruction to apply a judicial exception does not amount to significantly more. See MPEP 2106.05(f). Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible.
Claim 5:
Step 1: Claim 5 is directed to a controller circuit, which is directed to a machine, one of the statutory categories.
Step 2A Prong 1: Please see analysis of claim 1.
Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “wherein the input weights to the first neural network are non-trainable weights” – these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. See MPEP 2106.05(f). Regarding the “the first neural network”, no details of the neural network or its training are recited and the neural network is recited at a high level of generality and can be constructed by hand with pen and paper. The claimed “the first neural network”, under the broadest reasonable interpretation (BRI), in light of the specification, could be constructed by hand with pen and paper based on a reasonable amount of observed data (i.e., the “input weights”). The neural network is recited at a high level of generality and therefore is being interpreted as performing an abstract idea (mental process) on a generic computer. See MPEP 2106.04(a)(2) § III.C which states that “a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept” still recite a mental process. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element is directed to a mere instruction to apply the judicial exception. Mere instruction to apply a judicial exception does not amount to significantly more. See MPEP 2106.05(f). Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible.
Claim 6:
Step 1: Claim 6 is directed to a controller circuit, which is directed to a machine, one of the statutory categories.
Step 2A Prong 1: The claim recites the limitations:
calculate the controller output signal by summing the first and second signals using trainable weights - In the context of the claim limitation, this encompasses a mathematical concept of summing signals using weights.
Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “the neural network” – these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. See MPEP 2106.05(f). Regarding the “neural network”, no details of the neural network or its training are recited and the neural network is recited at a high level of generality and can be constructed by hand with pen and paper. The claimed “neural network”, under the broadest reasonable interpretation (BRI), in light of the specification, could be constructed by hand with pen and paper based on a reasonable amount of observed data (i.e., the “input signal”). The neural network is recited at a high level of generality and therefore is being interpreted as performing an abstract idea (mental process) on a generic computer. See MPEP 2106.04(a)(2) § III.C which states that “a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept” still recite a mental process. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element is directed to a mere instruction to apply the judicial exception. Mere instruction to apply a judicial exception does not amount to significantly more. See MPEP 2106.05(f). Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible.
Claim 7:
Step 1: Claim 7 is directed to a controller circuit, which is directed to a machine, one of the statutory categories.
Step 2A Prong 1: Please see analysis of claim 2.
Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “wherein the neural network further comprises at least one transfer neural network having at least one output from the first and second neural networks as an input, the calculated controller output signal being based on the output of the at least one transfer neural network” – these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. See MPEP 2106.05(f). Regarding “the neural network” and the “transfer neural network”, no details of the neural network, the transfer neural network, or their training are recited and the neural networks are recited at a high level of generality and can be constructed by hand with pen and paper. The claimed “neural network” and “transfer neural network”, under the broadest reasonable interpretation (BRI), in light of the specification, could be constructed by hand with pen and paper based on a reasonable amount of observed data (i.e., the “output from the first and second neural networks as an input”). The neural networks are recited at a high level of generality and therefore are being interpreted as performing an abstract idea (mental process) on a generic computer. See MPEP 2106.04(a)(2) § III.C which states that “a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept” still recite a mental process. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element is directed to a mere instruction to apply the judicial exception. Mere instruction to apply a judicial exception does not amount to significantly more. See MPEP 2106.05(f). Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible.
Claim 8:
Step 1: Claim 8 is directed to a controller circuit, which is directed to a machine, one of the statutory categories.
Step 2A Prong 1: The claim recites the limitations:
at least one rectified linear unit transfer function layer transforming an output x from one of the first, second, and third neurons to an output y of the at least one rectified linear unit transfer function layer according to
[Image: media_image1.png (greyscale) – equation defining the recited rectified linear unit transfer function]
- In the context of the claim limitation, this encompasses a mathematical concept of evaluating a transfer function.
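For reference, the standard rectified linear unit (ReLU) transfer function is

```latex
y = \max(0, x)
```

The exact recited expression appears only in the reproduced image and may include additional weight and bias terms (cf. the vectors w and b recited in claim 9); the form given here is illustrative.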
Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “at least one transfer neural network” – these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. See MPEP 2106.05(f). Regarding the “transfer neural network”, no details of the neural network or its training are recited and the neural network is recited at a high level of generality and can be constructed by hand with pen and paper. The claimed “transfer neural network”, under the broadest reasonable interpretation (BRI), in light of the specification, could be constructed by hand with pen and paper based on a reasonable amount of observed data (i.e., the “output from the first and second neural networks as an input”). The neural network is recited at a high level of generality and therefore is being interpreted as performing an abstract idea (mental process) on a generic computer. See MPEP 2106.04(a)(2) § III.C which states that “a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept” still recite a mental process. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element is directed to a mere instruction to apply the judicial exception. Mere instruction to apply a judicial exception does not amount to significantly more. See MPEP 2106.05(f). Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible.
Claim 9:
Step 1: Claim 9 is directed to a controller circuit, which is directed to a machine, one of the statutory categories.
Step 2A Prong 1: The claim recites the limitations:
wherein at least one of the vectors w and b comprises trainable parameters - In the context of the claim limitation, this encompasses a mathematical concept of vectors of trainable parameters.
Step 2A Prong 2: Please see analysis of claim 8.
Step 2B Analysis: Please see analysis of claim 8.
Claim 10:
Step 1: Claim 10 is directed to a controller circuit, which is directed to a machine, one of the statutory categories.
Step 2A Prong 1: Please see analysis of claim 8.
Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “wherein the at least one transfer neural network comprises three rectified linear unit transfer function layers corresponding to the first, second, and third signals, respectively” – these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. See MPEP 2106.05(f). Regarding the “transfer neural network”, no details of the neural network or its training are recited and the neural network is recited at a high level of generality and can be constructed by hand with pen and paper. The claimed “transfer neural network”, under the broadest reasonable interpretation (BRI), in light of the specification, could be constructed by hand with pen and paper based on a reasonable amount of observed data (i.e., the “output from the first and second neural networks as an input”). The neural network is recited at a high level of generality and therefore is being interpreted as performing an abstract idea (mental process) on a generic computer. See MPEP 2106.04(a)(2) § III.C which states that “a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept” still recite a mental process. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element is directed to a mere instruction to apply the judicial exception. Mere instruction to apply a judicial exception does not amount to significantly more. See MPEP 2106.05(f). Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible.
Claim 11:
Step 1: Claim 11 is directed to a controller circuit, which is directed to a machine, one of the statutory categories.
Step 2A Prong 1: Please see analysis of claim 7.
Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “wherein the at least one transfer neural network comprises any one or more of any of: a leaky rectified linear unit transfer function layer; a parametric rectified linear unit transfer function layer; and a Gaussian error linear unit” – these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. See MPEP 2106.05(f). Regarding the “transfer neural network”, aside from reciting the types of transfer function layers that the at least one transfer neural network may comprise, no further details of the neural network or its training are recited and the neural network is recited at a high level of generality and can be constructed by hand with pen and paper. The claimed “transfer neural network”, under the broadest reasonable interpretation (BRI), in light of the specification, could be constructed by hand with pen and paper based on a reasonable amount of observed data (i.e., the “output from the first and second neural networks as an input”). The neural network is recited at a high level of generality and therefore is being interpreted as performing an abstract idea (mental process) on a generic computer. See MPEP 2106.04(a)(2) § III.C which states that “a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept” still recite a mental process. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element is directed to a mere instruction to apply the judicial exception. Mere instruction to apply a judicial exception does not amount to significantly more. See MPEP 2106.05(f). Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible.
Claim 12:
Step 1: Claim 12 is directed to a controller circuit, which is directed to a machine, one of the statutory categories.
Step 2A Prong 1: The claim recites the limitations:
a layer clamping a sum formed from at least the first and second signals to a predetermined range - In the context of the claim limitation, this encompasses a mathematical concept of clamping a sum to a predetermined range.
Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “the neural network” – these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. See MPEP 2106.05(f). Regarding the “neural network”, aside from reciting that “the neural network further comprises a layer clamping a sum formed from at least the first and second signals to a predetermined range”, no further details of the neural network or its training are recited and the neural network is recited at a high level of generality and can be constructed by hand with pen and paper. The claimed “neural network”, under the broadest reasonable interpretation (BRI), in light of the specification, could be constructed by hand with pen and paper based on a reasonable amount of observed data (i.e., the “input signal”). The neural network is recited at a high level of generality and therefore is being interpreted as performing an abstract idea (mental process) on a generic computer. See MPEP 2106.04(a)(2) § III.C which states that “a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept” still recite a mental process. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element is directed to a mere instruction to apply the judicial exception. Mere instruction to apply a judicial exception does not amount to significantly more. See MPEP 2106.05(f). Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible.
Claim 13:
Step 1: Claim 13 is directed to a controller circuit, which is directed to a machine, one of the statutory categories.
Step 2A Prong 1: Please see analysis of independent claim 1.
Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “the neural network comprises a feedback signal, based on a sum formed from at least the first and second signals, fed into the first neural network and configured to prevent integral windup in the first neural network” – these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. See MPEP 2106.05(f). Regarding the “neural network”, aside from reciting that “the neural network comprises a feedback signal, based on a sum formed from at least the first and second signals, fed into the first neural network and configured to prevent integral windup in the first neural network”, no further details of the neural network or its training are recited and the neural network is recited at a high level of generality and can be constructed by hand with pen and paper. The claimed “neural network”, under the broadest reasonable interpretation (BRI), in light of the specification, could be constructed by hand with pen and paper based on a reasonable amount of observed data (i.e., the “data points”). The neural network is recited at a high level of generality and therefore is being interpreted as performing an abstract idea (mental process) on a generic computer. See MPEP 2106.04(a)(2) § III.C which states that “a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept” still recite a mental process. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
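For context (illustrative only; the symbols below are not recited claim language): a classical back-calculation anti-windup scheme feeds the difference between the saturated and unsaturated controller outputs back into the integral term,

```latex
\dot{e}_I(t) = e(t) + \frac{1}{T_t}\bigl(u_{\mathrm{sat}}(t) - u(t)\bigr)
```

where e_I is the integrated error, u and u_sat are the controller output before and after saturation, and T_t is an illustrative tracking time constant.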
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element is directed to a mere instruction to apply the judicial exception. Mere instruction to apply a judicial exception does not amount to significantly more. See MPEP 2106.05(f). Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible.
Claim 14:
Step 1: Claim 14 is directed to a controller circuit, which is directed to a machine, one of the statutory categories.
Step 2A Prong 1: Please see analysis of independent claim 1.
Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “nonlinear plant is a power converter circuit” – these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. See MPEP 2106.05(f). Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element is directed to a mere instruction to apply the judicial exception. Mere instruction to apply a judicial exception does not amount to significantly more. See MPEP 2106.05(f). Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible.
Claim 15:
Step 1: Claim 15 is directed to a method of training a controller circuit, which is directed to a process, one of the statutory categories.
Step 2A Prong 1: The claim recites the limitations:
tuning trainable weights…using a reward function - In the context of the claim limitation, this encompasses a mental process of evaluating the weights according to the reward function.
Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “of the neural network” – these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. See MPEP 2106.05(f). Regarding the “neural network”, no details of the neural network or its training are recited and the neural network is recited at a high level of generality and can be constructed by hand with pen and paper. The claimed “neural network”, under the broadest reasonable interpretation (BRI), in light of the specification, could be constructed by hand with pen and paper based on a reasonable amount of observed data (i.e., the “data points”). The neural network is recited at a high level of generality and therefore is being interpreted as performing an abstract idea (mental process) on a generic computer. See MPEP 2106.04(a)(2) § III.C which states that “a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept” still recite a mental process. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element is directed to a mere instruction to apply the judicial exception. Mere instruction to apply a judicial exception does not amount to significantly more. See MPEP 2106.05(f). Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible.
Claim 16:
Step 1: Claim 16 is directed to a method of training a controller circuit, which is directed to a process, one of the statutory categories.
Step 2A Prong 1: The claim recites the limitations:
wherein the reward function is of the form:
[media_image2.png: reward function equation]
- In the context of the claim limitation, this encompasses a mathematical concept of calculating a reward function.
Step 2A Prong 2: Please see analysis of claim 15.
Step 2B Analysis: Please see analysis of claim 15.
Claim 17:
Step 1: Claim 17 is directed to a method of training a controller circuit, which is directed to a process, one of the statutory categories.
Step 2A Prong 1: Please see analysis of claim 15.
Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “wherein the neural network comprises a transfer neural network and wherein the method comprises: training the neural network using one of a leaky rectified linear unit transfer function layer, a parametric rectified linear unit transfer function layer, and a Gaussian error linear unit for the transfer neural network; and using a rectified linear unit transfer function for the transfer neural network for subsequent operation of the neural network, using weights obtained from the training” – these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. See MPEP 2106.05(f). Regarding the “neural network”, no details of the neural network or its training are recited and the neural network is recited at a high level of generality and can be constructed by hand with pen and paper. The claimed “neural network”, under the broadest reasonable interpretation (BRI), in light of the specification, could be constructed by hand with pen and paper based on a reasonable amount of observed data (i.e., the “data points”). The neural network is recited at a high level of generality and therefore is being interpreted as performing an abstract idea (mental process) on a generic computer. See MPEP 2106.04(a)(2) § III.C which states that “a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept” still recite a mental process. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element is directed to a mere instruction to apply the judicial exception. Mere instruction to apply a judicial exception does not amount to significantly more. See MPEP 2106.05(f). Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible.
Claim 18:
Step 1: Claim 18 is directed to a method of training a controller circuit, which is directed to a process, one of the statutory categories.
Step 2A Prong 1: Please see analysis of claim 17.
Step 2A Prong 2: This judicial exception is not integrated into a practical application. The claim further recites “performing additional training of the neural network using the rectified linear unit transfer function for the transfer neural network” – these are mere instructions to apply the judicial exception using a generic computer programmed with a generic class of computer algorithm. See MPEP 2106.05(f). Regarding the “neural network”, no details of the neural network or its training are recited and the neural network is recited at a high level of generality and can be constructed by hand with pen and paper. The claimed “neural network”, under the broadest reasonable interpretation (BRI), in light of the specification, could be constructed by hand with pen and paper based on a reasonable amount of observed data (i.e., the “data points”). The neural network is recited at a high level of generality and therefore is being interpreted as performing an abstract idea (mental process) on a generic computer. See MPEP 2106.04(a)(2) § III.C which states that “a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept” still recite a mental process. Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Step 2B Analysis: The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element is directed to a mere instruction to apply the judicial exception. Mere instruction to apply a judicial exception does not amount to significantly more. See MPEP 2106.05(f). Nothing in the claim provides significantly more than this. As such, the claim is not patent eligible.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1, 6 and 15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Hwang (“Reinforcement learning to adaptive control of nonlinear systems”).
Claim 1.
Hwang teaches a controller circuit, comprising a controller output signal for input to a nonlinear plant (Pages 514-515 & SECTION I. Introduction “In the linearizing scheme, the input v is a training signal, the control signal u controls an identified nonlinear plant to approximate linearization, the yd is the output of linear reference model, and y is the output of the plant…The structured ANNs can drive a nonlinear plant to behave like the reference model” teaches controller signal is the input of the nonlinear plant):
and a neural network configured to calculate the controller output signal from the controller input signal by summing at least a first signal depending on a current value of the controller input signal and a second signal generated at least in part by a first neural network estimating an integral over time of the controller input signal (Page 517 & III. LEARNING ALGORITHM OF RLLS “
[media_image3.png: RLLS weight-update equations, Hwang Page 517]
” teaches neural network weights are updated and Page 517 & III. LEARNING ALGORITHM OF RLLS “1) Observe the current state vector
[media_image4.png: current state vector expression, Hwang Page 517]
. 2) Use the EP to have utility index of the state s: p ← utility(s). 3) Use the LC to select an action merit û. 4) Use the RP to perform the reinforcement merit r̃. 5) Execute the action u ∼ ψ(û, σ), where σ = 1/(1 + e^(−r̃)). 6) Observe the successive new state s(t+1) and reinforcement r = y_d − y. 7) Use the EP to perform utility index of next state vector s(t+1): p_(t+1) ← utility(s(t+1)). 8) p′ ← r + γp_(t+1). 9) Adjust the weights of the RP by the back-propagating reinforcement error e_r = r − r̃” teaches that the linearizing controller (LC) and reinforcement predictor (RP) select an action corresponding to the current value of the input, and that the evaluation predictor (EP) is a long-term policy corresponding to integration over time, which provides the second signal).
Claim 6.
Hwang teaches the controller circuit of claim 1.
Hwang further teaches wherein the neural network is configured to calculate the controller output signal by summing the first and second signals using trainable weights (II. STRUCTURE OF RLLS & Page 515-516 “one needs to estimate the gradient ∂r/∂y; then, the output weights can be trained by the usual delta rule, and the usual gradient-based supervised learning algorithm can train the remaining weights… A simple way is to let û equal a weighted sum of the input of the LC network at time t” teaches calculating the output signal as a weighted sum of the inputs using trainable weights).
Claim 15.
Hwang teaches a method of training a controller circuit according to claim 1,
Hwang further teaches the method comprising: tuning trainable weights of the neural network using a reward function (SECTION II. Structure of RLLS “the EP network adjusts its weights using (5). The weights of the LC network are also adjusted according to the same TD error…The external reinforcement signal r(t) provides the same rough information, and the heuristic reinforcement signal r^ supplies the processed signal to the LC to choose a higher action merit” teaches updating the weights, corresponding to tuning the weights of the network using the reward function).
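For illustration only (this sketch is not part of the record and is not Hwang's actual RLLS network): the TD-error-driven weight adjustment quoted above follows the general temporal-difference pattern, in which a reward signal tunes a trainable weight. The linear value estimate, learning rate, and transition values below are assumptions made for the example:

```python
def td_update(w, state, next_state, reward, alpha=0.1, gamma=0.9):
    """One temporal-difference weight update for a linear value
    estimate v(s) = w * s; the TD error plays the role of the
    reward-derived training signal."""
    td_error = reward + gamma * (w * next_state) - (w * state)
    return w + alpha * td_error * state

w = 0.0
# Repeated rewarded transitions tune the weight toward the value 1.0.
for _ in range(10):
    w = td_update(w, state=1.0, next_state=0.0, reward=1.0)
print(round(w, 4))  # 0.6513 after ten updates (w_n = 1 - 0.9**n)
```

The update rule here is the textbook TD(0) form, chosen only to illustrate reward-driven weight tuning in general, not the specific RLLS update of equation (5) in Hwang.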
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 2-5 and 7 are rejected under 35 U.S.C. 103 as being unpatentable over Hwang (“Reinforcement learning to adaptive control of nonlinear systems”) in view of Simard (US10671908B2).
Claim 2.
Hwang teaches the controller circuit of claim 1.
Hwang does not explicitly teach wherein the neural network is configured to calculate the controller output by summing the first signal and the second signal with a third signal generated at least in part by a second neural network estimating a differential of the controller input signal.
However, Simard teaches wherein the neural network is configured to calculate the controller output by summing the first signal and the second signal with a third signal generated at least in part by a second neural network estimating a differential of the controller input signal (SUMMARY & Page 23, Column 1 “For each state being stored, the state component sub-program modifies and stores a current state by adding the previous stored state to a corresponding element of a state contribution vector output by a trainable transition and differential non-linearity component sub-program using the associated state loop and adder each time an input vector is input into the differential RNN” teaches a differential RNN that estimates differential components of the controller input, corresponding to summing signals generated by multiple neural networks).
Hwang and Simard are analogous art because they are both directed to using a neural network to contribute to a control signal and to sum outputs of trainable weights.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the limitation(s) above as taught by Simard into the disclosed invention of Hwang.
One of ordinary skill in the art would have been motivated to make this modification because the DRNN has been shown “to improve the convergence of the gradient descent method by increasing the curvature of the Hessian of the objective function in regions where it would otherwise be flat” (Simard, Page 24, Column 4 & SUMMARY).
Claim 3.
Hwang in view of Simard teaches the controller circuit of claim 2.
Simard further teaches wherein at least one of the first and second neural networks is a recurrent neural network (SUMMARY & Page 23, 1st column “Differential recurrent neural network (RNN) implementations described herein generally concern a type of neural network that handles dependencies that go arbitrarily far in time by allowing the network system to store states using recurrent loops” teaches a differential recurrent neural network).
Hwang and Simard are analogous art because they are both directed to using a neural network to contribute to a control signal and to sum outputs of trainable weights.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the limitation(s) above as taught by Simard into the disclosed invention of Hwang.
One of ordinary skill in the art would have been motivated to make this modification because the DRNN has been shown “to improve the convergence of the gradient descent method by increasing the curvature of the Hessian of the objective function in regions where it would otherwise be flat” (Simard, Page 24, Column 4 & SUMMARY).
Claim 4.
Hwang in view of Simard teaches the controller circuit of claim 3.
Simard further teaches wherein the second neural network is a recurrent neural network and a weighted version of the first signal is linked to an input of the second neural network (SUMMARY & Page 24, 4th column “The positive and negative contribution gradient vectors are fed from the output of the trainable transition component of the differential RNN. The trainable transition component includes a neural network which is trained by using the gradient vector signals from its output to modify the weight matrix of the neural network via a backpropagation procedure” teaches a differential RNN trained by using gradient signals to modify its weight matrix via backpropagation, corresponding to a weighted version of the first signal linked to the input of the second recurrent neural network).
Hwang and Simard are analogous art because they are both directed to using a neural network to contribute to a control signal and to sum outputs of trainable weights.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the limitation(s) above as taught by Simard into the disclosed invention of Hwang.
One of ordinary skill in the art would have been motivated to make this modification because the DRNN has been shown “to improve the convergence of the gradient descent method by increasing the curvature of the Hessian of the objective function in regions where it would otherwise be flat” (Simard, Page 24, Column 4 & SUMMARY).
Claim 5.
Hwang teaches the controller circuit of claim 1.
Simard further teaches wherein the input weights to the first neural network are non-trainable weights (Page 25 & Column 6 “If the function F that computes the next state x_(t+1) as a function of the previous state x_t and an input i_t is defined by: x_(t+1) = F(W, x_t, i_t) (1) Stability around a fixed point a = F(W, a, i)
[media_image5.png: stability condition equation, Simard Page 25, Column 6]
” teaches a fixed point of F, corresponding to non-trainable weights to the neural network).
Hwang and Simard are analogous art because they are both directed to using a neural network to contribute to a control signal and to sum outputs of trainable weights.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the limitation(s) above as taught by Simard into the disclosed invention of Hwang.
One of ordinary skill in the art would have been motivated to make this modification because the DRNN has been shown “to improve the convergence of the gradient descent method by increasing the curvature of the Hessian of the objective function in regions where it would otherwise be flat” (Simard, Page 24, Column 4 & SUMMARY).
Claim 7.
Hwang in view of Simard teaches the controller circuit of claim 2.
Simard further teaches wherein the neural network further comprises at least one transfer neural network having at least one output from the first and second neural networks as an input, the calculated controller output signal being based on the output of the at least one transfer neural network (SUMMARY & Page 24, 1st column “the differential RNN includes a state component sub-program for storing states. This state component sub-program includes a state loop with an adder for each state. For each state being stored, the state component sub-program modifies and stores a current state by adding the previous stored state to a corresponding element of a state contribution vector output by a trainable transition and differential non-linearity component sub-program using the associated state loop and adder each time an input vector is input into the differential RNN. During backpropagation, the state component sub-program accumulates gradients of a sequence used to train the differential RNN by adding them to the previous stored gradient and storing the new gradient at each time step starting from the end of the sequence” teaches a state loop for each state combining signals, corresponding to the transfer neural network).
Hwang and Simard are analogous art because they are both directed to using a neural network to contribute to a control signal and to sum outputs of trainable weights.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the limitation(s) above as taught by Simard into the disclosed invention of Hwang.
One of ordinary skill in the art would have been motivated to make this modification because the DRNN has been shown “to improve the convergence of the gradient descent method by increasing the curvature of the Hessian of the objective function in regions where it would otherwise be flat” (Simard, Page 24, Column 4 & SUMMARY).
Claims 8 and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Hwang (“Reinforcement learning to adaptive control of nonlinear systems”) in view of Simard (US10671908B2) and further in view of Kang (US20210264247A1).
Claim 8.
Hwang in view of Simard teaches the controller circuit of claim 7,
Hwang in view of Simard does not explicitly teach wherein the at least one transfer neural network comprises at least one rectified linear unit transfer function layer transforming an output x from one of the first, second, and third neurons to an output y of the at least one rectified linear unit transfer function layer according to:
[media_image6.png: rectified linear unit transfer function equation].
However, Kang teaches wherein the at least one transfer neural network comprises at least one rectified linear unit transfer function layer transforming an output x from one of the first, second, and third neurons to an output y of the at least one rectified linear unit transfer function layer according to:
[media_image6.png: rectified linear unit transfer function equation]
(Para [0024] “In the context of artificial neural networks, an ReLU provides an activation function that is generally referred to as “rectifier”, which is defined as the positive part of its argument: f(x)=x+=max(0,x), where x is the input to a neuron (102-110). FIG. 2 depicts a ReLU 210 according to one or more embodiments of the present invention. Here, consider that the ReLU 210 receives a vector {right arrow over (x)} as an input and {right arrow over (w)} is a vector of weight assigned to a neuron associated with the ReLU 210. The ReLU 210 computes an output y as a scalar dot product of the input {right arrow over (x)} and the weights {right arrow over (w)}. However, the ReLU 210 only outputs a positive y; if the product of {right arrow over (x)} and {right arrow over (w)} results in a negative value, the output y is 0 (zero)” and Figure 2 teaches a rectified linear unit with weights and bias that returns the positive value).
Hwang, Simard, and Kang are analogous art because they are each directed to neural networks with trainable weights and biases.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the limitation(s) above as taught by Kang into the disclosed invention of Hwang in view of Simard.
One of ordinary skill in the art would have been motivated to make this modification because deep neural networks provide improvements in many tasks such as “large-category image classification and recognition; speech recognition, and nature language processing. Neural networks have demonstrated an ability to learn such skills as face recognition, reading, and the detection of simple grammatical structure” (Kang, Para [0002]).
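For illustration only (not part of the record): the rectifier quoted from Kang's Para [0024] — f(x) = max(0, x) applied to the dot product of an input vector and a weight vector — can be sketched as follows; the example vectors are hypothetical:

```python
def relu_neuron(x, w):
    """ReLU neuron per the quoted rectifier: the dot product of the
    input x and weights w is passed through f(s) = max(0, s)."""
    s = sum(xi * wi for xi, wi in zip(x, w))
    return max(0.0, s)

print(relu_neuron([1.0, 2.0], [0.5, 0.25]))   # 1.0 (positive product passes through)
print(relu_neuron([1.0, 2.0], [-1.0, 0.0]))   # 0.0 (negative product rectified to zero)
```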
Claim 9.
Hwang in view of Simard further in view of Kang teaches the controller circuit of claim 8,
Kang further teaches wherein at least one of the vectors w and b comprises trainable parameters (Para [0027] “sb provides output of adder tree 320 at given cycle b. The final dot product can be represented as {right arrow over (x)}·{right arrow over (w)} = s_0 + 2s_1 + … + 2^(B−1)s_(B−1). Alternatively, or in addition, a total accumulated value at the adder tree 320 at any given cycle b can be represented as S_b = 2^(B−b−1)s_(B−1) + 2^(B−b−2)s_(B−2) + … s_b” and Figure 3 teaches that the weights w and b are trainable parameters).
Hwang, Simard, and Kang are analogous art because they are each directed to using trainable weights and biases.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the limitation(s) above as taught by Kang into the disclosed invention of Hwang in view of Simard.
One of ordinary skill in the art would have been motivated to make this modification because deep neural networks provide improvements in many tasks such as “large-category image classification and recognition; speech recognition, and nature language processing. Neural networks have demonstrated an ability to learn such skills as face recognition, reading, and the detection of simple grammatical structure” (Kang, Para [0002]).
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Hwang (“Reinforcement learning to adaptive control of nonlinear systems”) in view of Kang (US20210264247A1).
Claim 12.
Hwang teaches the controller circuit of claim 1,
Hwang does not explicitly teach wherein the neural network further comprises a layer clamping a sum formed from at least the first and second signals to a predetermined range.
However, Kang teaches wherein the neural network further comprises a layer clamping a sum formed from at least the first and second signals to a predetermined range (Para [0019] “A neuron generally is a part of a neural network computer system that determines an output based on one or more inputs (that can be weighted), and the neuron can determine this output based on determining the output of an activation function with the possibly-weighted inputs… sigmoid, which produces an output that ranges between 0 and 1” teaches a control output ranging between 0 and 1, which indicates the signals are bounded to a predetermined range).
Hwang and Kang are analogous art because they are each directed to using trainable weights and biases.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the limitation(s) above as taught by Kang into the disclosed invention of Hwang.
One of ordinary skill in the art would have been motivated to make this modification because deep neural networks provide improvements in many tasks such as “large-category image classification and recognition; speech recognition, and nature language processing. Neural networks have demonstrated an ability to learn such skills as face recognition, reading, and the detection of simple grammatical structure” (Kang, Para [0002]).
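For illustration only (not part of the record): the bounding behavior of the sigmoid activation cited from Kang's Para [0019] — mapping any summed input into the range (0, 1) — can be sketched as below; the sample inputs are hypothetical:

```python
import math

def sigmoid(s):
    """Logistic sigmoid: maps any real-valued sum into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-s))

# Large positive or negative sums remain bounded in (0, 1),
# i.e., the summed signal is clamped to a predetermined range.
for s in (-20.0, -1.0, 0.0, 1.0, 20.0):
    assert 0.0 < sigmoid(s) < 1.0
print(sigmoid(0.0))  # 0.5
```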
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Hwang (“Reinforcement learning to adaptive control of nonlinear systems”) in view of Simard (US10671908B2) in view of Kang (US20210264247A1) and further in view of Hendrycks (“Bridging Nonlinearities and Stochastic Regularizers with Gaussian Error Linear Units”).
Claim 10.
Hwang in view of Simard further in view of Kang teaches the controller circuit of claim 8.
Hwang in view of Simard further in view of Kang does not explicitly teach wherein the at least one transfer neural network comprises three rectified linear unit transfer function layers corresponding to the first, second, and third signals, respectively.
However, Hendrycks teaches wherein the at least one transfer neural network comprises three rectified linear unit transfer function layers corresponding to the first, second, and third signals, respectively (Abstract & Page 1 “We perform an empirical evaluation of the GELU nonlinearity against the ReLU and ELU activations and find performance improvements across all tasks” and Figure 1: The Gaussian Error Linear Unit (µ = 0, σ = 1), the Rectified Linear Unit, and the Exponential Linear Unit (α = 1) teaches a neural network comprising three activation layers: a Gaussian error linear unit, a rectified linear unit, and an exponential linear unit).
Hwang, Simard, Kang and Hendrycks are analogous art because they are each directed to nonlinear plant processes that use a NN.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the limitation(s) above as taught by Hendrycks into the disclosed invention of Hwang in view of Simard further in view of Kang.
One of ordinary skill in the art would have been motivated to make this modification because Hendrycks reports: “We perform an empirical evaluation of the GELU nonlinearity against the ReLU and ELU activations and find performance improvements across all tasks” (Hendrycks, Page 1 & Abstract).
Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Hwang (“Reinforcement learning to adaptive control of nonlinear systems”) in view of Simard (US10671908B2) and further in view of Hendrycks (“Bridging Nonlinearities and Stochastic Regularizers with Gaussian Error Linear Units”).
Claim 11.
Hwang in view of Simard teaches the controller circuit of claim 7,
Hwang in view of Simard does not explicitly teach wherein the at least one transfer neural network comprises any one or more of any of: a leaky rectified linear unit transfer function layer; a parametric rectified linear unit transfer function layer; and a Gaussian error linear unit.
However, Hendrycks teaches wherein the at least one transfer neural network comprises any one or more of any of: a leaky rectified linear unit transfer function layer; a parametric rectified linear unit transfer function layer; and a Gaussian error linear unit (Figure 1 “Figure 1: The Gaussian Error Linear Unit (µ = 0, σ = 1), the Rectified Linear Unit, and the Exponential Linear Unit (α = 1)” teaches the gaussian error linear unit).
Hwang, Simard and Hendrycks are analogous art because they are each directed to nonlinear plant processes that use a neural network.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the limitation(s) above as taught by Hendrycks into the disclosed invention of Hwang in view of Simard.
One of ordinary skill in the art would have been motivated to make this modification because of the following: "We perform an empirical evaluation of the GELU nonlinearity against the ReLU and ELU activations and find performance improvements across all tasks" (Hendrycks, Page 1 & Abstract).
Claims 13-14 are rejected under 35 U.S.C. 103 as being unpatentable over Hwang (“Reinforcement learning to adaptive control of nonlinear systems”) in view of Sekiguchi (US20220051097A1).
Claim 13.
Hwang teaches the controller circuit of claim 1,
Hwang does not explicitly teach wherein the neural network comprises a feedback signal, based on a sum formed from at least the first and second signals, fed into the first neural network and configured to prevent integral windup in the first neural network.
However, Sekiguchi teaches wherein the neural network comprises a feedback signal, based on a sum formed from at least the first and second signals, fed into the first neural network and configured to prevent integral windup in the first neural network (Para [0008], "A control device according to the present disclosure controls an output voltage of a converter. The control device includes: a neural network configured to generate a control signal for controlling a power supply block based on a detection signal from an output stage of the power supply block that supplies power to a load of the converter; a model generator configured to generate a model of a nonlinear dynamic system by machine-learning from the detection signal," and Para [0103], "The PID model part performs a feedback control by using coefficients of a deviation from a target value and integration and differentiation of the deviation as parameters," teach a neural network that receives multiple summed signals and uses them to generate a control signal in a power converter).
Hwang and Sekiguchi are analogous art because they are each directed to using neural networks in a controller to control mechanisms.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the limitation(s) above as taught by Sekiguchi into the disclosed invention of Hwang.
One of ordinary skill in the art would have been motivated to make this modification because of the following: "due to the galvanic insulation, noise generated in the power supply block 30 is blocked by the galvanic insulation, so that noise resistance of the control block 20 is improved" (Sekiguchi, Para [0096]).
Claim 14.
Hwang teaches the controller circuit of claim 1,
Hwang does not explicitly teach wherein the nonlinear plant is a power converter circuit.
However, Sekiguchi teaches wherein the nonlinear plant is a power converter circuit (Para [0008], "a neural network configured to generate a control signal for controlling a power supply block based on a detection signal from an output stage of the power supply block that supplies power to a load of the converter; a model generator configured to generate a model of a nonlinear dynamic system by machine-learning from the detection signal," teaches a power supply block (a power converter circuit)).
Hwang and Sekiguchi are analogous art because they are each directed to using neural networks in a controller to control mechanisms.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the limitation(s) above as taught by Sekiguchi into the disclosed invention of Hwang.
One of ordinary skill in the art would have been motivated to make this modification because of the following: "due to the galvanic insulation, noise generated in the power supply block 30 is blocked by the galvanic insulation, so that noise resistance of the control block 20 is improved" (Sekiguchi, Para [0096]).
Response to Arguments
Applicant's arguments filed on 11/25/2025 with respect to 35 U.S.C. 101 rejections of claims 1-18 have been fully considered but they are not persuasive.
With respect to the 35 U.S.C. 101 rejection of claim 1, applicant asserts, “The details of the Office Action's analysis are incorrect in several respects. First, for example, the Office Action asserts that the claims' recitations of a "neural network" are nothing more than 'mere instructions to apply the judicial application using generic computer programmed with generic computer equipment." This is wrong. When a controller circuit is configured to implement a neural network, it is no longer a "generic computer," if it ever was. Further, the controller circuit as claimed does substantially more than simply "sum" things or "integrate" things over time. It takes a controller input signal that represents an error in an output of a nonlinear plant and calculates a controller output signal for controlling the nonlinear plant based, in part, on a "second signal generated at least in part by a first neural network estimating an integral over time of the controller input signal." Thus, the claimed controller circuit applies a particular circuit structure (configured to implement a neural network) to carry out certain operations based on an error signal obtained from a nonlinear plant, to generate a control signal for controlling the nonlinear plant. This is clearly not simply "math," calculated using a generic computer, but is an application of that math, using a specific circuit structure to control a specific class of devices” (Remarks Pg.11).
Examiner Response:
The examiner respectfully disagrees. The claim recites "controller circuit, comprising…and a neural network," which is generically recited, as stated in the 101 rejection above. Regarding the "neural network", no details of the neural network or its training are recited; the neural network is recited at a high level of generality and can be constructed by hand with pen and paper. The claimed "neural network", under the broadest reasonable interpretation (BRI), in light of the specification, could be constructed by hand with pen and paper based on a reasonable amount of observed data (i.e., the "input signal"). The neural network is recited at a high level of generality and therefore is being interpreted as performing an abstract idea (mental process) on a generic computer. See MPEP 2106.04(a)(2) § III.C, which states that "a concept that is performed in the human mind and applicant is merely claiming that concept performed 1) on a generic computer, or 2) in a computer environment, or 3) is merely using a computer as a tool to perform the concept" still recites a mental process. Applicant's argument that the controller circuit performs more than summing or integrating signals is acknowledged. However, the claimed steps still amount to mathematical operations performed on input data, because the generation of a control signal is described at a high level of abstraction, and the claim does not specify how the neural network is implemented or how the control technology is improved. The mere recitation of "nonlinear plant" terminology does not confer eligibility. The claim does not recite any particular improvement to the operation of control technology itself, nor does it specify a concrete technological solution to a technical problem. Instead, the claim broadly applies a mathematical model to the result-oriented function of generating a control signal. Therefore, the rejections under 35 U.S.C. 101 are maintained.
Regarding the 35 U.S.C. 101 rejection of claim 1, applicant asserts, “The Applicant notes that the Director of the United States Patent and Trademark Office recently intervened in an appeal to the Patent Trial and Appeal Board, vacating the Board's decision finding certain machine-learning model claims to be "non-statutory subject matter." The Director's Decision on Request for Rehearing ("the Director's Decision"), in appeal 2024-0005671, dated 26 September 2025, expressly found that the challenged claim 1, "when considered as a whole, integrates an abstract idea into a practical application." (Director's Decision p. 10.)
If anything, claim 1 in the application at issue in the Director's Decision is considerably more "generic" than Applicant's claim 1, for several reasons, and yet the Director still found it to be eligible under 35 U.S.C. § 101. Claim 1 of the application at issue in the Director's Decision reads as follows:
1. A computer-implemented method of training a machine learning model having a
plurality of parameters,
wherein the machine learning model has been trained on a first machine learning task
to determine first values of the parameters of the machine learning model, and
wherein the method comprises:
determining, for each of the plurality of parameters, a respective measure of an importance of the parameter to the machine learning model achieving acceptable performance on the first machine learning task; obtaining training data for training the machine learning model on a second, different machine learning task; and
training the machine learning model on the second machine learning task by
training the machine learning model on the training data to adjust the first values of the parameters so that the machine learning model achieves an acceptable level of performance on the second machine learning task while maintaining an acceptable level of performance on the first machine learning task,
wherein, during the training of the machine learning model on the second
machine learning task, values of parameters that were more important in the machine learning model achieving acceptable performance on the first machine learning task are more strongly constrained to not deviate from the first values than values of parameters that were less important in the machine learning model achieving acceptable performance on the first machine learning task.
In this claim, every step is clearly implemented on a computer. What's more, unlike the Applicant's claim, no inputs or outputs tied to any practical application (such as to the control of a nonlinear plant) are specified. Nevertheless, the Director's Decision found this claim to be directed to patent-eligible under 35 U.S.C. § 101 because it is directed to an "improvement to how the machine learning model itself operates, and not, for example, the identified mathematical calculation." (Director's Decision p. 9.)” (Remarks Pg.12-13).
Examiner Response:
The examiner respectfully disagrees. Regarding applicant’s apparent reliance on the decision of the Appeals Review Panel in Ex parte Desjardins, No. 2024-000567 (P.T.A.B. Sept. 26, 2025): in Desjardins, unlike in the claims at issue here, the appellants specifically argued that the claimed invention “address[es] challenges in continual learning and model efficiency by reducing storage requirements and preserving task performance across sequential training”. Desjardins, op. at 7. That is, the appellant in Desjardins specifically alleged that the claimed subject matter improves machine learning itself. By contrast, Applicant in the instant case does not point to any specific claim language that characterizes an improvement, and does not point to any claim language that is analogous to the claims at issue in Desjardins. Moreover, the Director's Decision cited by applicant involved claims directed to training a machine learning model; the eligibility determination there was grounded in a finding that the claims improved machine learning technology, rather than merely applying mathematical calculations. In contrast, the present claims do not recite any improvement to the functioning of a neural network or to machine learning technology itself. The claims merely recite using a neural network to estimate an integral over time and to generate a control signal, without specifying any such improvement; the neural network is used as a tool to perform mathematical processing, rather than being the subject of a technological improvement. Therefore, the rejections under 35 U.S.C. 101 are maintained.
Applicant's arguments filed on 11/25/2025 with respect to 35 U.S.C. 102 rejections of claims have been fully considered but they are not persuasive.
Regarding the 35 U.S.C. 102 rejection of claim 1, applicant asserts, “More particularly, claim 1 specifies that the neural network is configured to calculate a controller output signal by summing a first signal and a second signal, where the first signal depends "on a current value of the controller input signal" and the second signal that is "generated at least in part by a first neural network estimating an integral over time of the controller input signal." Note that claim 1 specifies that the "controller input signal" represents "an error in an output of the nonlinear plant." The Office Action refers to "Page 517 & III. Learning Algorithm of RLLS" in Hwang's disclosure, but does not explain where a sum that involves a "first signal depending on a current value of the controller input signal" and a second sigal that is "generated at least in part by a first neural network estimating an integral over time of the controller input signal" can be found in that disclosure. To be sure, numerous sums are disclosed in that section of Hwang, and the section refers several time to "back-propagating" certain errors, but the section does not appear to disclose the summing of any of those error with a signal that is "generated at least in part by a first neural network estimating an integral over time of the controller input signal," i.e., a signal generated by a neural network that estimates an integral over time of a signal that represents error in the output of a nonlinear plant…Furthermore, while the Office Action states that Hwang's "long-term policy" represents an "integrating over time," Hwang does not actually state that the long-term policy selector integrates anything, much less that it integrates a controller input signal that represents an integral over time of a controller input signal. 
To the extent that the Office Action contends that this is an inherent feature of Hwang's long-term policy selector (or any other component of Hwang's system), the Applicant respectfully requests that the Office Action provide evidence of this” (Remarks Pg.14-16).
Examiner Response:
The examiner respectfully disagrees. As disclosed on page 517 of Hwang, the reinforcement predictor network produces an output signal that is a function of the current state vector, which corresponds to the current controller input signal representing error. This satisfies the claimed first signal that depends on the current value of the controller input signal. Further, Hwang’s learning algorithm of RLLS (Pg. 517, Sections II and III) explicitly discloses an evaluation predictor (EP) that implements the long-term policy, computing a utility index based on successive state transitions. This utility index is updated using both current and future state information, thereby representing an accumulated (i.e., integrated) measure of system performance over time. Since the controller input signal represents error, and the utility index is derived from repeated observation of that signal across time steps, the EP network estimates an integrated effect of the controller input signal over time. Thus, the output of the long-term policy selector corresponds to an integrated measure of system performance over time, even though the term “integral” is not explicitly used. Therefore, the rejections under 35 U.S.C. 102 are maintained.
Applicant's arguments filed on 11/25/2025 with respect to 35 U.S.C. 103 rejections of claims have been fully considered but they are not persuasive.
Regarding the 35 U.S.C. 103 rejection of claims, applicant asserts, “Each of the remaining claims depends from claim 1 and is rejected as allegedly obvious over Hwang in combination with one or more of several additional references. The Office Action does not show that any of these additional references cures the problems discussed above with the rejection of claim 1. The rejections of the dependent claims as allegedly obvious over Hwang should therefore be withdrawn for the same reasons given above” (Remarks Pg. 16).
Examiner Response:
The examiner respectfully disagrees. Each of the remaining claims depends from claim 1, and all of the limitations of the dependent claims are taught by Hwang in view of the various other references, as detailed in the 35 U.S.C. 103 rejections above.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to LOKESHA PATEL whose telephone number is (571)272-6267. The examiner can normally be reached 8 AM - 4 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kamran Afshar can be reached at (571) 272-7796. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/LOKESHA PATEL/Examiner, Art Unit 2125
/KAMRAN AFSHAR/Supervisory Patent Examiner, Art Unit 2125
1 Examiner notes that applicant is apparently referring to Appeals Review Panel in Ex parte Desjardins, No. 2024-000567 (hereinafter “Desjardins”).