Prosecution Insights
Last updated: April 19, 2026
Application No. 18/648,235

MACHINE-LEARNING BASED SOLVER OF COUPLED-PARTIAL DIFFERENTIAL EQUATIONS

Status: Final Rejection (§101)
Filed: Apr 26, 2024
Examiner: STARKS, WILBERT L
Art Unit: 2122
Tech Center: 2100 — Computer Architecture & Software
Assignee: Ansys, Inc.
OA Round: 4 (Final)

Grant Probability: 76% (Favorable)
Expected OA Rounds: 5-6
Expected Time to Grant: 3y 6m
Grant Probability with Interview: 80%

Examiner Intelligence

Career allowance rate: 76% (493 granted / 653 resolved), +20.5% vs. Tech Center average (above average)
Interview lift: +4.4% among resolved cases with interview (a minimal lift)
Typical timeline: 3y 6m average prosecution; 47 applications currently pending
Career history: 700 total applications across all art units

Statute-Specific Performance

§101: 40.3% (+0.3% vs TC avg)
§103: 13.1% (-26.9% vs TC avg)
§102: 35.7% (-4.3% vs TC avg)
§112: 6.0% (-34.0% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 653 resolved cases
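A quick arithmetic check on the figures above (an editorial sanity check, not data from the source): subtracting each "vs TC avg" delta from the examiner's rate should recover the Tech Center baseline that the chart's black line represents.

```python
# Sanity check: examiner rate minus the reported "vs TC avg" delta
# should recover the Tech Center baseline (the chart's black line).
rates = {
    "§101": (40.3, +0.3),
    "§103": (13.1, -26.9),
    "§102": (35.7, -4.3),
    "§112": (6.0, -34.0),
}
baselines = {s: round(rate - delta, 1) for s, (rate, delta) in rates.items()}
print(baselines)  # every statute implies the same 40.0% baseline
```

All four deltas are consistent with a single 40.0% Tech Center baseline, which matches the footnote's description of the black line as one average estimate.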

Office Action

§101
DETAILED ACTION

Claims 1-13 and 15-21 have been examined.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 U.S.C. § 101

35 U.S.C. § 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

The invention, as taught in Claims 1-13 and 15-21, is directed to "mental steps" and "mathematical concepts" without significantly more. The claims recite:
• generating a set of geometry and boundary condition encodings for all variables of a partial differential equation (PDE) (i.e., mathematical steps)
• encoding the set of geometry and boundary conditions as a set of geometry and boundary condition encodings in a latent space (i.e., mathematical steps)
• the set of geometry and boundary conditions are represented with a particular dimension (i.e., mathematical steps)
• the set of geometry and boundary condition encodings are represented in a lower dimension than the particular dimension in the latent space (i.e., mathematical steps)
• training a set of neighborhood generative neural networks for neighboring ones of the subdomains (i.e., mathematical steps)
• the set of neighborhood generative neural networks are trained to encode a concatenation of latent vectors of the separate latent spaces and the set of geometry and boundary condition encodings to generate solutions for the variables of the PDE (i.e., mathematical steps)

Claim 1

Step 1 inquiry: Does this claim fall within a statutory category? The preamble of the claim recites "1. A computer implemented method…" Therefore, it is a "method" (or "process"), which is a statutory category of invention.
Therefore, the answer to the inquiry is: "YES."

Step 2A (Prong One) inquiry: Are there limitations in Claim 1 that recite abstract ideas? YES. The six limitations of Claim 1 enumerated above recite abstract ideas that fall within at least one of the groupings of abstract ideas enumerated in the 2019 PEG; specifically, they are "mental steps" and "mathematical concepts."

Step 2A (Prong Two) inquiry: Are there additional elements or a combination of elements in the claim that apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that it is more than a drafting effort designed to monopolize the exception? Applicant's claims contain no "additional elements." Therefore, the answer to the inquiry is "NO": no additional elements integrate the claimed abstract idea into a practical application.
Step 2B inquiry: Does the claim provide an inventive concept, i.e., does the claim recite additional element(s) or a combination of elements that amount to significantly more than the judicial exception in the claim? Applicant's claims contain no "additional elements." Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See M.P.E.P. § 2106.05(II).) The answer to the inquiry is "NO": no additional elements provide an inventive concept that is significantly more than the claimed abstract ideas.

Claim 1 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 2

Claim 2 recites: 2. The computer implemented method of claim 1, wherein the first set of generative neural networks includes a first generative neural network and a second generative neural network, wherein the first generative neural network is trained using a first set of subdomains contained within the domain, and wherein the second generative neural network is trained using a second set of subdomains contained within the domain.

Applicant's Claim 2 merely teaches a set of trained neural networks (i.e., mathematical steps). It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See M.P.E.P. § 2106.05(a)(II).) Claim 2 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 3

Claim 3 recites: 3. The computer implemented method of claim 2, wherein the first generative neural network is trained at a first resolution, wherein the second generative neural network is trained at a second resolution, and wherein the first resolution and the second resolution are different grid resolutions within the domain.

Applicant's Claim 3 merely teaches a set of trained neural networks (i.e., mathematical steps).
It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See M.P.E.P. § 2106.05(a)(II).) Claim 3 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 4

Claim 4 recites: 4. The computer implemented method as in claim 2, wherein solutions of the PDE by the first generative neural network are generated independently of the solutions of the PDE by the second generative neural network, wherein each subdomain in the first set of subdomains has a first size and each subdomain in the second set of subdomains has a second size that is different than the first size, and wherein the PDE has coupled variables.

Applicant's Claim 4 merely teaches solutions of the PDE (i.e., mathematical steps). It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See M.P.E.P. § 2106.05(a)(II).) Claim 4 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 5

Claim 5 recites: 5. The computer implemented method as in claim 2 further comprising: combining trained solutions from the first generative neural network and the second generative neural network to provide a combined set of trained solutions; and training a third generative neural network to provide solutions for decoupled variables of the PDE based on the combined set of trained solutions.

Applicant's Claim 5 merely teaches combining trained solutions and training a neural network (i.e., mathematical steps). It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See M.P.E.P. § 2106.05(a)(II).) Claim 5 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 6

Claim 6 recites: 6.
The computer implemented method as in claim 5, wherein the third generative neural network is trained in separate latent vector spaces, with a latent vector space for each of the decoupled variables such that a solution for a decoupled variable has its own representation in its own encoded latent vector space.

Applicant's Claim 6 merely teaches a trained neural network (i.e., mathematical steps). It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See M.P.E.P. § 2106.05(a)(II).) Claim 6 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 7

Claim 7 recites: 7. The computer implemented method as in claim 5 further comprising: training a set of neighborhood generative neural networks to learn relationships, in an encoded latent vector space, among neighboring subdomains within the domain, wherein the set of neighborhood generative neural networks comprises a plurality of neighborhood generative neural networks.

Applicant's Claim 7 merely teaches training a set of neural networks (i.e., mathematical steps). It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See M.P.E.P. § 2106.05(a)(II).) Claim 7 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 8

Claim 8 recites: 8.
The computer implemented method as in claim 7 further comprising: classifying each of the subdomains, based upon their position in the domain, into one of three classes: an interior subdomain; a boundary subdomain; and a corner subdomain; and wherein the plurality of neighborhood generative neural networks comprises: an interior subdomain neighborhood generative neural network to train based on data associated with subdomains classified as an interior subdomain; a boundary subdomain neighborhood generative neural network to train based on data associated with subdomains classified as a boundary subdomain; and a corner subdomain neighborhood generative neural network to train based on data associated with subdomains classified as a corner subdomain.

Applicant's Claim 8 merely teaches classifying subdomains and three neural networks (i.e., mathematical steps). It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See M.P.E.P. § 2106.05(a)(II).) Claim 8 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 9

Claim 9 recites: 9. The computer implemented method as in claim 5, wherein all trained solutions from the first set of generative neural networks are combined in a combined set of subdomains, and wherein the combining comprises sampling values from all of the trained solutions.

Applicant's Claim 9 merely teaches sampling values from all of the trained solutions (i.e., mathematical steps). It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See M.P.E.P. § 2106.05(a)(II).) Claim 9 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 10

Claim 10 recites: 10.
The computer implemented method as in claim 9, wherein the third generative neural network comprises: encoder networks that encode PDE solutions to latent space vectors for each of the decoupled variables; and decoder networks that decode latent space vectors to PDE solutions for each of the decoupled variables.

Applicant's Claim 10 merely teaches encoder neural networks and decoder neural networks (i.e., mathematical steps). It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See M.P.E.P. § 2106.05(a)(II).) Claim 10 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 11

Claim 11 recites: 11. The computer implemented method as in claim 10, wherein the decoupled variables are coupled by the set of neighborhood generative neural networks by concatenating their latent vectors in the latent spaces.

Applicant's Claim 11 merely teaches concatenating latent vectors in the latent spaces (i.e., mathematical steps). It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See M.P.E.P. § 2106.05(a)(II).) Claim 11 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 12

Claim 12 recites: 12. The computer implemented method as in claim 11 further comprising: generating a deployable neural network for use in a solver to solve PDEs in simulations of physical systems, the deployable neural network based on results from the first set of generative neural networks, the third generative neural network and the set of neighborhood neural networks.

Applicant's Claim 12 merely teaches generating a neural network (i.e., mathematical steps). It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See M.P.E.P. § 2106.05(a)(II).) Claim 12 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
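For readers unfamiliar with the architecture these claims recite, the per-variable encoder/decoder structure of claims 10-11 and the latent-vector concatenation that couples the variables can be sketched as follows. This is a hypothetical illustration with random, untrained placeholder weights, not code from the application; all names (`make_linear_autoencoder`, `n_latent`, etc.) are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_linear_autoencoder(n_field, n_latent):
    """Toy linear encoder/decoder pair for one decoupled PDE variable.
    Weights are random placeholders standing in for trained networks."""
    W_enc = rng.standard_normal((n_latent, n_field))
    W_dec = rng.standard_normal((n_field, n_latent))
    encode = lambda u: W_enc @ u   # field samples -> lower-dimensional latent vector
    decode = lambda z: W_dec @ z   # latent vector -> field samples
    return encode, decode

n_field, n_latent = 64, 8          # latent space has lower dimension than the field
enc_u, dec_u = make_linear_autoencoder(n_field, n_latent)  # e.g. a velocity variable
enc_p, dec_p = make_linear_autoencoder(n_field, n_latent)  # e.g. a pressure variable

u = rng.standard_normal(n_field)   # solution samples in one subdomain
p = rng.standard_normal(n_field)

# Each decoupled variable gets its own latent representation (claims 6, 10) ...
z_u, z_p = enc_u(u), enc_p(p)
# ... and the variables are "coupled" by concatenating their latent vectors
# before a neighborhood network would consume them (claim 11).
z_coupled = np.concatenate([z_u, z_p])
print(z_coupled.shape)  # (16,)
```

The point of the sketch is only the data flow: two independent latent spaces of dimension 8, joined into one 16-dimensional coupled vector by concatenation.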
Claim 13

Step 1 inquiry: Does this claim fall within a statutory category? The preamble of the claim recites "A non-transitory computer readable medium…" Therefore, it is a "computer readable medium" (or "product of manufacture"), which is a statutory category of invention. Therefore, the answer to the inquiry is: "YES."

Step 2A (Prong One) inquiry: Are there limitations in Claim 13 that recite abstract ideas? YES. The following limitations in Claim 13 recite abstract ideas that fall within at least one of the groupings of abstract ideas enumerated in the 2019 PEG. Specifically, they are "mental steps" and "mathematical concepts":
• training a generative neural network by a set of training generative neural networks that operate at different resolutions in a solution space (i.e., mathematical concepts)
• geometry data that specifies a domain (i.e., mathematical concepts)
• boundary conditions (i.e., mathematical concepts)
• solutions of a partial differential equation (PDE) (i.e., mathematical concepts)
• initializing the domain with an initial solution for each variable in the PDE (i.e., mathematical steps)
• encoding a solution for each variable in a latent vector space (i.e., mathematical steps)
• a trained encoder neural network of a generative neural network (i.e., mathematical steps)
• updating the encoded solution
• trained neighborhood generative neural network

Step 2A (Prong Two) inquiry: Are there additional elements or a combination of elements in the claim that apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that it is more than a drafting effort designed to monopolize the exception?

Applicant's claims contain the following "additional elements":
(1) A "receiving" of "geometry data"
(2) A "receiving" of "one or more boundary conditions"

A "receiving" of "geometry data" is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(I)(2) recites in part:

2.
A factual determination is required to support a conclusion that an additional element (or combination of additional elements) is well-understood, routine, conventional activity. Berkheimer v. HP, Inc., 881 F.3d 1360, 1368, 125 USPQ2d 1649, 1654 (Fed. Cir. 2018). However, this does not mean that a prior art search is necessary to resolve this inquiry. Instead, examiners should rely on what the courts have recognized, or those in the art would recognize, as elements that are well-understood, routine, conventional activity in the relevant field when making the required determination. For example, in many instances, the specification of the application may indicate that additional elements are well-known or conventional. See, e.g., Intellectual Ventures v. Symantec, 838 F.3d at 1317; 120 USPQ2d at 1359 ("The written description is particularly useful in determining what is well-known or conventional"); Internet Patents Corp. v. Active Network, Inc., 790 F.3d 1343, 1348, 115 USPQ2d 1414, 1418 (Fed. Cir. 2015) (relying on specification’s description of additional elements as "well-known", "common" and "conventional"); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 614, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (Specification described additional elements as "either performing basic computer functions such as sending and receiving data, or performing functions ‘known’ in the art."). Further, M.P.E.P. § 2106.05(d)(II) recites: The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity. i. Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 
2016) (using a telephone for image transmission); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); …

Merely using a conventional computer to receive data is well-understood, routine, and conventional. Thus, it adds nothing significantly more to the judicial exception. This "receiving" of "geometry data" limitation does not integrate the additional element into a practical application and represents "insignificant extra-solution activity." (See M.P.E.P. § 2106.05(I)(A).)

A "receiving" of "one or more boundary conditions" is likewise a broad term described at a high level of generality. For the same reasons, under the passages of M.P.E.P. § 2106.05(d)(I)(2) and § 2106.05(d)(II) quoted above, this limitation does not integrate the additional element into a practical application and represents "insignificant extra-solution activity." (See M.P.E.P. § 2106.05(I)(A).)

The answer to the inquiry is "NO": no additional elements integrate the claimed abstract idea into a practical application.

Step 2B inquiry: Does the claim provide an inventive concept, i.e., does the claim recite additional element(s) or a combination of elements that amount to significantly more than the judicial exception in the claim?

Applicant's claims contain the same "additional elements": (1) a "receiving" of "geometry data" and (2) a "receiving" of "one or more boundary conditions." For the reasons given under the Step 2A (Prong Two) inquiry, and on the authority of the same passages of M.P.E.P. § 2106.05(d)(I)(2) and § 2106.05(d)(II), each "receiving" limitation is well-understood, routine, conventional activity claimed at a high level of generality and adds nothing significantly more to the judicial exception.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See M.P.E.P. § 2106.05(II).) The answer to the inquiry is "NO": no additional elements provide an inventive concept that is significantly more than the claimed abstract ideas.

Claim 13 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 15

Claim 15 recites: 15. The medium as in claim 13, wherein the method further comprises: dividing the domain into the subdomains that are classified into different classes of subdomains based on their positions in the domain.

Applicant's Claim 15 merely teaches dividing the domain into the subdomains (i.e., mathematical steps). It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See M.P.E.P. § 2106.05(a)(II).) Claim 15 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 16

Claim 16 recites: 16.
The medium as in claim 13, wherein an encoder encodes, during the encoding, a current solution in each subdomain for each variable in the PDE in the latent vector space that is decoupled, from the other variables, in the PDE; and wherein the solving, after the encoding, comprises: coupling the PDE variables by concatenating their latent vectors derived from the encoding in each subdomain; determining, by a set of neighborhood generative neural networks, a next encoded solution for each variable in the PDE in the latent vector space; uncoupling, after the next encoded solutions have been determined, the PDE variables and enforcing the boundary conditions and geometry encodings in the latent vector space; iterating, until convergence of a solution of the PDE is achieved based upon a convergence criterion, the operations of coupling, determining, and uncoupling; generating, by a trained decoder neural network of the generative neural network, solutions from converged encoded solutions in the latent vector space in each subdomain and deriving, from network connectivity data for subdomains of the domain, final solutions for all variables of the PDE over the domain.

Applicant's Claim 16 merely teaches encoding a current solution in each subdomain for each variable in the PDE, concatenating latent vectors derived from the encoding in each subdomain, determining a next encoded solution, uncoupling the PDE variables, enforcing the boundary conditions and geometry encodings in the latent vector space, iterating the operations of coupling, determining, and uncoupling, and generating solutions (i.e., mathematical steps). It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See M.P.E.P. § 2106.05(a)(II).) Claim 16 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 17

Step 1 inquiry: Does this claim fall within a statutory category?
The preamble of the claim recites "A non-transitory computer readable medium…" Therefore, it is a "computer readable medium" (or "product of manufacture"), which is a statutory category of invention. Therefore, the answer to the inquiry is: "YES."

Step 2A (Prong One) inquiry: Are there limitations in Claim 17 that recite abstract ideas? YES. Claim 17 recites the same six limitations identified above with respect to Claim 1 (generating and encoding the geometry and boundary condition encodings in a lower-dimensional latent space, and training neighborhood generative neural networks to encode a concatenation of latent vectors), each of which recites abstract ideas that fall within at least one of the groupings of abstract ideas enumerated in the 2019 PEG. Specifically, they are "mental steps" and "mathematical concepts."

Step 2A (Prong Two) inquiry: Are there additional elements or a combination of elements in the claim that apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that it is more than a drafting effort designed to monopolize the exception?
Applicant's claims contain no "additional elements." Therefore, the answer to the inquiry is "NO": no additional elements integrate the claimed abstract idea into a practical application.

Step 2B inquiry: Does the claim provide an inventive concept, i.e., does the claim recite additional element(s) or a combination of elements that amount to significantly more than the judicial exception in the claim? Applicant's claims contain no "additional elements." Therefore, the answer to the inquiry is "NO": no additional elements provide an inventive concept that is significantly more than the claimed abstract ideas.

Claim 17 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 18

Claim 18 recites: 18. The medium as in claim 17, wherein the first set of generative neural networks includes a first generative neural network and a second generative neural network, wherein the first generative neural network is trained using a first set of subdomains contained within the domain, and wherein the second generative neural network is trained using a second set of subdomains contained within the domain.

Applicant's Claim 18 merely teaches a set of trained neural networks (i.e., mathematical steps). It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See M.P.E.P. § 2106.05(a)(II).) Claim 18 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 19

Claim 19 recites: 19.
The medium of claim 18, wherein the first generative neural network is trained at a first resolution, wherein the second generative neural network is trained at a second resolution, and wherein the first resolution and the second resolution are different grid resolutions within the domain. Applicant’s Claim 19 merely teaches a set of trained neural networks (i.e., mathematical steps). It does not integrate the abstract idea to a practical application, nor is it anything significantly more than the abstract idea. (See, 2106.05(a)(II).) Claim 19 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101. Claim 20 Claim 20 recites: 20. The medium as in claim 18, wherein solutions of the PDE by the first generative neural network are generated independently of the solutions of the PDE by the second generative neural network, wherein each subdomain in the first set of subdomains has a first size and each subdomain in the second set of subdomains has a second size that is different than the first size, and wherein the PDE has coupled variables. Applicant’s Claim 20 merely teaches solutions of the PDE (i.e., mathematical steps). It does not integrate the abstract idea to a practical application, nor is it anything significantly more than the abstract idea. (See, 2106.05(a)(II).) Claim 20 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101. Claim 21 Claim 21 recites: 21. The medium as in claim 17, wherein the method further comprises: combining trained solutions from the first generative neural network and the second generative neural network to provide a combined set of trained solutions; and training a third generative neural network to provide solutions for decoupled variables of the PDE based on the combined set of trained solutions. Applicant’s Claim 21 merely teaches combining trained solutions and training a neural network (i.e., mathematical steps). 
It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See MPEP § 2106.05(a)(II).) Claim 21 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Reasons Why the Claims Are Not Rejected Under Prior Art

The claims are not rejected over the prior art because the closest prior art, Joshi et al., Generative Models for Solving Nonlinear Partial Differential Equations, Proceedings of NeurIPS Workshop on Machine Learning for Physics, 2019, pp. 1-6, fails to expressly teach:

• "neighborhood generative neural networks"
• "concatenation of latent vectors"

Response to Arguments

Applicant's arguments filed 02 FEB 2026 have been fully considered but they are not persuasive. Specifically, Applicant argues:

Argument 1

Under Step 2A (Prong One), and as previously stated, the claimed methods are not a mental process. Nor are the claims merely "mathematical in nature." The claim language and specification make clear that the focus is a computer-implemented training and deployment pipeline that executes on processors and memory to train multiple neural networks in latent spaces, concatenate latent vectors across subdomain neighborhoods, and produce trained weights. Such operations require computer execution and are not practically performable in the human mind. More particularly, training and operating multiple neural networks in latent spaces, concatenating latent vectors across subdomain neighborhoods, and generating a deployable trained-weights model require processor execution and memory operations. For example, the present application describes a data-processing system architecture with processors, volatile and non-volatile memory, buses, and I/O controllers that execute a solver and store trained weights, and explains that the system manipulates and transforms data represented as physical (electronic) quantities within device registers and memories. (See US 2024/0273892 A1, e.g., paragraphs [0049]-[0051] and [0056], and FIG. 7.) Applicant previously cited USPTO Example 39 to explain that training neural networks of the recited scope cannot practically be performed in the human mind. The Office Action responds by re-labeling the claim limitations as "mathematical" rather than addressing their non-mental character.

There are mathematical steps in the claims. The following is a non-exhaustive list of those abstract ideas:

• generating a set of geometry and boundary condition encodings for all variables of a partial differential equation (PDE) (i.e., mathematical steps)
• encoding the set of geometry and boundary conditions as a set of geometry and boundary condition encodings in a latent space (i.e., mathematical steps)
• the set of geometry and boundary conditions are represented with a particular dimension (i.e., mathematical steps)
• the set of geometry and boundary condition encodings are represented in a lower dimension than the particular dimension in the latent space (i.e., mathematical steps)

Applicant's argument is unpersuasive. The rejections stand.

Argument 2

Whereas the rejection appears to be based on the interpretation of the claim limitations as being "mathematical steps" or "mathematical in nature" (see Office Action, e.g., pages 4 and 34-36), the amendments clarify that the claimed methods and products of manufacture produce a concrete computer artifact. More particularly, the claims recite generating a deployable model including trained weights that is generated for storage in a non-transitory computer readable medium for use in simulations of a physical object. This deployable artifact, together with the recited latent-space encodings, decoupling into separate latent spaces via several generative neural networks, and neighborhood generative neural networks trained to encode concatenations of latent vectors with geometry and boundary condition encodings, integrates any alleged mathematical concepts into a specific practical application.
Accordingly, for the above reasons, the claim limitations as a whole do not recite abstract ideas.

There are mathematical steps in the claims. The following is a non-exhaustive list of those abstract ideas:

• generating a set of geometry and boundary condition encodings for all variables of a partial differential equation (PDE) (i.e., mathematical steps)
• encoding the set of geometry and boundary conditions as a set of geometry and boundary condition encodings in a latent space (i.e., mathematical steps)
• the set of geometry and boundary conditions are represented with a particular dimension (i.e., mathematical steps)
• the set of geometry and boundary condition encodings are represented in a lower dimension than the particular dimension in the latent space (i.e., mathematical steps)

Applicant's argument is unpersuasive. The rejections stand.

Argument 3

Under Step 2A (Prong Two), the claimed architecture is non-conventional and field-specific. Even to the extent mathematics underlies PDE models, the claims integrate any alleged abstract concepts into a specific practical application: a field-specific solver pipeline that yields an executable, deployable artifact used in simulations of physical systems. More particularly, the claims recite a particular solver process flow tailored to simulations governed by coupled PDEs, not a generic data-processing flow. In particular, (i) geometry and boundary conditions are encoded into lower-dimensional latent vectors used by a solver network (see US 2024/0273892 A1, e.g., paragraphs [0045]-[0046] and FIG. 5C); (ii) coupled PDE variables are explicitly decoupled into separate latent spaces and learned by several generative neural networks (see US 2024/0273892 A1, e.g., paragraphs [0041]-[0042] and FIG. 4B); and (iii) neighborhood generative neural networks operate on concatenated latent vectors across subdomain neighbors together with geometry/boundary encodings to enforce local solution consistency at subdomain interfaces (see US 2024/0273892 A1, e.g., paragraphs [0043]-[0044] and FIG. 5B). These elements are recited as part of an integrated process flow that culminates in a deployable artifact, namely trained weights for both the several generative neural networks and the neighborhood generative neural networks, generated for storage in a non-transitory computer-readable medium for use in simulations of a physical object. (See US 2024/0273892 A1, e.g., paragraphs [0009], [0034], [0047] and FIGS. 2B and 6.) The process flow includes concrete, field-specific steps tied to a solution grid and runtime workflow. Accordingly, the combination of elements in the claim integrates any claimed abstract idea into a practical application.

“Deployable” does not mean “deployed.” The argued “process flow” is a series of mathematical steps. No practical application is recited in the claims. Applicant's argument is unpersuasive. The rejections stand.

Argument 4

Under Step 2B, the ordered combination amounts to significantly more than any alleged abstract idea. The claims do not merely contain the additional elements of "receiving geometry data" or "receiving one or more boundary conditions." (See Office Action, e.g., page 16.) Instead, the claims recite a non-conventional arrangement of elements: dimensionality-reduced geometry/boundary encodings by pretrained encoders; separate latent spaces for decoupled variables learned by distinct generative neural networks; neighborhood generative neural networks operating on concatenated latent vectors with geometry/boundary encodings to enforce local continuity; and generation of a deployable trained-weights model stored in memory for use in simulations. (See US 2024/0273892 A1, e.g., paragraphs [0034] and [0042]-[0047], and FIGS. 4B and 5B.)
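As editorial context for the "concatenated latent vectors across subdomain neighbors" arrangement described above, the following is a minimal hypothetical sketch. All names (`z_left`, `z_center`, `z_right`, `gb_encoding`) and dimensions are invented for illustration, and a single random linear layer stands in for a trained neighborhood generative network; this is not the application's implementation.

```python
import random

random.seed(1)

LATENT_DIM = 8  # per-subdomain latent dimension (hypothetical)
GB_DIM = 4      # geometry/boundary-condition encoding dimension (hypothetical)

def rand_vec(n):
    return [random.gauss(0.0, 1.0) for _ in range(n)]

# Latent vectors for a subdomain and its two neighbors, plus the
# geometry/boundary-condition encoding for that neighborhood.
z_left = rand_vec(LATENT_DIM)
z_center = rand_vec(LATENT_DIM)
z_right = rand_vec(LATENT_DIM)
gb_encoding = rand_vec(GB_DIM)

# The neighborhood network's input is the concatenation of the neighboring
# subdomains' latent vectors with the geometry/boundary-condition encoding.
x = z_left + z_center + z_right + gb_encoding

# One random linear layer stands in for the trained neighborhood generative
# network mapping this input to a local solution patch on the subdomain.
PATCH_DIM = 16
W = [[random.gauss(0.0, 1.0 / len(x) ** 0.5) for _ in range(len(x))]
     for _ in range(PATCH_DIM)]
solution_patch = [sum(w * v for w, v in zip(row, x)) for row in W]
print(len(x), len(solution_patch))  # 28 16
```

Feeding the same neighboring latents into each subdomain's network is what lets such an architecture encourage consistency at subdomain interfaces.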
There is no record evidence that such a claimed arrangement was well-understood, routine, and conventional. Accordingly, the claims provide an inventive concept. For the reasons above, the claims are anchored to concrete computer operations and storage, not mere mathematical "steps" or abstract ideas that are "mathematical in nature." The dimensionality-reduced geometry/boundary encodings, per-variable latent-space decoupling via several generative neural networks, neighborhood generative neural networks enforcing local consistency through concatenated latent vectors and encodings, and the generation for storage of a deployable trained-weights model together provide a practical application and amount to significantly more than any alleged abstract idea. Accordingly, Applicant respectfully requests withdrawal of the rejection.

Applicant argues that the following are “nonconventional”; however, each is abstract:

• dimensionality-reduced geometry/boundary encodings by pretrained encoders (i.e., mathematical steps)
• separate latent spaces for decoupled variables learned by distinct generative neural networks (i.e., mathematical steps)
• neighborhood generative neural networks (trained) operating on concatenated latent vectors with geometry/boundary encodings to enforce local continuity (i.e., mathematical steps)
• generation of a deployable trained-weights model stored in memory for use in simulations (i.e., mathematical steps…the method of “generation” is completely unspecified, and the broadest reasonable interpretation includes mathematical steps)

Applicant's argument is unpersuasive. The rejections stand.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiries concerning this communication or earlier communications from the examiner should be directed to Wilbert L. Starks, Jr., who may be reached Monday through Friday, between 8:00 a.m. and 5:00 p.m. (EST), by telephone at (571) 272-3691 or by email at Wilbert.Starks@uspto.gov. If you need to send an official facsimile transmission, please send it to (571) 273-8300. If attempts to reach the examiner are unsuccessful, the Examiner’s Supervisor (SPE), Kakali Chaki, may be reached at (571) 272-3719. Hand-delivered responses should be delivered to the Receptionist at the Customer Service Window, Randolph Building, 401 Dulany Street, Alexandria, VA 22313, located on the first floor of the south side of the Randolph Building.

Finally, information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR; status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have any questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) toll-free at 1-866-217-9197.

/WILBERT L STARKS/
Primary Examiner, Art Unit 2122
WLS
31 MAR 2026

Prosecution Timeline

Apr 26, 2024
Application Filed
Jan 10, 2025
Non-Final Rejection — §101
Apr 14, 2025
Response Filed
Jun 18, 2025
Final Rejection — §101
Sep 23, 2025
Request for Continued Examination
Sep 29, 2025
Response after Non-Final Action
Sep 29, 2025
Non-Final Rejection — §101
Jan 16, 2026
Examiner Interview Summary
Jan 16, 2026
Applicant Interview (Telephonic)
Feb 02, 2026
Response Filed
Mar 31, 2026
Final Rejection — §101 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12561587
DATA PROCESSING METHOD, ELECTRONIC DEVICE, AND STORAGE MEDIUM
2y 5m to grant Granted Feb 24, 2026
Patent 12555007
METHOD AND SYSTEM FOR INFERRING DEVICE FINGERPRINT
2y 5m to grant Granted Feb 17, 2026
Patent 12541694
GENERATING A DOMAIN-SPECIFIC KNOWLEDGE GRAPH FROM UNSTRUCTURED COMPUTER TEXT
2y 5m to grant Granted Feb 03, 2026
Patent 12525251
METHOD, SYSTEM AND PROGRAM PRODUCT FOR PERCEIVING AND COMPUTING EMOTIONS
2y 5m to grant Granted Jan 13, 2026
Patent 12518149
IMPLICIT VECTOR CONCATENATION WITHIN 2D MESH ROUTING
2y 5m to grant Granted Jan 06, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
76%
Grant Probability
80%
With Interview (+4.4%)
3y 6m
Median Time to Grant
High
PTA Risk
Based on 653 resolved cases by this examiner. Grant probability derived from career allow rate.
