Prosecution Insights
Last updated: April 19, 2026
Application No. 17/963,814

FOUNDATION MODEL BASED FLUID SIMULATIONS

Non-Final OA — §103, §112

Filed: Oct 11, 2022
Examiner: WECHSELBERGER, ALFRED H.
Art Unit: 2187
Tech Center: 2100 — Computer Architecture & Software
Assignee: Deep Forest Sciences Inc.
OA Round: 1 (Non-Final)
Grant Probability: 58% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 3y 8m
With Interview: 94%

Examiner Intelligence

Career Allow Rate: 58% (122 granted / 212 resolved; +2.5% vs TC avg)
Interview Lift: +36.5% (strong; based on resolved cases with interview)
Avg Prosecution: 3y 8m (typical timeline; 42 currently pending)
Total Applications: 254 (career history, across all art units)
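The headline figures above are simple ratios; a quick sketch (assuming the "With Interview" figure is the career allow rate plus the additive +36.5% interview lift, which matches the displayed numbers) shows how they relate:

```python
# Career allow rate from the examiner's resolved docket
granted, resolved = 122, 212
allow_rate = granted / resolved          # ~0.575, displayed as 58%

# Assumption: the dashboard's "With Interview" probability is the
# career allow rate plus the interview lift, applied additively.
interview_lift = 0.365
with_interview = allow_rate + interview_lift   # ~0.940, displayed as 94%

print(round(allow_rate * 100))      # 58
print(round(with_interview * 100))  # 94
```

Whether the lift is truly additive (versus computed on the interviewed subset only) is an assumption; the additive reading reproduces the displayed 94%.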

Statute-Specific Performance

§101: 30.0% (-10.0% vs TC avg)
§103: 38.9% (-1.1% vs TC avg)
§102: 3.8% (-36.2% vs TC avg)
§112: 24.0% (-16.0% vs TC avg)
Deltas are relative to the Tech Center average (estimate) • Based on career data from 212 resolved cases
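The "vs TC avg" deltas let you back out the Tech Center baseline each figure is compared against (assuming each delta is a simple difference, rate minus baseline); doing the arithmetic suggests a common ~40% baseline across all four statutes:

```python
# Per-statute figures from the chart: (examiner rate %, delta vs TC avg %)
stats = {
    "§101": (30.0, -10.0),
    "§103": (38.9, -1.1),
    "§102": (3.8, -36.2),
    "§112": (24.0, -16.0),
}

# Implied baseline for each pair: TC average = rate - delta
for statute, (rate, delta) in stats.items():
    print(statute, round(rate - delta, 1))  # every statute implies a 40.0% baseline
```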

Office Action

DETAILED ACTION

Claims 1 – 20 have been presented for examination. This office action is in response to submission of the application on 10/11/2022. Claims 1 – 20 have been considered under the “2019 Revised Patent Subject Matter Eligibility Guidance,” 84 Fed. Reg. 50 (January 7, 2019), and the instant claims are viewed as not reciting an abstract idea under step 2A(i). Specifically, the “deploy” and “reconfigure” steps amount to executing a foundation model in a pipeline, or to modifying an already deployed foundation model on computing infrastructure, which cannot reasonably be performed in the mind, nor do the claims explicitly recite any mathematical calculations. Accordingly, the claims are deemed eligible under 35 U.S.C. 101.

Priority

The later-filed application must be an application for a patent for an invention which is also disclosed in the prior application (the parent or original nonprovisional application or provisional application). The disclosure of the invention in the parent application and in the later-filed application must be sufficient to comply with the requirements of 35 U.S.C. 112(a) or the first paragraph of pre-AIA 35 U.S.C. 112, except for the best mode requirement. See Transco Products, Inc. v. Performance Contracting, Inc., 38 F.3d 551, 32 USPQ2d 1077 (Fed. Cir. 1994).

The disclosure of the prior-filed application, Application No. 63/254309, fails to provide adequate support or enablement in the manner provided by 35 U.S.C. 112(a) or pre-AIA 35 U.S.C. 112, first paragraph, for one or more claims of this application since there is no explicit disclosure of the recited “deploy the received fluid foundation model into a downstream machine learning pipeline for a fluid dynamics application; reconfigure the fluid foundation model for the fluid dynamics application”. Specifically, Application No. 
63/254309 discloses using a pretrained fluid or acoustic foundation model for downstream use cases (see Figure 1), but it does not explicitly disclose usage in a machine learning pipeline. Therefore, the priority date is 10/11/2022.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 
112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;

(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and

(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function. Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. 
Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Specifically, claim 20 recites means for “receiving” and “deploying” and “reconfiguring” and “outputting”. A review of the specification shows that the various means comprise general purpose computer elements (see the instant application Paragraphs 60 – 61).

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

Claim 9 recites the limitation "the inductive priors". There is insufficient antecedent basis for this limitation in the claim since it depends from claim 1; however, “a inductive priors” is recited in claim 8. The limitation is interpreted for examination purposes as having proper antecedent basis.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 
103 are summarized as follows:

Determining the scope and contents of the prior art.
Ascertaining the differences between the prior art and the claims at issue.
Resolving the level of ordinary skill in the pertinent art.
Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1 – 2, 8, 14, 17 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over CN110348059 (henceforth “‘059”) in view of Johnson et al. (US 2019/0129764) (henceforth “Johnson (764)”). ‘059 and Johnson (764) are analogous art because they solve the same problem of simulating fluid behavior, and because they are from the same field of endeavor of simulation of physical systems.

With regard to claim 1, ‘059 teaches: receive a fluid foundation model that is pretrained on fluid data (Page 6, Top: part of the network parameters previously completed by training is used as a pre-training model; reconstructing geometric parameters, boundary conditions and fluid parameters known from a flow heat transfer model including a temperature field, a pressure field and a velocity field); reconfigure the fluid foundation model for the fluid dynamics application (Page 6, Top: the optimizer adopts the SGD gradient descent algorithm in the training process, the initial learning rate is set to 0.001, and then the learning rate is attenuated every 100 steps to 1/10 of the original learning rate); and output results from the machine learning pipeline for the fluid dynamics application based on the reconfigured fluid foundation model (Page 7, Bottom: the network adopts a ResNet network architecture symmetric with the G Net network architecture, with a total of 18 layers, and the final full connection layer output is changed to 2, and the specific network structure is shown in FIG; the mutual game between the G Net generation network and the D Net resolution network is successfully obtained to reconstruct the flow field). 
‘059 does not appear to explicitly disclose: a processor; and a memory that stores code executable by the processor; deploy the received fluid foundation model into a downstream machine learning pipeline for a fluid dynamics application.

However, Johnson (764) teaches: a processor; and a memory that stores code executable by the processor to (Paragraph 87: instructions are preferably executed by computer-executable components preferably integrated with the system and one or more portions of the processors and/or the controllers. The computer-readable medium can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory); receive a foundation model that is pretrained on data (Paragraph 6: subscriber machine learning model includes a machine learning model that is external to the machine learning tuning service and implemented by a subscriber to the machine learning tuning service); deploy the received fluid foundation model into a downstream machine learning pipeline for a fluid dynamics application (Paragraphs 6 and 60 – 62: successful enrollment, S205 may function to provide or transmit an API key to the user. 
The API key provided by S205 may function to enable the user to interact with the API account as well as implement one or more functionalities of the intelligent API and also, generate API requests and access one or more web resources via the intelligent API; initializes an operation of each of a plurality of distinct tuning worker instances of the machine learning tuning service that execute distinct tuning tasks for tuning the hyperparameters of the subscriber machine learning model according to the plurality of tuning parameters) reconfigure the foundation model for the application (Paragraph 6 and 62 initializes an operation of each of a plurality of distinct tuning worker instances of the machine learning tuning service that execute distinct tuning tasks for tuning the hyperparameters of the subscriber machine learning model according to the plurality of tuning parameters; may function to use the first API function to create or build an optimization work request that sets a plurality of tuning parameters for tuning or optimizing hyperparameters of a machine learning model (e.g., external model) of a subscriber to the one or more services of the intelligent optimization platform) output results from the machine learning pipeline for the reconfigured model (Paragraph 75 execute the second API call function to instantiate a surrogate machine learning model for the external machine learning model of a subscriber to the services of the platform; may function to use the surrogate machine learning model to test a performance of the plurality of raw values of the hyperparameters based on input into a structure of the surrogate model of the plurality of raw values for each of the hyperparameters of the external machine learning model). It would have been obvious to one of ordinary skill in the art to combine the machine learning pipeline disclosed by ‘059 with the computer system disclosed by Johnson (764). 
One of ordinary skill in the art would have been motivated to make this modification in order to enable computerized processing of the algorithm (Johnson (764) Abstract).

With regard to claim 14, it recites the same steps as claim 1, which is taught by ‘059 in view of Johnson (764). Claim 14 further recites: a computer program product comprising executable program code stored on a non-transitory computer readable storage medium, the executable program code executable by a processor to perform operations, the operations comprising the steps of claim 1. Johnson (764) teaches: a computer program product comprising executable program code stored on a non-transitory computer readable storage medium, the executable program code executable by a processor to perform operations (Paragraph 87: instructions are preferably executed by computer-executable components preferably integrated with the system and one or more portions of the processors and/or the controllers. The computer-readable medium can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory). It would have been obvious to one of ordinary skill in the art to combine the machine learning pipeline disclosed by ‘059 with the computer system disclosed by Johnson (764). One of ordinary skill in the art would have been motivated to make this modification in order to enable computerized processing of the algorithm (Johnson (764) Abstract).

With regard to claim 20, it recites the same steps as claim 1, which is taught by ‘059 in view of Johnson (764). Claim 20 further recites: means for the steps (see Claim Interpretation). Johnson (764) teaches: means for the steps (see Claim Interpretation) (Paragraph 87: instructions are preferably executed by computer-executable components preferably integrated with the system and one or more portions of the processors and/or the controllers. The computer-readable medium can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory). 
It would have been obvious to one of ordinary skill in the art to combine the machine learning pipeline disclosed by ‘059 with the computer system disclosed by Johnson (764). One of ordinary skill in the art would have been motivated to make this modification in order to enable computerized processing of the algorithm (Johnson (764) Abstract).

With regard to claim 2, ‘059 in view of Johnson (764) teaches all the elements of the parent claim 1, and further teaches: wherein the fluid data comprises computational fluid data, experimental fluid data, or some combination thereof (‘059 Page 6, Middle: the experimental method can only use the infrared imaging to obtain the temperature field information and the experimental equipment is expensive, and the invention can retain the complete model without constructing the actual model).

With regard to claims 8 and 17, ‘059 in view of Johnson (764) teaches all the elements of the parent claims 1 and 14, and further teaches: wherein the code is executable by the processor to pretrain the fluid foundation model by adding inductive priors during pretraining, the inductive priors comprising one or more physical constraints associated with fluids (‘059 Page 6, Bottom to Page 7, Top: fluid parameter information (nanofluid volume fraction), Inputn,i known information data for fluid channels under certain conditions).

Claims 3 – 6 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over ‘059 in view of Johnson (764), and further in view of US 2005/0099420 (henceforth “Hoppe (420)”). ‘059, Johnson (764) and Hoppe (420) are analogous art because they are from the same field of endeavor of simulation of physical systems.

With regard to claims 3 and 15, ‘059 in view of Johnson (764) teaches all the elements of the parent claims 1 and 14, and further teaches: the fluid data comprises fluid mesh data describing one or more meshes (Page 7: the mesh node information is derived, and the initial flow field data Fieldon is obtained. 
j, g and grid node position information).

‘059 in view of Johnson (764) does not appear to explicitly disclose: wherein the fluid data comprises three-dimensional fluid mesh data describing one or more meshes that have areas of differing resolutions. However, Hoppe (420) teaches: wherein the fluid data comprises three-dimensional fluid mesh data describing one or more meshes that have areas of differing resolutions (Paragraphs 53, 77 and 114: the triangle faces fl and fr disappear after the collapse is completed, and the connectivity for faces fn0, fn1, fn2, and fn3 is readjusted. Splits are used to define multi-resolution hierarchies for arbitrary meshes. The vertex hierarchy (FIG. 6) is constructed from a geometrically optimized sequence of edge collapse/splits in a progressive mesh representation. As shown in FIG. 3, vu and vi are collapsed into vertex vs.). It would have been obvious to one of ordinary skill in the art to combine the machine learning pipeline disclosed by ‘059 in view of Johnson (764) with the mesh data disclosed by Hoppe (420). One of ordinary skill in the art would have been motivated to make this modification in order to improve the efficiency of the system (Hoppe (420) Abstract).

With regard to claim 4, ‘059 in view of Johnson (764), and further in view of Hoppe (420), teaches all the elements of the parent claims 3 and 14, and further teaches: wherein the code is executable by the processor to assign weights of different importance to the areas of the one or more meshes during pretraining of the fluid foundation model (‘059 Pages 6 and 8: where wi is the weight of the reconstructed flow field data and the original flow field data absolute value at each mesh node, and the weight may be appropriately increased at the boundary layer mesh to improve the accuracy at the boundary layer); 
fluid data that comprises three-dimensional fluid mesh data describing one or more meshes that have areas of differing resolutions (Paragraphs 53, 77 and 114: the triangle faces fl and fr disappear after the collapse is completed, and the connectivity for faces fn0, fn1, fn2, and fn3 is readjusted. Splits are used to define multi-resolution hierarchies for arbitrary meshes. The vertex hierarchy (FIG. 6) is constructed from a geometrically optimized sequence of edge collapse/splits in a progressive mesh representation. As shown in FIG. 3, vu and vi are collapsed into vertex vs.). It would have been obvious to one of ordinary skill in the art to combine the machine learning pipeline disclosed by ‘059 in view of Johnson (764) with the mesh data disclosed by Hoppe (420). One of ordinary skill in the art would have been motivated to make this modification in order to improve the efficiency of the system (Hoppe (420) Abstract).

With regard to claim 5, ‘059 in view of Johnson (764), and further in view of Hoppe (420), teaches all the elements of the parent claim 3, and further teaches: wherein the three-dimensional fluid mesh data comprises a vector field describing a velocity and a density of a fluid at each point of the mesh (‘059 Pages 6 and 8: boundary condition information (Reynolds number, wall heat flux density); velocity).

With regard to claim 6, ‘059 in view of Johnson (764), and further in view of Hoppe (420), teaches all the elements of the parent claim 3, and further teaches: wherein the three-dimensional fluid mesh data comprises mesh data for one or more meshes that have dynamic resolutions (Hoppe (420) Paragraphs 52 and 76 – 78: updating meshes is performed dynamically as viewing parameters change, and is referred to herein as view-dependent level-of-detail (LOD) control. The general issue is to locally adjust the complexity of the approximating mesh to satisfy a screen-space pixel tolerance). 
It would have been obvious to one of ordinary skill in the art to combine the machine learning pipeline disclosed by ‘059 in view of Johnson (764) with the mesh data disclosed by Hoppe (420). One of ordinary skill in the art would have been motivated to make this modification in order to improve the efficiency of the system (Hoppe (420) Abstract).

Claims 7 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over ‘059 in view of Johnson (764), and further in view of Falahatpisheh et al. (US 10345132) (henceforth “Falahatpisheh (132)”). ‘059, Johnson (764) and Falahatpisheh (132) are analogous art because they solve the same problem of simulating fluid behavior, and because they are from the same field of endeavor of simulation of physical systems.

With regard to claims 7 and 16, ‘059 in view of Johnson (764) teaches all the elements of the parent claims 1 and 14, and further teaches: wherein the code is executable by the processor to pretrain the fluid foundation model by processing the fluid data (‘059 Page 7). ‘059 in view of Johnson (764) does not appear to explicitly disclose: wherein the code is executable by the processor to pretrain the fluid foundation model using image processing algorithms by processing the fluid data as image data. However, Falahatpisheh (132) teaches: code is executable by the processor to pretrain a model using image processing algorithms by processing the fluid data as image data (Column 8, Middle and Figures 9 – 10: fluid flow is modeled using the velocity data (pretrain a model by processing fluid data), and Figure 2: the velocities are obtained using a camera setup (image processing algorithms)). It would have been obvious to one of ordinary skill in the art to combine the machine learning pipeline disclosed by ‘059 in view of Johnson (764) with the velocity imaging and modeling disclosed by Falahatpisheh (132). 
One of ordinary skill in the art would have been motivated to make this modification in order to improve velocity modeling (Falahatpisheh (132) Column 2, Lines 20 – 28).

Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over ‘059 in view of Johnson (764), and further in view of Dey et al., “GROUP EQUIVARIANT GENERATIVE ADVERSARIAL NETWORKS” (henceforth “Dey”). ‘059, Johnson (764) and Dey are analogous art because they solve the same problem of simulating fluid behavior, and because they are from the same field of endeavor of simulation of physical systems.

With regard to claim 9, ‘059 in view of Johnson (764) teaches all the elements of the parent claim 1, and does not appear to explicitly disclose: wherein the inductive priors comprise one or more equivariance to symmetry groups. However, Dey teaches: wherein the inductive priors comprise one or more equivariance to symmetry groups (Dey Page 3, Top). It would have been obvious to one of ordinary skill in the art to combine the machine learning pipeline disclosed by ‘059 in view of Johnson (764) with the equivariance to symmetry groups disclosed by Dey. One of ordinary skill in the art would have been motivated to make this modification in order to improve the synthesis output data (Dey Abstract: generation of conditional synthesis is improved).

Claims 10 – 12 and 18 – 19 are rejected under 35 U.S.C. 103 as being unpatentable over ‘059 in view of Johnson (764), and further in view of Andersen, M. (US 20110301882) (henceforth “Andersen (882)”). ‘059, Johnson (764) and Andersen (882) are analogous art because they solve the same problem of simulating fluid behavior, and because they are from the same field of endeavor of simulation of physical systems. 
With regard to claims 10 and 18, ‘059 in view of Johnson (764) teaches all the elements of the parent claims 1 and 14, and does not appear to explicitly disclose: wherein the code is further executable by the processor to receive an acoustic foundation model that is pretrained using acoustic field information. However, Andersen (882) teaches: wherein the code is further executable by the processor to receive an acoustic foundation model that is pretrained using acoustic field information (Paragraphs 10, 82 and 93: moving onto another application, the basic principle for passive-acoustic pig detection is simple: as for sand monitoring, an acoustic detector is mounted onto the production pipe and acts as a microphone for the ultrasonic frequency range; (one could also apply corrections accounting for flow parameter variation, e.g. using external flow input or extracted signal features into empirical models)). It would have been obvious to one of ordinary skill in the art to combine the machine learning pipeline disclosed by ‘059 in view of Johnson (764) with the acoustic field data used for modeling disclosed by Andersen (882). One of ordinary skill in the art would have been motivated to make this modification in order to improve modeling fluid data (Andersen (882) Paragraph 10).

With regard to claims 11 and 19, ‘059 in view of Johnson (764), and further in view of Andersen (882), teaches all the elements of the parent claims 10 and 17, and further teaches: wherein the code is executable by the processor to pretrain the acoustic foundation model using audio processing algorithms by processing the acoustic field information as audio signals (Andersen (882) Paragraphs 22 and 89). It would have been obvious to one of ordinary skill in the art to combine the machine learning pipeline disclosed by ‘059 in view of Johnson (764) with the acoustic field data used for modeling disclosed by Andersen (882). 
One of ordinary skill in the art would have been motivated to make this modification in order to improve modeling fluid data (Andersen (882) Paragraph 10).

With regard to claim 12, ‘059 in view of Johnson (764), and further in view of Andersen (882), teaches all the elements of the parent claim 10, and further teaches: wherein the code is executable by the processor to deploy the acoustic foundation model into the machine learning pipeline (Johnson (764) Paragraphs 6 and 60 – 62) together with the fluid foundation model responsive to the fluid dynamics application having acoustic field properties (Andersen (882) Paragraphs 22 – 25 and 96: utilizing detectable acoustic signals during fluid flow is directly related to fluid flow characterization itself; “Examples of related applications with fluid-carrying piping e.g. include fluid flow characterization”). It would have been obvious to one of ordinary skill in the art to combine the machine learning pipeline disclosed by ‘059 with the computer system disclosed by Johnson (764). One of ordinary skill in the art would have been motivated to make this modification in order to enable computerized processing of the algorithm (Johnson (764) Abstract). It would have been obvious to one of ordinary skill in the art to combine the machine learning pipeline disclosed by ‘059 in view of Johnson (764) with the acoustic modeling disclosed by Andersen (882). One of ordinary skill in the art would have been motivated to make this modification in order to improve modeling fluid data (Andersen (882) Paragraph 10).

Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over ‘059 in view of Johnson (764), and further in view of Leister et al., “On the Importance of Frictional Energy Dissipation in the Prevention of Undesirable Self-Excited Vibrations in Gas Foil Bearing Rotor Systems” (henceforth “Leister”). 
‘059, Johnson (764) and Leister are analogous art because they solve the same problem of simulating fluid behavior, and because they are from the same field of endeavor of simulation of physical systems.

With regard to claim 13, ‘059 in view of Johnson (764) teaches all the elements of the parent claim 1, and does not appear to explicitly disclose: wherein the fluid dynamics application comprises at least one of a vehicle mesh design, a plane design, an electric vertical takeoff and landing aircraft design, and a weather simulation. However, Leister teaches: wherein the fluid dynamics application comprises at least one of a vehicle mesh design, a plane design, an electric vertical takeoff and landing aircraft design, and a weather simulation (Abstract and Page 1: a foundation model is used for fluid-structure interaction (fluid dynamics application) which is usable for the bearings in commercial aircraft (a plane design)). It would have been obvious to one of ordinary skill in the art to combine the machine learning pipeline disclosed by ‘059 in view of Johnson (764) with a foundation model in an FSI model disclosed by Leister. One of ordinary skill in the art would have been motivated to make this modification in order to improve modeling fluid data (Leister Abstract).

Examiner General Comments

With regard to the prior art rejection(s), any cited portion of the relied upon reference(s), either by pointing to specific sections or as quotations, is intended to be interpreted in the context of the reference(s) as a whole as would be understood by one of ordinary skill in the art. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claim, other passages and figures may apply as well. 
It is respectfully requested that, in preparing responses, the applicant fully consider the references in their entirety as potentially teaching all or part of the claimed invention, since the entire reference is considered to provide disclosure relating to the cited portions. Further, the claims and only the claims form the metes and bounds of the invention. Office personnel are to give the claims their broadest reasonable interpretation in light of the supporting disclosure. Unclaimed limitations appearing in the specification are not read into the claims. Prior art was referenced using terminology familiar to one of ordinary skill in the art. Such an approach is broad in concept and can be either explicit or implicit in meaning. Examiner’s notes are provided with the cited references to assist the applicant to better understand how the examiner interprets the applied prior art. Such comments are entirely consistent with the intent and spirit of compact prosecution.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALFRED H. WECHSELBERGER, whose telephone number is (571) 272-8988. The examiner can normally be reached M – F, 10am to 6pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Emerson Puente, can be reached at 571-272-3652. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. 
To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ALFRED H. WECHSELBERGER/
Examiner, Art Unit 2187

/EMERSON C PUENTE/
Supervisory Patent Examiner, Art Unit 2187

Prosecution Timeline

Oct 11, 2022
Application Filed
Mar 07, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12561501 — SYSTEM AND METHOD FOR EXCESS GAS UTILIZATION — Granted Feb 24, 2026 (2y 5m to grant)
Patent 12517804 — GENERATING TECHNOLOGY ENVIRONMENTS FOR A SOFTWARE APPLICATION — Granted Jan 06, 2026 (2y 5m to grant)
Patent 12468581 — INTER-KERNEL DATAFLOW ANALYSIS AND DEADLOCK DETECTION — Granted Nov 11, 2025 (2y 5m to grant)
Patent 12462075 — RESOURCE PREDICTION SYSTEM FOR EXECUTING MACHINE LEARNING MODELS — Granted Nov 04, 2025 (2y 5m to grant)
Patent 12450145 — ADVANCED SIMULATION MANAGEMENT TOOL FOR A MEDICAL RECORDS SYSTEM — Granted Oct 21, 2025 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 58%
With Interview (+36.5%): 94%
Median Time to Grant: 3y 8m
PTA Risk: Low
Based on 212 resolved cases by this examiner. Grant probability derived from career allow rate.
