Prosecution Insights
Last updated: April 19, 2026
Application No. 17/867,286

Aggregating a dataset into a function term with the aid of transformer networks

Non-Final OA (§101, §103, §112)

Filed: Jul 18, 2022
Examiner: WAJE, CARLO C
Art Unit: 2151
Tech Center: 2100 — Computer Architecture & Software
Assignee: Robert Bosch GmbH
OA Round: 1 (Non-Final)

Grant Probability: 69% (Favorable)
Expected OA Rounds: 1-2
Estimated Time to Grant: 3y 0m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 69% (155 granted / 225 resolved), above average at +13.9% vs Tech Center average
Interview Lift: +32.6% on resolved cases with an interview
Typical Timeline: 3y 0m average prosecution; 45 applications currently pending
Career History: 270 total applications across all art units

Statute-Specific Performance

§101: 25.3% (-14.7% vs TC avg)
§103: 26.3% (-13.7% vs TC avg)
§102: 11.1% (-28.9% vs TC avg)
§112: 33.7% (-6.3% vs TC avg)

Tech Center averages are estimates. Based on career data from 225 resolved cases.

Office Action

Grounds of rejection: §101, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

The present application, 17/867,286, filed 07/18/2022, claims foreign priority to DE 10 2021 207 936.9, filed 07/23/2021.

Information Disclosure Statement

The information disclosure statements (IDS) submitted on 07/18/2022 and 09/07/2022 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements have been considered by the examiner.

Claim Objections

Claim 18 is objected to under 37 C.F.R. 1.71(a), which requires "full, clear, concise, and exact terms" so as to enable any person skilled in the art or science to which the invention or discovery appertains, or with which it is most nearly connected, to make and use the same. The following should be corrected:

A. Claim 18 recites "its/their" in line 18, which creates confusion as to which term the pronouns refer. Applicant may wish to amend the claim to recite "further comprising: conveying the position of one or more elementary function expressions of at least one candidate function term in the candidate function term to the transformer network" for better clarity.

Claim Interpretation

The broadest reasonable interpretation of a method (or process) claim having contingent limitations requires only those steps that must be performed and does not include steps that are not required to be performed because the condition(s) precedent are not met. For example, assume a method claim requires step A if a first condition happens and step B if a second condition happens. If the claimed invention may be practiced without either the first or second condition happening, then neither step A nor step B is required by the broadest reasonable interpretation of the claim.
If the claimed invention requires the first condition to occur, then the broadest reasonable interpretation of the claim requires step A. If the claimed invention requires both the first and second conditions to occur, then the broadest reasonable interpretation of the claim requires both steps A and B. See MPEP 2111.04 II for more information.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 17-31 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 17 recites "assembling of the renewed sampled expressions to form one or more complete candidate function terms will likely improve the evaluation then obtained" in lines 20-21. It is unclear whether the improved evaluation is required or optional because of the use of the word "likely." Claims 30-31 recite the same limitations and are rejected for the same reason. Claims 18-29 inherit the same deficiency as claim 17 by reason of dependence. Furthermore, there is insufficient antecedent basis for the term "the evaluation" in the claim.
For purposes of examination, this is interpreted as: assembling of the renewed sampled function expressions to form one or more complete candidate function terms will improve an evaluation of the one or more complete candidate function terms. Claims 30-31 recite the same limitations and are rejected for the same reason. Claims 18-29 inherit the same deficiency as claim 17 by reason of dependence.

Further, claim 17 recites "the sampling of elementary function expressions" in line 22. It is unclear whether this refers to the sampling of the one or the plurality of elementary function expressions or to the sampling of the further elementary function expressions. For purposes of examination, this is interpreted to refer to the sampling of the one or the plurality of elementary function expressions. Claims 30-31 recite the same limitations and are rejected for the same reason. Claims 18-29 inherit the same deficiency as claim 17 by reason of dependence.

Further, claim 17 recites "the best evaluation" in line 25. There is insufficient antecedent basis for this limitation in the claim. For purposes of examination, this is interpreted as a best evaluation. Claims 30-31 recite the same limitations and are rejected for the same reason. Claims 18-29 inherit the same deficiency as claim 17 by reason of dependence.

Claim 19 recites "the elementary function expressions" in lines 2-3 and "the candidate function term" in line 3. It is unclear whether "the elementary function expressions" refers to the one or plurality of elementary function expressions or to the further elementary function expressions, and whether "the candidate function term" refers to the one candidate function term or to the one complete candidate function term. For purposes of examination, these are interpreted to refer to the one or plurality of elementary function expressions and the one or more candidate function terms, respectively.
Claims 20 and 29 recite instances of "the elementary function expressions" and/or "the candidate function term" and are rejected for the same reasons. Claims 20-22 inherit the same deficiency as claim 19 by reason of dependence. Claims 21-22 inherit the same deficiency as claim 20 by reason of dependence.

Claim 20 recites "operators or functions on the one hand and operands on the other hand form the nodes, and a node which belongs to an operator or a function has as children the nodes that belong to the operands that are processed by the operator or this function" in lines 5-8. There is insufficient antecedent basis for the underlined limitations in the claim. For purposes of examination, these are interpreted as: operators or functions on one hand and operands on another hand form nodes, and a node which belongs to an operator or a function has as children nodes that belong to the operands that are processed by the operator or the function. Claims 21-22 inherit the same deficiency as claim 20 by reason of dependence.

Claim 22 recites "the numerical codes" in line 1. It is unclear whether this refers to the numerical codes assigned to the elementary function expressions in claim 19, to the numerical codes assigned to non-occupied positions in the tree in claim 21, or to both. For purposes of examination, this is interpreted to refer to the numerical codes assigned to the elementary function expressions.

Claim 25 recites "the output variable values" in lines 1-2. There is insufficient antecedent basis for this limitation in the claim. For purposes of examination, this is interpreted to refer to the output variable value instead. Claim 26 inherits the same deficiency as claim 25 by reason of dependence.

Claim 26 recites "the output variable" in line 1. There is insufficient antecedent basis for this limitation in the claim. For purposes of examination, this is interpreted to refer to the output variable value instead.
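The expression-tree representation at issue in claims 19-22 (operators and functions as internal nodes, operands as their children, and node positions encoded as vectors with one component per tree level) can be pictured with a short sketch. This is an illustration only: the 0/1 branching directions and the -1 filler for unreached levels are assumed conventions, not anything the claims or the action fix.

```python
# Illustrative path encoding for node positions in a semantic expression tree,
# in the spirit of claims 20-22: operators/functions are internal nodes,
# operands are their children, and each position is a vector with a separate
# component per tree level giving the branching direction taken from the root.
# The conventions (0 = left branch, 1 = right branch, -1 = level not reached)
# are assumptions for this sketch.
DEPTH = 3  # levels below the root covered by the encoding

def position_code(path):
    """Encode a root-to-node path (sequence of 0/1 directions) as a
    fixed-length vector with one component per tree level."""
    return [path[i] if i < len(path) else -1 for i in range(DEPTH)]

# For the term (x + 1) * x, with "*" at the root:
root_code = position_code(())      # the root itself: no branching taken
plus_code = position_code((0,))    # "+" is the left child of "*"
one_code = position_code((0, 1))   # "1" is the right child of "+"
```

Under these assumed conventions, claim 21's codes for non-occupied positions would simply be the codes of empty slots in the same fixed-depth grid.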
Claim 28 recites "the evaluation" in lines 3 and 4. There is insufficient antecedent basis for this limitation in the claim. For purposes of examination, the first recitation is interpreted as an evaluation instead.

Claim 29 recites "the optimization" in line 3. There is insufficient antecedent basis for this limitation in the claim. For purposes of examination, this is interpreted as an optimization.

Claim 31 recites "One or more computers configured to aggregate a dataset, which respectively assigns an output variable value to a plurality of input variable vectors, into a function term, the one or more computers configured to: sample … aggregated." This limitation is unclear because claim 31 is a machine claim, and a machine is "a concrete thing, consisting of parts, or of certain devices and combination of devices." However, claim 31 does not recite any parts or combination of parts of the one or more computers. See MPEP 2106.03 I for more information. Examiner suggests reciting structural components of the one or more computers performing the recited functions.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 17-31 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Under Step 1, claims 17-29 recite a series of steps and are, therefore, processes. Claim 30 recites a non-transitory machine-readable data carrier and is, therefore, an article of manufacture. Claim 31 recites one or more computers and is, therefore, a machine.
Under Step 2A prong 1, claim 17 recites:

A method for aggregating a dataset, which respectively assigns an output variable value to a plurality of input variable vectors, into a function term, the method including the following steps:
sampling one or a plurality of elementary function expressions from a given alphabet using a neural network, the neural network being a transformer network;
assembling the one or plurality of elementary function expressions to form one or more candidate function terms;
checking whether the one or more candidate function terms is complete;
based on the one or more candidate function terms being not yet complete, branching back for sampling further elementary function expressions;
based on the one or more candidate function terms being complete, respectively mapping the input variable vectors onto associated candidate output variable values using each of the one or more candidate function terms;
evaluating a deviation between the associated candidate output variable values and corresponding output variable values from the dataset using a predefined metric;
checking whether a predefined abort condition is satisfied;
based on the abort condition not being satisfied: updating parameters that characterize a behavior of the transformer network with a goal that a renewed sampling of function expressions and assembling of the renewed sampled expressions to form one or more complete candidate function terms will likely improve the evaluation then obtained, and branching back to the sampling of elementary function expressions using the transformer network; and
based on the predefined abort condition being satisfied, ascertaining a candidate function term of the one or more candidate function terms having the best evaluation as a desired function term into which the dataset is aggregated.
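Read as an algorithm, the recited loop is a sample-assemble-evaluate cycle. The sketch below is a minimal paraphrase, not the application's implementation: the transformer network is stubbed out with uniform random sampling, and the alphabet, arities, and mean-squared-deviation metric are hypothetical choices.

```python
import math
import random

# Minimal paraphrase of the claim 17 loop. The "transformer network" is
# stubbed out with uniform random sampling; alphabet, arities, and the
# mean-squared-deviation metric are hypothetical choices for this sketch.
ALPHABET = ["x", "1", "+", "*", "sin"]   # elementary function expressions
ARITY = {"x": 0, "1": 0, "+": 2, "*": 2, "sin": 1}

def sample_candidate(rng, max_len=20):
    """Sample expressions and assemble them in prefix order, branching back
    until the candidate function term is complete."""
    tokens, needed = [], 1
    while needed > 0:
        pool = ALPHABET if len(tokens) < max_len else ["x", "1"]
        tok = rng.choice(pool)           # stand-in for transformer sampling
        tokens.append(tok)
        needed += ARITY[tok] - 1         # operators open new operand slots
    return tokens

def apply_term(tokens, x):
    """Map one input value through the prefix-notation candidate term."""
    def walk(i):
        tok = tokens[i]
        if tok == "x":
            return x, i + 1
        if tok == "1":
            return 1.0, i + 1
        if tok == "sin":
            a, j = walk(i + 1)
            return math.sin(a), j
        a, j = walk(i + 1)
        b, k = walk(j)
        return (a + b, k) if tok == "+" else (a * b, k)
    return walk(0)[0]

rng = random.Random(0)
dataset = [(0.0, 0.0), (1.0, 2.0), (2.0, 4.0)]   # toy (input, output) pairs
term = sample_candidate(rng)
# predefined metric: mean squared deviation of candidate vs. dataset outputs
deviation = sum((apply_term(term, x) - y) ** 2 for x, y in dataset) / len(dataset)
```

In the claimed method the abort check and parameter update would wrap this cycle; here the single pass just illustrates completeness checking and deviation evaluation.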
The above limitations are related to symbolic regression, which searches for a mathematical function that best models the correlation, included in the dataset, between the input variables and the output variable (see pages 1-2 of the specification). This falls within the "Mathematical Concepts" and/or "Mental Processes" grouping of abstract ideas. See also MPEP 2106.04(a)(2) I.A, which gives "organizing information and manipulating information through mathematical correlations" as an example of the mathematical concepts grouping of abstract ideas.

Further, the steps of "sampling", "assembling", "checking", "evaluating", and "checking" form a process that, under its broadest reasonable interpretation, covers performance of the limitations in the mind. That is, other than reciting "using a neural network, the neural network being a transformer network", nothing in the claim precludes the steps from practically being performed in the human mind. For example, but for "using a neural network, the neural network being a transformer network", the claim encompasses, using pen and paper: manually sampling (selecting) functions 3a and 3b from a given alphabet of functions 3a-3d; combining the functions 3a and 3b to form a candidate function term; checking that the candidate function term is complete; mapping the input variables onto the candidate function term to produce candidate output variable values; evaluating a deviation between the output variable values and the candidate output variable values using the formulas on pages 8-9; checking whether an abort condition is satisfied based on the deviation; updating parameters of the network to improve the deviation on renewed sampling if the abort condition is not satisfied; and ascertaining the function term that best models the correlation in the dataset if the abort condition is satisfied. Accordingly, the claim is directed to an abstract idea.
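The deviation evaluation the examiner treats as a mental step is, concretely, a scalar error metric. The specification's exact formulas on pages 8-9 are not reproduced in the action, so the sketch below uses the generic candidates named later in the §103 discussion (RMSE, MSE, and mean absolute deviation); treat it as a stand-in, not the applicant's definition.

```python
import math

# Generic deviation metrics of the kind named in the action (RMSE, MSE, and
# ABS appear later as Buchner's fitness methods). The specification's own
# formulas are not reproduced here; these are standard stand-ins.
def mse(candidate, actual):
    return sum((c - a) ** 2 for c, a in zip(candidate, actual)) / len(actual)

def rmse(candidate, actual):
    return math.sqrt(mse(candidate, actual))

def mean_abs(candidate, actual):
    return sum(abs(c - a) for c, a in zip(candidate, actual)) / len(actual)

y_candidate = [1.0, 2.0, 4.0]   # candidate output variable values
y_dataset = [1.0, 2.0, 2.0]     # output variable values from the dataset
scores = (mse(y_candidate, y_dataset),
          rmse(y_candidate, y_dataset),
          mean_abs(y_candidate, y_dataset))
```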
Under step 2A prong 2, the claim recites the following additional elements: a neural network, the neural network being a transformer network. However, the additional elements "a neural network" and "a transformer network" are recited at a high level of generality (i.e., as generic neural networks without any specific structural configuration), such that they amount to no more than merely reciting the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement the abstract idea using a generic computer component as a tool. The additional elements do not, individually or in combination, integrate the exception into a practical application. Accordingly, the claim is not integrated into a practical application.

Under step 2B, claim 17 does not include additional elements that, individually or in combination, are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements "a neural network" and "a transformer network" are recited at a high level of generality, such that they amount to no more than mere instructions to implement the abstract idea using a generic computer component as a tool. The claim does not recite additional elements that, alone or in combination, amount to an inventive concept. Accordingly, the claim does not amount to significantly more than the abstract idea.

Under step 2A prong 1, claims 18-29 recite the same abstract idea as claim 17 by reason of dependence.
Further, the dependent claims recite additional details of the abstract idea:

claim 19 recites "numerical codes are respectively assigned to the elementary function expressions from the alphabet, and their positions in the candidate function term, at least one candidate function term is converted into a representation formed from the numerical codes";

claim 20 recites "wherein the numerical codes for the positions of elementary function expressions in the candidate function term indicate positions of the elementary function expressions in a semantic expression tree of the candidate function term, in which: operators or functions on the one hand and operands on the other hand form the nodes, and a node which belongs to an operator or a function has as children the nodes that belong to the operands that are processed by the operator or this function";

claim 21 recites "wherein numerical codes are assigned also to non-occupied positions in the tree";

claim 22 recites "wherein the numerical codes include vectors that respectively have separate components for levels of the tree, and each component assigned to a level indicates a direction in which branching took place on a path from a root of the tree to the node in a transition to the respective level";

claim 23 recites "wherein the parameters that characterize the behavior are optimized toward a goal of improving an evaluation averaged across a plurality or distribution of candidate function terms";

claim 24 recites "wherein only deviations that stem from a selection of best-evaluated candidate function terms are used for updating the parameters";

claim 27 recites "mapped as components of the input variable vectors, using the ascertained function term, to output variable values; an actuation signal is formed from the output variable values"; and

claim 28 recites "wherein: the alphabet is restricted to operators or functions that are available for the evaluation of the ascertained function term, and the predefined embedded platform is set up for the evaluation of the ascertained function term".

These limitations fall within the "Mathematical Concepts" and/or "Mental Processes" grouping of abstract ideas. In particular, claims 20-24 do not include additional elements that would require further analysis under step 2A prong 2 and step 2B. Accordingly, the claims are directed to an abstract idea.

Under step 2A prong 2:

Claim 18 recites the following additional elements: wherein one or more elementary function expressions of at least one candidate function term and its/their positions in the candidate function term is/are additionally conveyed to the transformer network.

Claim 19 recites the following additional elements: the representation is supplied to the transformer network.

Claim 25 recites the following additional elements: wherein the input variable vectors and/or the output variable values include measured data that were recorded using at least one sensor.

Claim 26 recites the following additional elements: wherein the output variable is a measured variable of a first sensor, and the input variable vectors include measured variables of further sensors from which the measured variable of the first sensor is ascertainable at least as an approximation.

Claim 27 recites the following additional elements: measured data that were recorded using at least one sensor, and a vehicle is actuated using the actuation signal.

Claim 28 recites the following additional elements: a predefined embedded platform.
Claim 29 recites the following additional elements: wherein the elementary function expressions of at least one best-evaluated candidate function term and their positions in the best-evaluated candidate function term in multiple epochs of the optimization are supplied to the transformer network.

However, the additional elements of "wherein the input variable vectors and/or the output variable values include measured data that were recorded using at least one sensor" in claim 25; "wherein the output variable is a measured variable of a first sensor, and the input variable vectors include measured variables of further sensors from which the measured variable of the first sensor is ascertainable at least as an approximation" in claim 26; and "measured data that were recorded using at least one sensor" in claim 27 merely add extra-solution activity, i.e., mere data gathering, and merely generally link the use of a judicial exception to a particular technological environment or field of use by limiting the data-gathering step to a particular type and source of data (measured data from at least one sensor in claim 25; a measured variable of a first sensor and measured variables of further sensors in claim 26; and measured data from at least one sensor in claim 27).

The additional elements of "a vehicle is actuated using the actuation signal" in claim 27 and "predefined embedded platform" in claim 28 also merely generally link the use of a judicial exception to a particular technological environment or field of use, namely one that includes a vehicle actuated by a signal generated from the output variables and an alphabet of elementary function expressions restricted to functions available on a predefined embedded platform. See MPEP 2106.05(h) for more information.
The additional elements of "wherein one or more elementary function expressions of at least one candidate function term and its/their positions in the candidate function term is/are additionally conveyed to the transformer network" in claim 18; "the representation is supplied to the transformer network" in claim 19; and "wherein the elementary function expressions of at least one best-evaluated candidate function term and their positions in the best-evaluated candidate function term in multiple epochs of the optimization are supplied to the transformer network" in claim 29 merely add insignificant extra-solution activity. The additional elements do not, individually or in combination, integrate the exception into a practical application. Accordingly, the claims are not integrated into a practical application.

Under step 2B, claims 18-19 and 25-29 do not include additional elements that, individually or in combination, are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements in claims 25-27 directed to measured data recorded using one or more sensors merely add extra-solution activity (mere data gathering) and merely generally link the use of the judicial exception to a particular technological environment or field of use, and the additional elements of "a vehicle is actuated using the actuation signal" in claim 27 and "predefined embedded platform" in claim 28 likewise merely generally link the exception to a particular technological environment or field of use. See MPEP 2106.05(h) for more information.

The additional elements of claims 18, 19, and 29 directed to conveying or supplying expressions, positions, or representations to the transformer network merely add insignificant extra-solution activity. See MPEP 2106.05(d)(II), which states that the courts have recognized computer functions such as "receiving or transmitting data over a network" and "storing and retrieving information in memory" as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity.
The claims do not recite additional elements that, alone or in combination, amount to an inventive concept. Accordingly, the claims do not amount to significantly more than the abstract idea.

Regarding claim 30, it is directed to a non-transitory machine-readable data carrier on which is stored a computer program including machine-readable instructions to perform the method of claim 17. The claim 17 analysis applies equally to claim 30. Additional limitations recited in claim 30 are discussed below.

Regarding claim 31, it is directed to one or more computers configured to execute the method of claim 17. The claim 17 analysis applies equally to claim 31. Additional limitations recited in claim 31 are discussed below.

Under step 2A prong 2, claim 30 recites the following additional element: a computer. Claim 31 recites the following additional elements: one or more computers. However, the additional elements of "a computer" in claim 30 and "one or more computers" in claim 31 are recited at a high level of generality (i.e., as a generic computer executing a series of mathematical operations), such that they amount to no more than mere instructions to implement the abstract idea using a generic computer component as a tool. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data), or simply adding a general-purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation), does not integrate a judicial exception into a practical application or provide significantly more. See MPEP 2106.05(f) for more information. The additional elements do not, individually or in combination, integrate the exception into a practical application. Accordingly, the claims are not integrated into a practical application.
Under step 2B, claims 30 and 31 do not include additional elements that, individually or in combination, are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of "a computer" in claim 30 and "one or more computers" in claim 31 are recited at a high level of generality, such that they amount to no more than mere instructions to implement the abstract idea using a generic computer component as a tool. See MPEP 2106.05(f) for more information. The claims do not recite additional elements that, alone or in combination, amount to an inventive concept. Accordingly, the claims do not amount to significantly more than the abstract idea.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 17-20, 22-24, and 28-31 are rejected under 35 U.S.C. 103 as being unpatentable over Buchner et al. (NPL: "An Artificial-Intelligence-Based Method to Automatically Create Interpretable Models from Data Targeting Embedded Control Applications"), hereinafter Buchner, in view of Petersen (NPL: "Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients"), and Biggio et al. (NPL: "Neural Symbolic Regression that Scales"), hereinafter Biggio. Petersen is cited in the IDS submitted on 07/18/2022.
Regarding claim 31, Buchner teaches one or more computers configured to aggregate a dataset, which respectively assigns an output variable value to a plurality of input variable vectors, into a function term, the one or more computers configured to (Buchner section 3.10: "All computations have been carried out on a standard laptop computer (Intel(R) Core(TM) i7-8650 CPU @ 1.90GHz, 32 GB RAM) on a single core"; section 3.3; one or more computers – computer; dataset – dataset; output variable value – model output; plurality of input variable vectors – m inputs; function term – Fi):

sample one or a plurality of elementary function expressions from a given alphabet (Buchner sections 3.1 and 3.2; one or a plurality of elementary function expressions – functional expressions; alphabet – function set);

assemble the one or plurality of elementary function expressions to form one or more candidate function terms (Buchner section 3.2; one or more candidate function terms – model/directed graphs with a tree structure representing the functional expressions);

based on the one or more candidate function terms being complete, respectively map the input variable vectors onto associated candidate output variable values using each of the one or more candidate function terms (Buchner section 3.2; associated candidate output variable values – model output y_mod,i);

evaluate a deviation between the associated candidate output variable values and corresponding output variable values from the dataset using a predefined metric (Buchner sections 3.4-3.6; deviation – fitness (equations 4-6); corresponding output variable values – y_meas; predefined metric – fitness method (RMSE, MSE, ABS and/or Parsimony Coefficient));

check whether a predefined abort condition is satisfied (Buchner Fig. 1 and section 3.5; predefined abort condition – termination criteria);

based on the abort condition not being satisfied: update parameters that characterize a behavior of the … (Buchner Fig. 1 and sections 3.7-3.8); and

based on the predefined abort condition being satisfied, ascertain a candidate function term of the one or more candidate function terms having the best evaluation as a desired function term into which the dataset is aggregated (Buchner Fig. 1 and section 3.5; desired function term – model with the value OF(y_mod,i; y_meas) smaller than the termination threshold).

Buchner does not explicitly teach: sample one or a plurality of elementary function expressions from a given alphabet using a neural network, the neural network being a transformer network; check whether the one or more candidate function terms is complete; based on the one or more candidate function terms being not yet complete, branch back for sampling further elementary function expressions; and update parameters that characterize a behavior of the transformer network with a goal that a renewed sampling of function expressions and assembling of the renewed sampled expressions to form one or more complete candidate function terms will likely improve the evaluation then obtained, and branch back to the sampling of elementary function expressions using the transformer network.

However, in the same field of endeavor, Petersen discloses sampling one or a plurality of elementary function expressions from a given alphabet using a neural network; checking whether the one or more candidate function terms is complete; and, based on the one or more candidate function terms being not yet complete, branching back for sampling further elementary function expressions (Petersen page 11; neural network – RNN).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Buchner using Petersen and use a neural network to perform the sampling of the one or the plurality of elementary function expressions from the given alphabet, so that the probability of sampling can be defined from the probability distribution of a softmax layer (Petersen Fig. 1 and section 3.1, second and third paragraphs; Algorithm 2). Further, it would have been obvious to include a test to check whether the one or more candidate function terms is complete and, based on the one or more candidate function terms being not yet complete, branch back for sampling further elementary function expressions in order to complete the expression before mapping the input variable vectors to the expression (Petersen Fig. 1 and Algorithm 2). As disclosed in Buchner section 3.2, each model (candidate function term) includes the number of inputs to the graph; therefore, it would have been obvious to build the graph completely before evaluating it.

Therefore, the combination of Buchner as modified in view of Petersen teaches: sample one or a plurality of elementary function expressions from a given alphabet using a neural network; check whether the one or more candidate function terms is complete; and, based on the one or more candidate function terms being not yet complete, branch back for sampling further elementary function expressions.

Buchner as modified in view of Petersen does not currently teach: the neural network being a transformer network; update parameters that characterize a behavior of the transformer network with a goal that a renewed sampling of function expressions and assembling of the renewed sampled expressions to form one or more complete candidate function terms will likely improve the evaluation then obtained; and branch back to the sampling of elementary function expressions using the transformer network.
However, in the same field of endeavor, Biggio discloses sampling one or a plurality of elementary function expressions from a given alphabet using a neural network, the neural network being a transformer network (Biggio Fig. 1 and section 3.1; transformer network – transformer). Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Buchner as modified by Petersen and use a transformer network to perform the sampling of the one or the plurality of elementary function expressions from the given alphabet in order to handle large datasets, because the number n of input-output pairs can grow to large values and the computation in Set Transformers scales as O(nm) instead of O(n²), where m ≪ n (Biggio section 4.1; page 11 section A.1).

Therefore, the combination of Buchner as modified in view of Petersen and Biggio teaches: the neural network being a transformer network; update parameters that characterize a behavior of the transformer network with a goal that a renewed sampling of function expressions and assembling of the renewed sampled expressions to form one or more complete candidate function terms will likely improve the evaluation then obtained; and branch back to the sampling of elementary function expressions using the transformer network.

Regarding claim 30, it is directed to a non-transitory machine-readable data carrier on which is stored a computer program including machine-readable instructions executed by the one or more computers of claim 31. The claim 31 analysis applies equally to claim 30.

Regarding claim 17, it is directed to a method practiced by the one or more computers of claim 31. The claim 31 analysis applies equally to claim 17.

Regarding claim 18, Buchner as modified in view of Petersen and Biggio teaches all the limitations of claim 17 as stated above.
Further, Buchner as modified in view of Petersen and Biggio teaches wherein one or more elementary function expressions of at least one candidate function term and its/their positions in the candidate function term is/are additionally conveyed to the transformer network (Buchner section 3.2; Petersen Fig. 1; positions in the candidate function term – position/number in the graph).

Regarding claim 19, Buchner as modified in view of Petersen and Biggio teaches all the limitations of claim 17 as stated above. Further, Buchner as modified in view of Petersen and Biggio teaches wherein: numerical codes are respectively assigned to the elementary function expressions from the alphabet, and their positions in the candidate function term; at least one candidate function term is converted into a representation formed from the numerical codes; and the representation is supplied to the transformer network (Buchner section 4.1, Equation 7, and section 3.2 “Starting with this step, the models are represented as directed graphs with a tree structure (Augusto and Barbosa (2000))”; see provided copy of Augusto and Barbosa (2000) in at least Figs. 3-5 as evidence; numerical codes – value assigned to each vertex).

Regarding claim 20, Buchner as modified in view of Petersen and Biggio teaches all the limitations of claim 19 as stated above.
Further, Buchner as modified in view of Petersen and Biggio teaches wherein the numerical codes for the positions of elementary function expressions in the candidate function term indicate positions of the elementary function expressions in a semantic expression tree of the candidate function term, in which: operators or functions on the one hand and operands on the other hand form the nodes, and a node which belongs to an operator or a function has as children the nodes that belong to the operands that are processed by the operator or this function (Buchner section 4.1, Equation 7, and section 3.2 “Starting with this step, the models are represented as directed graphs with a tree structure (Augusto and Barbosa (2000))”; see provided copy of Augusto and Barbosa (2000) as evidence. Alternatively, Petersen Fig. 1).

Regarding claim 22, Buchner as modified in view of Petersen and Biggio teaches all the limitations of claim 20 as stated above. Further, Buchner as modified in view of Petersen and Biggio teaches wherein the numerical codes include vectors that respectively have separate components for levels of the tree, and each component assigned to a level indicates a direction in which branching took place on a path from a root of the tree to the node in a transition to the respective level (Buchner section 4.1, Equation 7, and section 3.2 “Starting with this step, the models are represented as directed graphs with a tree structure (Augusto and Barbosa (2000))”; see provided copy of Augusto and Barbosa (2000) as evidence. Alternatively, Petersen section 3.1 “The sampling process is illustrated in Figure 1; pseudocode is provided in Algorithm 2 in Appendix A. Starting at the root node, a token is sampled from the emitted categorical distribution. Subsequent tokens are sampled autoregressively until the tree is complete (i.e. all tree branches reach terminal nodes).
The resulting sequence of tokens is the tree’s pre-order traversal, which can be used to reconstruct the tree and instantiate its corresponding expression”; “In general, a pre-order traversal is insufficient to uniquely reconstruct the tree. However, here we know how many children each node has based on its value, e.g. “multiply” is a binary operator and thus has two children”).

Regarding claim 23, Buchner as modified in view of Petersen and Biggio teaches all the limitations of claim 17 as stated above. Further, Buchner as modified in view of Petersen and Biggio teaches wherein the parameters that characterize the behavior of the transformer network are optimized toward a goal of improving an evaluation averaged across a plurality or distribution of candidate function terms (Buchner section 3.6).

Regarding claim 24, Buchner as modified in view of Petersen and Biggio teaches all the limitations of claim 17 as stated above. Further, Buchner as modified in view of Petersen and Biggio teaches wherein only deviations that stem from a selection of best-evaluated candidate function terms are used for updating the parameters (Buchner section 3.7 “Fraction Top: The fraction of the population defines the size of the high-fitness group. The group is filled with the graphs having the highest fitness. [0.1]. Probability Top: Probability to employ the high-fitness group for fitness-based selection. [0.9]”).

Regarding claim 28, Buchner as modified in view of Petersen and Biggio teaches all the limitations of claim 17 as stated above. Further, Buchner as modified in view of Petersen and Biggio teaches wherein: the alphabet is restricted to operators or functions that are available on a predefined embedded platform for the evaluation of the ascertained function term, and the predefined embedded platform is set up for the evaluation of the ascertained function term (Buchner section 4.2 “The following algorithm settings were changed from their default values (section 3).
For the given system abstraction and inputs, a purely equations-based description was not expected to be suitable. Hence, the employed functions were set as x+y, x-y, x*y, x/y, 1D Map with 10 sampling points, 2D Map with 5x5 sampling points”).

Regarding claim 29, Buchner as modified in view of Petersen and Biggio teaches all the limitations of claim 23 as stated above. Further, Buchner as modified in view of Petersen and Biggio teaches wherein the elementary function expressions of at least one best-evaluated candidate function term and their positions in the best-evaluated candidate function term in multiple epochs of the optimization are supplied to the transformer network (Buchner Figs. 1 and 3 and sections 3.2 and 4.2; the positions of the elementary function expressions are supplied to the transformer network to generate the model shown in Fig. 3).

Claim 21 is rejected under 35 U.S.C. 103 as being unpatentable over Buchner in view of Petersen and Biggio as applied to claim 20 above, and further in view of Song et al. (NPL – “Design of a Flexible Wearable Smart sEMG Recorder Integrated Gradient Boosting Decision Tree Based Hand Gesture Recognition”), hereinafter Song.

Regarding claim 21, Buchner as modified in view of Petersen and Biggio teaches all the limitations of claim 20 as stated above. Buchner does not explicitly teach wherein numerical codes are assigned also to non-occupied positions in the tree. However, in the same field of endeavor, Song discloses assigning values to non-occupied positions in a tree (Song Fig. 10 and page 1568 section IV.B). Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Buchner and generalize the teaching of Song by adding dummy nodes with dummy values to non-occupied positions in the tree in order to generate a binary tree with regular structure (Song Fig. 10 and page 1568 section IV.B).
Therefore, the combination of Buchner as modified in view of Petersen, Biggio and Song teaches wherein numerical codes are assigned also to non-occupied positions in the tree.

Claims 25-27 are rejected under 35 U.S.C. 103 as being unpatentable over Buchner in view of Petersen and Biggio as applied to claim 17 above, and further in view of Trainham (US 20230021908 A1).

Regarding claim 25, Buchner as modified in view of Petersen and Biggio teaches all the limitations of claim 17 as stated above. Further, Buchner as modified in view of Petersen and Biggio teaches wherein the input variable vectors and/or the output variable values include measured data that were recorded (Buchner section 4.2 system setup “2813 measurements at stationary operation points have been carried out for the quantities listed in Tab. 4, with equidistant variations in their range of operation”). Buchner does not explicitly teach wherein the input variable vectors and/or the output variable values include measured data that were recorded using at least one sensor. However, in the same field of endeavor, Trainham discloses input variable vectors and/or output variable values that include measured data that were recorded using at least one sensor (Trainham Fig. 1 and paragraphs [0029-0030]; at least one sensor – at least one of 111-121). Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Buchner using Trainham and record the input variable vectors, such as the engine speed, and/or the output variable values, such as the engine torque, using at least one sensor in order to model the torque in a vehicle system (Buchner section 3.2 and Trainham paragraphs [0029-0030]). Therefore, the combination of Buchner as modified in view of Petersen, Biggio and Trainham teaches wherein the input variable vectors and/or the output variable values include measured data that were recorded using at least one sensor.
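The tree encoding discussed for claims 19-22 and 21 above rests on two facts quoted from the references: a pre-order token sequence uniquely determines the tree once each token's arity is known (Petersen), and dummy values can be assigned to non-occupied positions to obtain a regular binary tree (Song). A minimal sketch, with illustrative token names and a per-level child-index path standing in for the claimed branch-direction position codes:

```python
# Arity table: how many children each token has ("mul" is binary, etc.).
ARITY = {"add": 2, "mul": 2, "sin": 1, "x": 0, "one": 0}

def position_codes(preorder):
    """Walk the pre-order sequence, reconstructing the tree from known
    arities and recording each token's path from the root, one branch
    direction (child index) per tree level."""
    pos, codes = 0, []
    def rec(path):
        nonlocal pos
        tok = preorder[pos]; pos += 1
        codes.append((tok, tuple(path)))
        for child in range(ARITY[tok]):
            rec(path + [child])
    rec([])
    return codes

def pad_complete(codes, depth):
    """Also assign codes to non-occupied positions (cf. the Song-style
    dummy nodes), yielding a complete binary tree of the given depth."""
    occupied = {path: tok for tok, path in codes}
    full, frontier = [], [()]
    for _ in range(depth + 1):
        nxt = []
        for p in frontier:
            full.append((occupied.get(p, "<pad>"), p))
            nxt += [p + (0,), p + (1,)]
        frontier = nxt
    return full

# add(mul(x, x), one) in pre-order: root at (), "mul" subtree on branch 0
codes = position_codes(["add", "mul", "x", "x", "one"])
padded = pad_complete(codes, 2)
```

Here `padded` holds all seven positions of a depth-2 complete binary tree, with `"<pad>"` at the two slots the expression does not occupy.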
Regarding claim 26, Buchner as modified in view of Petersen and Biggio teaches all the limitations of claim 25 as stated above. Further, Buchner as modified in view of Petersen and Biggio teaches wherein the output variable is a measured variable of a first sensor, and the input variable vectors include measured variables of further sensors from which the measured variable of the first sensor is ascertainable at least as an approximation (Trainham Fig. 1 and paragraphs [0029-0030]; first sensor – speed sensor 118; further sensors – at least sensors 114 and 121).

Regarding claim 27, Buchner as modified in view of Petersen and Biggio teaches all the limitations of claim 17 as stated above. Further, Buchner as modified in view of Petersen and Biggio teaches measured data that were recorded (Buchner section 4.2). Buchner does not explicitly teach wherein: measured data that were recorded using at least one sensor are mapped as components of the input variable vectors, using the ascertained function term, to output variable values; an actuation signal is formed from the output variable values; and a vehicle is actuated using the actuation signal. However, in the same field of endeavor, Trainham discloses measuring data using at least one sensor; forming an actuation signal from the measured data; and actuating a vehicle using the actuation signal (Trainham Fig. 1 and paragraphs [0029-0030]). Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Buchner using Trainham and record, using at least one sensor, the measured data that are used as inputs to generate the model output; form an actuation signal from the model output; and actuate a vehicle using the actuation signal in order to develop control strategies for components of physical systems, such as a vehicle system (Buchner Introduction and Trainham paragraphs [0028-0031]).
Therefore, the combination of Buchner as modified in view of Petersen, Biggio and Trainham teaches wherein: measured data that were recorded using at least one sensor are mapped as components of the input variable vectors, using the ascertained function term, to output variable values; an actuation signal is formed from the output variable values; and a vehicle is actuated using the actuation signal.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Augusto et al. (NPL – “Symbolic Regression via Genetic Programming”) is generally related to traversing a symbolic regression tree beginning at the root and then visiting the left child followed by the right child, known as pre-order.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Carlo Waje, whose telephone number is (571) 272-5767. The examiner can normally be reached 9:00-6:00 M-F. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, James Trujillo, can be reached at (571) 272-3677. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /Carlo Waje/Examiner, Art Unit 2151 (571)272-5767

Prosecution Timeline

Jul 18, 2022
Application Filed
Feb 26, 2026
Non-Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596529
CORDIC COMPUTATION OF SIN/COS USING COMBINED APPROACH IN ASSOCIATIVE MEMORY
2y 5m to grant Granted Apr 07, 2026
Patent 12591409
CONVERTER FOR CONVERTING DATA TYPE, CHIP, ELECTRONIC DEVICE, AND METHOD THEREFOR
2y 5m to grant Granted Mar 31, 2026
Patent 12585431
SEMICONDUCTOR DEVICE AND ELECTRONIC DEVICE
2y 5m to grant Granted Mar 24, 2026
Patent 12578924
ADDER CELL AND INTEGRATED CIRCUIT INCLUDING THE SAME
2y 5m to grant Granted Mar 17, 2026
Patent 12561114
PARALLEL PROCESSING OF A SOFTMAX OPERATION BY DIVIDING AN INPUT VECTOR INTO A PLURALITY OF FRAGMENTS
2y 5m to grant Granted Feb 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
69%
Grant Probability
99%
With Interview (+32.6%)
3y 0m
Median Time to Grant
Low
PTA Risk
Based on 225 resolved cases by this examiner. Grant probability derived from career allow rate.
