DETAILED ACTION
This Office action is made final. Claims 1-3 are pending. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
Applicant’s amendment dated 02/05/2026 amended claim 1.
Response to Amendment
The previously pending rejection of claims 1-3 under 35 U.S.C. 101 (Alice) is maintained.
Response to Arguments
Applicant’s arguments received on 02/05/2026 have been fully considered, but they are not persuasive. Moreover, any new grounds of rejection have been necessitated by Applicant's amendments to the claims. The art rejection has been updated to address these amendments.
Response to Arguments under 35 USC 101:
Applicant asserts that “Applicant respectfully submits that the pending claims are not directed to an abstract idea under Prong One.” Examiner respectfully disagrees.
Pursuant to the 2019 Revised Patent Subject Matter Eligibility Guidance, in order to determine whether a claim is directed to an abstract idea under Step 2A, we first (1) determine whether the claim recites limitations, individually or in combination, that fall within the enumerated subject matter groupings of abstract ideas (mathematical concepts, certain methods of organizing human activity, or mental processes), and (2) determine whether any additional elements beyond the recited abstract idea, individually and as an ordered combination, integrate the judicial exception into a practical application. 84 Fed. Reg. 52, 54-55. Next, if a claim (1) recites an abstract idea and (2) does not integrate that exception into a practical application, then, in order to determine whether the claim recites an “inventive concept” under Step 2B, we determine whether any additional elements beyond the recited abstract idea, individually and in combination, amount to significantly more than the abstract idea itself. 84 Fed. Reg. 56.
Here, under the first prong of Step 2A, the claims (claim 1) recite “receiving a first data set associated with a multivariate process; analyzing, the first data set to determine at least one interaction between at least two factors in the multivariate process and at least one correlation associated with the at least one interaction; identifying at least one factor to optimize to improve an execution of the multivariate process, responsive to the analyzing; generating a randomized controlled experiment to execute to determine whether optimizing the at least one identified factor improves the execution of the multivariate process by an amount exceeding a threshold amount of improvement; identifying a second data set needed to execute the randomized controlled experiment; acquiring the second data set; executing the randomized controlled experiment; generating a recommendation for optimizing the at least one factor; and providing the recommendation.”
A claim recites mental processes when it recites concepts performed in the human mind (including an observation, evaluation, judgment, or opinion); if the claim, under its broadest reasonable interpretation, covers performance in the mind but for the recitation of generic computer components, then the claim falls within the mental process category. Id. at 52 n.14. Therefore, contrary to Applicant’s assertions, the claims are directed to mental processes.
Applicant asserts that “Applicant argues that any abstraction of the limitations in the pending claim as hereby amended is integrated into a practical application due to the execution of the assessment engine, the analysis engine, and the computing device that execute the steps recited in the pending claims.” Examiner respectfully disagrees.
As discussed above, under the second prong of Step 2A, we determine whether any additional elements beyond the recited abstract idea, individually and as an ordered combination, integrate the judicial exception into a practical application. 84 Fed. Reg. 52, 54-55.
Here, under the second prong of Step 2A, the only additional elements beyond the recited abstract idea of claim 1 are the recitations of “receiving a first data set associated with a multivariate process; analyzing, the first data set to determine at least one interaction between at least two factors in the multivariate process and at least one correlation associated with the at least one interaction; identifying at least one factor to optimize to improve an execution of the multivariate process, responsive to the analyzing; generating a randomized controlled experiment to execute to determine whether optimizing the at least one identified factor improves the execution of the multivariate process by an amount exceeding a threshold amount of improvement; identifying a second data set needed to execute the randomized controlled experiment; acquiring the second data set; executing the randomized controlled experiment; generating a recommendation for optimizing the at least one factor; and providing the recommendation,” and these additional elements, individually and in combination, are nothing more than computing elements recited at a high level of generality implementing the abstract idea on a computer (i.e., “apply it”), and thus are no more than applying the abstract idea with generic computer components. Accordingly, contrary to Applicant’s assertions, the judicial exception is not integrated into a practical application under the second prong of Step 2A.
Applicant asserts that “Applicant respectfully submits that the claims amount to "significantly more" than an abstract idea.” Examiner respectfully disagrees.
The MPEP discusses that "the second part of the Alice/Mayo test [(Step 2B)] is often referred to as a search for an inventive concept," and "an 'inventive concept' is furnished by an element or combination of elements that is recited in the claim in addition to (beyond) the judicial exception, and is sufficient to ensure that the claim as a whole amounts to significantly more than the judicial exception itself." MPEP 2106.05 (emphasis added). Further, the MPEP goes on to describe "Step 2B asks: Does the claim recite additional elements that amount to significantly more than the judicial exception? Examiners should answer this question by first identifying whether there are any additional elements (features/limitations/steps) recited in the claim beyond the judicial exception(s), and then evaluating those additional elements individually and in combination to determine whether they contribute an inventive concept (i.e., amount to significantly more than the judicial exception(s)).” MPEP 2106.05 (emphasis added).
The search for an inventive concept under § 101 is distinct from demonstrating novelty and non-obviousness. See SAP America, Inc. v. InvestPic, LLC, No. 2017-2081, slip op. at 2-3 (Fed. Cir. May 15, 2018) (citing Synopsys, Inc. v. Mentor Graphics Corp., 839 F.3d 1138, 1151 (Fed. Cir. 2016); Intellectual Ventures I LLC v. Symantec Corp., 838 F.3d 1307, 1315 (Fed. Cir. 2016)). Even novel and newly discovered judicial exceptions are still exceptions, despite their novelty. July 2015 Update, p. 3; see SAP America at 2. In Step 2B, “[w]hat is needed is an inventive concept in the non-abstract application realm.” SAP America at 11. As discussed in SAP America, no matter how much of an advance the claims recite, when “the advance lies entirely in the realm of abstract ideas, with no plausibly alleged innovation in the non-abstract application realm,” “[a]n advance of that nature is ineligible for patenting.” Id. at 3.
Here, under Step 2B, the only additional elements beyond the recited abstract idea of claim 1 are the recitations that “receiving a first data set associated with a multivariate process; analyzing, the first data set to determine at least one interaction between at least two factors in the multivariate process and at least one correlation associated with the at least one interaction; identifying at least one factor to optimize to improve an execution of the multivariate process, responsive to the analyzing; generating a randomized controlled experiment to execute to determine whether optimizing the at least one identified factor improves the execution of the multivariate process by an amount exceeding a threshold amount of improvement; identifying a second data set needed to execute the randomized controlled experiment; acquiring the second data set; executing the randomized controlled experiment; generating a recommendation for optimizing the at least one factor; and providing the recommendation are carried out by at least one computing device,” and these additional elements, individually and in combination, are nothing more than computing elements recited at a high level of generality implementing the abstract idea on a computer (i.e., “apply it”), and thus are no more than applying the abstract idea with generic computer components. Accordingly, contrary to Applicant’s assertions, the claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception under Step 2B.
Response to Arguments under 35 USC 102:
Applicant asserts that “Givot fails to teach or suggest identifying, by a machine learning engine, one or more factors that could be optimized and for which an experimentation engine is to generate a randomized controlled experiment.” Examiner respectfully disagrees.
Givot discloses identifying, by a machine learning engine, one or more factors that could be optimized and for which an experimentation engine is to generate a randomized controlled experiment (see Givot, paras [0059] & [0061], wherein, as those familiar with artificial intelligence (AI) applications will appreciate, the machine learning tool learns optimized settings that evolve over time and constantly improve the grinding performance…; paras [0139] & [0012], wherein a Design of Experiments (DoE) design (i.e., a DoE test involves not only the selection of suitable independent, dependent, and control variables, but also planning of the delivery of the experiment under statistically optimal conditions given the constraints of available resources) is automatically generated for an input variable space of the industrial process, where an experimental design is processed by the data model and results are stored based on preferred results; the predictive algorithm makes predictions of a predetermined number of best next trials for the DoE and outputs parameter values for the predetermined number of best next trials and probabilities of improved results using the parameter values; para [0128], wherein FIG. 25 is a plot illustrating a preliminary predictive model of the harness data of FIG. 24, as generated using the multivariate statistical analysis tool Simca by Umetrics, that plots the correlations 2500 among the collected parameters; to provide a visualization of the product groupings, 14 settings were defined with a random generator; para [0075], wherein a DoE test involves not only the selection of suitable independent, dependent, and control variables, but also planning of the delivery of the experiment under statistically optimal conditions given the constraints of available resources.
In sample embodiments, such experimental setups are provided to enable the generation of suitable input variables and output variables for generation of a data model of the operation of the industrial process of interest. Typically, the input variables include process variables and the output variables include result variables from the operation of the industrial process, and the resulting data model represents contributions to changes in the output or result variables by the respective input or process variables in accordance with the experimental arrangement; and paras [0020] & [0065]-[0066], wherein the predictive algorithm identifies parameter values for input variables expected to have a most significant impact on selected output variables during performance of the industrial process, makes predictions of a predetermined number of best next trials for the DoE, outputs parameter values for the predetermined number of best next trials and probabilities of improved results using the parameter values, and provides the parameter values as the input variables to the industrial process to optimize the selected output variables).
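For illustration only (not part of the record or of Givot's disclosure): a full factorial DoE design of the kind referenced above, with three levels per factor across five machine parameters, enumerates 3^5 = 243 candidate settings, consistent with the design window Givot describes. All parameter names and level values below are hypothetical.

```python
from itertools import product

# Illustrative sketch only: a 3-level full factorial design over five
# hypothetical machine parameters, each varied at low/mid/high levels.
factors = {
    "wheel_speed":    [0.0, 0.5, 1.0],  # normalized low/mid/high (assumed)
    "feed_rate":      [0.0, 0.5, 1.0],
    "depth_of_cut":   [0.0, 0.5, 1.0],
    "coolant_flow":   [0.0, 0.5, 1.0],
    "dress_interval": [0.0, 0.5, 1.0],
}

# Every combination of levels: 3^5 = 243 machine setting alternatives.
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(design))  # 243
```

Each entry of `design` is one complete machine setting; a model would then score all 243 alternatives to select the best trials.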
Applicant asserts that “Givot fails to teach or suggest identifying, by the experimentation engine, a second data set needed to execute the randomized controlled experiment.” Examiner respectfully disagrees.
Givot discloses identifying, by the experimentation engine, a second data set needed to execute the randomized controlled experiment (see Givot, para [0076], wherein the model development software and model runtime engine 48 performs a pre-treatment process for the centering and scaling of variables, including the Y-space scores vector(s) and the X-space scores vector(s); through calculation of the transfer vector between X-space vectors and Y-space vectors, the predictive model is defined, and the post-treatment process (i.e., reverse centering and scaling of variables) includes running sample data or the DoE data through the model and presenting the variables and results; para [0075], wherein a DoE test involves not only the selection of suitable independent, dependent, and control variables, but also planning of the delivery of the experiment under statistically optimal conditions given the constraints of available resources, and in sample embodiments, such experimental setups are provided to enable the generation of suitable input variables and output variables for generation of a data model of the operation of the industrial process of interest; paras [0012] & [0080], wherein a principal component analysis is run on the updated dataset; and para [0128], wherein FIG. 25 is a plot illustrating a preliminary predictive model of the harness data of FIG. 24, as generated using the multivariate statistical analysis tool Simca by Umetrics, that plots the correlations 2500 among the collected parameters; to provide a visualization of the product groupings, 14 settings were defined with a random generator).
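As an illustrative sketch only (synthetic data, not drawn from the record), the centering, scaling, and principal component analysis steps quoted above from Givot paras [0076] and [0080] could proceed as follows; the dataset and dimensions are assumptions.

```python
import numpy as np

# Illustrative sketch only: pre-treatment (centering and scaling of
# variables) followed by a principal component analysis of the kind
# referenced in Givot. The data here is synthetic.
rng = np.random.default_rng(0)
data = rng.normal(size=(50, 4))           # 50 observations, 4 process variables

centered = data - data.mean(axis=0)       # center each variable
scaled = centered / centered.std(axis=0)  # scale each variable to unit variance

# PCA via singular value decomposition of the pre-treated data matrix.
_, singular_values, components = np.linalg.svd(scaled, full_matrices=False)
explained_variance = singular_values**2 / (len(data) - 1)

# Project observations onto the principal components (score vectors).
scores = scaled @ components.T
print(scores.shape)  # (50, 4)
```

The score vectors play the role of the X-space scores described in the citation; reversing the centering and scaling would correspond to the post-treatment step.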
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-3 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. Specifically, claims 1-3 are directed to an abstract idea without additional elements amounting to significantly more than the abstract idea.
With respect to Step 2A Prong One of the framework, claim 1 recites an abstract idea. Claim 1 includes “receiving a first data set associated with a multivariate process; analyzing, the first data set to determine at least one interaction between at least two factors in the multivariate process and at least one correlation associated with the at least one interaction; identifying at least one factor to optimize to improve an execution of the multivariate process, responsive to the analyzing; generating a randomized controlled experiment to execute to determine whether optimizing the at least one identified factor improves the execution of the multivariate process by an amount exceeding a threshold amount of improvement; identifying a second data set needed to execute the randomized controlled experiment; acquiring the second data set; executing the randomized controlled experiment; generating a recommendation for optimizing the at least one factor; and providing the recommendation”.
The limitations above recite an abstract idea under Step 2A Prong One. More particularly, the elements above recite mental processes, i.e., concepts performed in the human mind (including an observation, evaluation, judgment, or opinion), because the elements describe a process for generating a recommendation for optimizing at least one factor. As a result, claim 1 recites an abstract idea under Step 2A Prong One.
Claims 2-3 further describe the process for generating a recommendation for optimizing at least one factor. As a result, claims 2-3 recite an abstract idea under Step 2A Prong One for the same reasons as stated above with respect to claim 1.
With respect to Step 2A Prong Two of the framework, claim 1 does not include additional elements that integrate the abstract idea into a practical application. Claim 1 includes additional elements that do not recite an abstract idea under Step 2A Prong One. The additional elements of claim 1 include a computing device, an experimentation engine, and a machine learning engine. When considered in view of the claim as a whole, the additional elements do not integrate the abstract idea into a practical application because the additional computing elements are generic computing elements that are merely used as a tool to perform the recited abstract idea. As a result, claim 1 does not include additional elements that integrate the abstract idea into a practical application under Step 2A Prong Two.
Claims 2-3 do not include any additional elements beyond those recited with respect to claim 1. As a result, claims 2-3 do not include additional elements that integrate the abstract idea into a practical application under Step 2A Prong Two for the same reasons as stated above with respect to claim 1.
With respect to Step 2B of the framework, claim 1 does not include additional elements amounting to significantly more than the abstract idea. As noted above, claim 1 includes additional elements that do not recite an abstract idea under Step 2A Prong One. The additional elements of claim 1 include a computing device, an experimentation engine, and a machine learning engine. The additional elements do not amount to significantly more than the abstract idea because the additional computing elements are generic computing elements that are merely used as a tool to perform the recited abstract idea. Further, looking at the additional elements as an ordered combination adds nothing that is not already present when considering the additional elements individually. As a result, independent claim 1 does not include additional elements that amount to significantly more than the abstract idea under Step 2B.
Claims 2-3 do not include any additional elements beyond those recited with respect to claim 1. As a result, claims 2-3 do not include additional elements that amount to significantly more than the abstract idea under Step 2B for the same reasons as stated above with respect to claim 1.
Therefore, the claims are directed to an abstract idea without additional elements amounting to significantly more than the abstract idea. Accordingly, claims 1-3 are rejected under 35 U.S.C. 101 as being directed to non-statutory subject matter.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1 and 3 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Givot et al. (US Pub. No. 2022/0373981), hereinafter Givot.
Regarding claim 1, Givot discloses a method for automated generation and execution of randomized controlled experiments to determine whether to optimize an automatically identified optimizable factor (see Givot, para [0011]), the method comprising:
receiving, by a computing device, a first data set associated with a multivariate process (see Givot, paras [0139] & [0011], wherein a computer performing multivariate analysis of input variables and output variables generated during the operation of an industrial process to generate a data model of the operation of the industrial process);
analyzing, by at least one machine learning engine executed by the computing device, the first data set to determine at least one interaction between at least two factors in the multivariate process and at least one correlation associated with the at least one interaction (see Givot, paras [0139] & [0065]-[0066], wherein statistical tools such as multivariate analysis and a partial least squares (PLS)/orthogonal projections to latent structures (OPLS) analysis may be used by the machine learning tool 24 to extract information from historic grinding data (or Design of Experiment (DoE)-test grindings) to create models describing variable contributions to performance with good predictive ability… a 3-level full factorial was used, although other experimental designs may be used, and in an even more extensive design, more data points than the 3-level full factorial may be used where experiments are also added on the middle of all edges; and para [0053], wherein FIG. 24 is generated using a multivariate statistical analysis tool that plots the correlations among the collected parameters);
identifying, by the at least one machine learning engine, at least one factor to optimize to improve an execution of the multivariate process, responsive to the analyzing (see Givot, paras [0059] & [0061], wherein, as those familiar with artificial intelligence (AI) applications will appreciate, the machine learning tool learns optimized settings that evolve over time and constantly improve the grinding performance… an optimization tool has been developed that supports simultaneous change of multiple variables. This optimization tool provided a significant increase in speed of the optimization process. Such rapid optimization allows movement from the point of operation in the direction of the optimum machine setting defined by the model and calibration dataset, which decreases risk of damaging a product or machine, increases customer value by each step, and provides a safer and more stable work process);
generating, by an experimentation engine executed by the computing device, a randomized controlled experiment to execute to determine whether optimizing the at least one identified factor improves the execution of the multivariate process by an amount exceeding a threshold amount of improvement (see Givot, paras [0139] & [0012], wherein a Design of Experiments (DoE) design is automatically generated for an input variable space of the industrial process, where an experimental design is processed by the data model and results are stored based on preferred results; the predictive algorithm makes predictions of a predetermined number of best next trials for the DoE and outputs parameter values for the predetermined number of best next trials and probabilities of improved results using the parameter values; para [0128], wherein FIG. 25 is a plot illustrating a preliminary predictive model of the harness data of FIG. 24, as generated using the multivariate statistical analysis tool Simca by Umetrics, that plots the correlations 2500 among the collected parameters; to provide a visualization of the product groupings, 14 settings were defined with a random generator; para [0075], wherein a DoE test involves not only the selection of suitable independent, dependent, and control variables, but also planning of the delivery of the experiment under statistically optimal conditions given the constraints of available resources; in sample embodiments, such experimental setups are provided to enable the generation of suitable input variables and output variables for generation of a data model of the operation of the industrial process of interest, and typically, the input variables include process variables and the output variables include result variables from the operation of the industrial process, and the resulting data model represents contributions to changes in the output or result variables by the respective input or process variables in accordance with the experimental arrangement; and paras [0020] & [0065]-[0066], wherein the predictive algorithm identifies parameter values for input variables expected to have a most significant impact on selected output variables during performance of the industrial process, makes predictions of a predetermined number of best next trials for the DoE, outputs parameter values for the predetermined number of best next trials and probabilities of improved results using the parameter values, and provides the parameter values as the input variables to the industrial process to optimize the selected output variables);
identifying, by the experimentation engine, a second data set needed to execute the randomized controlled experiment (see Givot, para [0076], wherein the model development software and model runtime engine 48 performs a pre-treatment process for the centering and scaling of variables, including the Y-space scores vector(s) and the X-space scores vector(s); through calculation of the transfer vector between X-space vectors and Y-space vectors, the predictive model is defined, and the post-treatment process (i.e., reverse centering and scaling of variables) includes running sample data or the DoE data through the model and presenting the variables and results; para [0075], wherein a DoE test involves not only the selection of suitable independent, dependent, and control variables, but also planning of the delivery of the experiment under statistically optimal conditions given the constraints of available resources, and in sample embodiments, such experimental setups are provided to enable the generation of suitable input variables and output variables for generation of a data model of the operation of the industrial process of interest; paras [0012] & [0080], wherein a principal component analysis is run on the updated dataset; and para [0128], wherein FIG. 25 is a plot illustrating a preliminary predictive model of the harness data of FIG. 24, as generated using the multivariate statistical analysis tool Simca by Umetrics, that plots the correlations 2500 among the collected parameters; to provide a visualization of the product groupings, 14 settings were defined with a random generator);
acquiring the second data set (see Givot, paras [0079]-[0080], wherein generic test data is collected (e.g., into a spreadsheet) and data is collected relating to the roll (e.g., roll diameter, roll width) and the wheel (e.g., wheel diameter, wheel width, abrasive grit hardness, hardness number, structure, bond, etc.)… new variables are generated as appropriate, and a principal component analysis is run on the updated dataset);
executing, by the experimentation engine, the randomized controlled experiment (see Givot, paras [0012] & [0128], wherein FIG. 25 is a plot illustrating a preliminary predictive model of the harness data of FIG. 24, as generated using the multivariate statistical analysis tool Simca by Umetrics, that plots the correlations 2500 among the collected parameters; to provide a visualization of the product groupings, 14 settings were defined with a random generator; and para [0075], wherein a DoE test involves not only the selection of suitable independent, dependent, and control variables, but also planning of the delivery of the experiment under statistically optimal conditions given the constraints of available resources; in sample embodiments, such experimental setups are provided to enable the generation of suitable input variables and output variables for generation of a data model of the operation of the industrial process of interest, and typically, the input variables include process variables and the output variables include result variables from the operation of the industrial process, and the resulting data model represents contributions to changes in the output or result variables by the respective input or process variables in accordance with the experimental arrangement);
generating, by the experimentation engine, a recommendation for optimizing the at least one factor (see Givot, paras [0012] & [0013], wherein the method may include recommending a product having parameter values that optimize for the selected output variables in the industrial process; and para [0067], wherein, using a 3-level full factorial DoE on the selected 5 machine parameters, a design window was defined; the model was then used to predict performance for the 243 resulting machine setting alternatives, and the highest 10-20 were then presented to the user to select the setting to use); and
providing, by the computing device, the recommendation (see Givot, paras [0139] & [0078], wherein provide product recommendations and/or to optimize product portfolios by identifying holes and/or overlaps in product lines at 53).
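Purely as an illustrative, non-record sketch of the "14 settings were defined with a random generator" passage cited above from Givot para [0128]: randomized settings of that sort could be drawn as follows. The parameter names, units, and ranges are assumptions for illustration.

```python
import random

# Illustrative sketch only: defining a small number of randomized machine
# settings with a random generator, in the spirit of the passage quoted
# from Givot. Parameter names and ranges are assumed, not from the record.
random.seed(42)

PARAM_RANGES = {
    "wheel_speed": (20.0, 35.0),    # m/s, assumed range
    "feed_rate": (0.5, 2.0),        # mm/rev, assumed range
    "depth_of_cut": (0.01, 0.05),   # mm, assumed range
}

def random_settings(n):
    """Draw n randomized machine settings, one value per parameter."""
    return [
        {name: random.uniform(lo, hi) for name, (lo, hi) in PARAM_RANGES.items()}
        for _ in range(n)
    ]

settings = random_settings(14)
print(len(settings))  # 14
```

Each drawn setting could then be plotted or executed as one arm of the experiment to visualize groupings in the results.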
Regarding claim 3, Givot discloses the method of claim 1 further comprising generating a confidence interval associated with the identified at least one factor to optimize (see Givot, para [0104], wherein values larger than the 95% confidence limit are suspect, and values larger than the 99% confidence limit may be considered as serious; and para [0086], wherein optimization is then performed by defining variables to optimize, typically the machine variables; ranges are defined for each variable, and high/low settings are used where the process is known to be stable and the operator of the machine being optimized is comfortable).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Givot et al. (US Pub. No. 2022/0373981), hereinafter Givot, in view of Benoit et al. (US Pub. No. 2022/0121971), hereinafter Benoit.
Regarding claim 2, Givot discloses the method of claim 1 further comprising identifying a possible interaction between at least two factors in the multivariate process (see Givot, paras [0065]-[0066], wherein statistical tools such as multivariate analysis and a partial least squares (PLS)/orthogonal projections to latent structures (OPLS) analysis may be used by the machine learning tool 24 to extract information from historic grinding data (or Design of Experiment (DoE)-test grindings) to create models describing variable contributions to performance with good predictive ability… a 3-level full factorial was used, although other experimental designs may be used, and in an even more extensive design, more data points than the 3-level full factorial may be used where experiments are also added on the middle of all edges; and para [0053], wherein FIG. 24 is generated using a multivariate statistical analysis tool that plots the correlations among the collected parameters).
Givot fails to explicitly disclose identifying at least one assumption associated with the possible interaction.
Analogous art Benoit discloses identifying at least one assumption associated with a possible interaction between at least two factors in the multivariate process (see Benoit, para [0026], wherein the system may identify and adjust for false inputs (e.g., false assumptions) that would confound, bias and/or mask cause-and-effect knowledge and limit optimization results, as well as monitor and dynamically adapt to changes in causal relationships between process decisions and operational outcomes (e.g., as a result of equipment failure, wear and tear, weather events, etc.); paras [0057]-[0058], wherein the analysis reveals the direction of interactions between the independent variables and the utility function, and thus the extent to which data are representative of the current state of the system for real-time decision support … clustering involves isolating assignments per individual external factor, combination of external factors (e.g., as determined by principal component analysis), or combination of external factor states/values (e.g., as determined by conditional inference trees or other unsupervised classification methods) that act as effect modifiers; and para [0040], wherein processes for the multivariate learning and optimization system 100 are stored for execution by the processor).
Givot is directed to a system for simulating and optimizing industrial and other processes that includes a computer performing multivariate analysis of input variables and output variables. Benoit is directed to a multivariate learning and optimization system that repeatedly generates self-organized experiments. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Givot, regarding the system for simulating and optimizing industrial processes, to have included identifying at least one assumption associated with a possible interaction between at least two factors in the multivariate process, as taught by Benoit, because both inventions teach improving the precision of the probability estimate. Further, the claimed invention is merely a combination of old elements, and in the combination each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HAFIZ A KASSIM whose telephone number is (571) 272-8534. The examiner can normally be reached 9:00 AM - 5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Rutao Wu can be reached on 571-272-6045. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/HAFIZ A KASSIM/Primary Examiner, Art Unit 3623 3/12/2026