Application No. 18/276,967

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 27-52 remain pending and are ready for examination.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 08/11/2023 was filed in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

Claims 27, 29-36, 40, 42-52 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Blevins et al., U.S. Pub. No. US 20130069792 A1 (hereinafter "Blevins").
Regarding claim 27, Blevins teaches A method for training a machine learning module of a computer-implemented prediction model for predicting product quality parameter values for one or more quality parameters of a chemical product produced by a chemical production plant (see paragraph [0003], "chemical process"; paragraph [0015], "quality prediction model creation training data", "process model, such as a neural network"), wherein the production plant comprises a plurality of sensors, which are each configured to acquire, during the operation of the production plant, process parameter values for one or more process parameters of a chemical process carried out by the production plant for producing the chemical product (see paragraphs [0009] and [0015], "measurements of various process parameters"; paragraph [0037], "a sensor or other process measurement devices"), the method comprising: providing training data, wherein the training data for a plurality of product units produced by the production plant comprises product quality parameter values determined for each of one or more quality parameters of the respective product unit as training product quality parameter values, wherein the training product quality parameter values are each assigned a production time of the product unit for which they were determined, wherein the training data further comprises a plurality of process parameter values from each of the sensors as training process parameter values, which were acquired during the production of the product units for which the training product quality parameter values were determined, wherein the training process parameter values are each assigned an acquisition time and an identifier of the acquiring sensor (implied by paragraphs [0074-0075], [0079-0080], and [0104], "shifted in time ... delay ... is different for each of the process parameters being used as inputs to the quality prediction model", and figure 5); providing a priori information about the production plant and the process carried out by the production plant, wherein the a priori information includes chronological sequence information about a chronological sequence of the process carried out within the production plant (see paragraphs [0080], [0082-0083], and [0105], which, for a Kamyr digester, indicate which delays and therefore also which temporal sequences should be taken into account for modelling); determining for each of the sensors a sensor-specific time shift between an acquisition time of one of the training process parameter values acquired by the corresponding sensor and a production time of the product unit, during the production of which the corresponding training process parameter value was acquired, the determination being carried out in each case using the chronological sequence information, with the determined sensor-specific time shifts of the sensors being assigned in each case to the training process parameter values acquired by the respective sensor (see paragraphs [0079-0081], "shifted in time ... delay ... is different for each of the process parameters being used as inputs to the quality prediction model", and figure 5); assigning the training process parameter values to the one or more training product quality parameter values of one of the product units, during the production process of which the respective training process parameter value was acquired (see paragraphs [0084, 0106], "After all of the process parameter and state parameter time shifts have been determined, ... to time shift the process parameter data for each process parameter input to the model and the state parameter"), using the acquisition time of the respective training process parameter value, the sensor-specific time shift of the sensor acquiring the respective training process parameter value, and the production time of the respective product unit (see paragraphs [0084, 0106]); and training the machine learning module using the training process parameter values and training product quality parameter values assigned to each other, wherein the respective training product quality parameter values are used to provide output data and the respective assigned training process parameter values are used to provide input data of the machine learning module for the training (see paragraphs [0079, 0097, 0111], "training data is used to develop one or more quality prediction models", wherein the training data correlated in terms of time are meant here, as described after paragraph [0079]).

Regarding claim 29, Blevins teaches wherein the production times of the product units each concern a completion time of the process carried out by the production plant for producing the corresponding product unit (see paragraphs [0008, 0043], wherein the collected parameters and quality data are used to create a statistical model of the process, with the statistical model representing the "normal" operation of the process that results in desired quality metrics. This statistical model of the process can then be used to analyze how different process parameter measurements made during a particular process implementation statistically relate to the same measurements made within the processes used to develop the model).
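For illustration only, the time-shift alignment described for claim 27 can be sketched as follows. This is not Blevins' actual implementation; the function, field names, and example shift values are all hypothetical, and a simple exact-match rule stands in for whatever matching criterion a real plant would use.

```python
# Hypothetical sketch: pairing time-stamped sensor readings with product
# quality labels via a sensor-specific time shift. A reading is assigned
# to a product unit when its acquisition time plus the sensor's shift
# equals the unit's production (completion) time.

def align_training_data(readings, quality_labels, sensor_shifts):
    """readings:       list of (sensor_id, acquisition_time, value)
    quality_labels: dict mapping production_time -> quality value
    sensor_shifts:  dict mapping sensor_id -> time shift (same units)
    Returns a list of (input_dict, quality) training pairs."""
    training_pairs = []
    for production_time, quality in quality_labels.items():
        inputs = {
            sensor_id: value
            for sensor_id, t_acq, value in readings
            if t_acq + sensor_shifts[sensor_id] == production_time
        }
        training_pairs.append((inputs, quality))
    return training_pairs


readings = [
    ("temp", 0, 350.0),   # temperature acts 10 time units before completion
    ("flow", 5, 1.2),     # flow acts 5 time units before completion
    ("temp", 10, 352.0),
    ("flow", 15, 1.3),
]
quality = {10: 0.95, 20: 0.97}       # production (completion) times
shifts = {"temp": 10, "flow": 5}     # sensor-specific delays

pairs = align_training_data(readings, quality, shifts)
# Each pair holds the time-shift-matched sensor inputs and the quality label.
```

The resulting pairs are exactly the input/output data the claim feeds to the machine learning module during training.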
Regarding claim 30, Blevins teaches wherein the method further comprises cleaning the training process parameter values provided, wherein the cleaning comprises one or more of: removing outlier values from the training process parameter values; removing non-physical values from the training process parameter values; and adding missing training process parameter values, wherein in order to identify missing training process parameters the training data is checked for completeness using a priori completeness information, which defines from which sensors of the production plant and for which process parameters the training data should include training process parameter values (see paragraph [0095], wherein the filtering process makes the process model being generated more robust and helps to prevent the model from detecting quality problems when the process is simply moving between states).

Regarding claim 31, Blevins teaches wherein the method further comprises aggregating the training process parameter values for one or more sensors acquired by the respective sensor, wherein the corresponding training process parameter values are assigned to an aggregation time window using the respectively assigned acquisition times, with process parameter values associated to a common aggregation time window each being aggregated (see paragraph [0108], which develops a set of process parameter and quality parameter means for each defined process state. The process parameter means for each particular process state will include, for each process parameter, the mean of the values of that process parameter from all of the time slices for which the state parameter value of the time slices falls within the particular process state. Likewise, the quality parameter mean for each particular process state will include, for the quality parameter, the mean of the values of the quality parameter for all of the time slices for which the state parameter value of the time slice falls within the particular process state. Similarly, the state parameter mean for each particular process state will include, for the state parameter, the mean of the values of the state parameter in all of the time slices for which the state parameter value of the time slice falls within the particular process state).

Regarding claim 32, Blevins teaches wherein the sensor-specific time shifts of the sensors, the training process parameter values of which are aggregated, are determined for each of the aggregation windows and are assigned to the aggregated training process parameter values of the respective aggregation window (see paragraph [0106]).

Regarding claim 33, Blevins teaches wherein the provision of input data further comprises extracting statistical feature values and/or frequency feature values from the training process parameter values for training the machine learning module (see paragraphs [0012, 0094]).

Regarding claim 34, Blevins teaches wherein the provision of input data further comprises scaling the extracted feature values for training the machine learning module (see paragraphs [0089, 0145]).

Regarding claim 35, Blevins teaches wherein the provision of output data comprises scaling the training product quality parameter values (see paragraphs [0089, 0145]).

Regarding claim 36, Blevins teaches wherein the provision of input data further comprises reducing the dimensionality of extracted feature values using a transformation of the extracted feature values (see paragraph [0009]).

Regarding claim 40, Blevins teaches wherein the machine learning module comprises an artificial neural network (see paragraph [0131]).
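The aggregation recited in claims 31-32 can be sketched as below. This is a minimal illustration under assumptions, not the method of either reference: readings from one sensor are binned into fixed-width aggregation time windows by acquisition time, and each window's values are reduced to a mean. The function name and window size are hypothetical.

```python
from collections import defaultdict

# Hypothetical sketch: bin one sensor's (acquisition_time, value) readings
# into fixed-width aggregation windows and aggregate each window by mean.

def aggregate_by_window(readings, window_size):
    """readings: list of (acquisition_time, value) for one sensor.
    Returns {window_index: mean of the values falling in that window}."""
    buckets = defaultdict(list)
    for t_acq, value in readings:
        buckets[t_acq // window_size].append(value)
    return {w: sum(vals) / len(vals) for w, vals in buckets.items()}


readings = [(0, 10.0), (3, 12.0), (7, 20.0), (9, 22.0)]
aggregated = aggregate_by_window(readings, window_size=5)
# Windows of width 5: times 0 and 3 fall in window 0, times 7 and 9 in window 1.
```

Per claim 32, a sensor-specific time shift would then be determined per window and attached to each aggregated value rather than to the raw readings.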
Regarding claim 42, Blevins teaches A method for predicting product quality parameter values for one or more quality parameters of a chemical product produced by a chemical production plant using a computer-implemented prediction model having a machine learning module trained according to one of the preceding claims, wherein the production plant comprises a plurality of sensors, which are each configured to acquire process parameter values for one or more process parameters of a chemical process for producing the chemical product carried out by the production plant in the operation of the production plant (see paragraphs [0003, 0066]), the method comprising: providing a plurality of process parameter values acquired by the sensors in the operation of the production plant, wherein the process parameter values are each assigned an acquisition time and an identifier of the acquiring sensor (see paragraph [0117], which collects and stores process parameter and state parameter data from the process; quality parameter data and fault data may also be collected, but this action is not strictly necessary for performing data analytics of the process using the models created by the technique of FIG. 3. The collected process parameter and state parameter data could be obtained on-line from the process in real time, via lab or off-line analyses, via user input, or in any other desired manner); providing a priori information about the production plant and the process carried out by the production plant, wherein the a priori information includes chronological sequence information about a chronological sequence of the process carried out within the production plant (see paragraph [0080], wherein, during the development of quality prediction models (e.g., the NN, MLR, and PLS models) for each of the quality parameters to be predicted, the deviation values used to produce the model are shifted in time to account for the time required for a change in an input to a process model (i.e., a change in one of the process parameter values) to impact the quality parameter being predicted by the model. This delay may be, and usually is, different for each of the process parameters being used as inputs to the quality prediction model; the same delays are taken into account in the processing of the deviation values used for on-line quality parameter prediction); determining for each of the sensors a sensor-specific time shift between the acquisition times of the process parameter values acquired by the corresponding sensor and a production time of the product unit, for which product quality parameter values are to be predicted and during the production of which the respective process parameter values were acquired, the determination being carried out in each case using the chronological sequence information, with the determined sensor-specific time shifts of the sensors being assigned in each case to the process parameter values acquired by the respective sensor (see paragraphs [0079-0082, 0118]); assigning to each other the process parameter values that were acquired during the production of the same product unit, wherein the process parameter values to be aggregated are determined using the sensor-specific time shift; using each of the assigned process parameter values to provide input data of the trained machine learning module to predict one or more product quality parameter values (see paragraphs [0084, 0119-0123]); and receiving one or more product quality parameter values predicted using the trained machine learning module for the product unit, during the production of which the process parameter values used to provide the input data were acquired, as an output of the prediction model (see paragraph [0122]).

Regarding claim 43, Blevins teaches further comprising an additional training of the trained machine learning module, the additional training comprising: providing product quality parameter values, which were determined using one or more of the product units produced by the production plant for which product quality parameter values have been predicted, as additional training product quality parameter values (see paragraphs [0143-0145]); assigning the provided process parameter values that were acquired during production of the respective product units, for which the additional training product quality parameter values are provided, as additional training process parameter values to the additional training product quality parameter values, wherein the process parameter values that were acquired during the production of the respective product units are determined using the acquisition time of the respective process parameter values, the sensor-specific time shifts of the sensors acquiring the respective process parameter values, and the production times of the respective product units produced (see paragraphs [0124, 0145-0146]); and additionally training the machine learning module using the additional training product quality parameter values and the assigned additional training process parameter values, wherein the respective additional training product quality parameter values are used to provide additional output data and the respectively assigned additional training process parameter values are used to provide additional input data of the machine learning module for the additional training (see paragraphs [0141, 0143, 0145-0146]).

Regarding claim 44, Blevins teaches wherein the method further comprises detecting anomalies in the predicted product quality parameter values (see paragraphs [0047, 0050]), the detection of the anomalies comprising: comparing the product quality parameter values determined using the product units produced by the production plant with the product quality parameter values predicted for the respective product unit (see paragraphs [0131, 0134]); identifying the predicted product quality parameter values as anomalies if a deviation between the predicted and determined product quality parameter values for the same product unit meets a predefined criterion (see paragraphs [0066, 0138, 0152]); and outputting an anomaly alert if one or more of the predicted product quality parameter values are identified as anomalies (see paragraphs [0060, 0138, 0157]).

Regarding claim 45, Blevins teaches wherein the predefined criterion comprises exceeding a predefined first threshold value (see paragraphs [0137, 0155]).

Regarding claim 46, Blevins teaches wherein satisfying the predefined criterion comprises a confidence level of the deviation falling below a predefined second threshold value (see paragraphs [0137, 0155]).

Regarding claim 47, Blevins teaches wherein a precondition for initiating an additional training of the trained machine learning module comprises identifying one or more of the predicted product quality parameter values as an anomaly (see paragraph [0146]).

Regarding claim 48, Blevins teaches wherein a precondition for initiating an additional training of the trained machine learning module comprises accumulating additional training product quality parameter values and additional training process parameter values for a predefined number of product units produced (see paragraph [0146]).
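The anomaly check recited in claims 44-45 reduces to a simple comparison, sketched below for illustration. The names and the threshold value are hypothetical assumptions, not taken from either reference: a predicted quality value is flagged when its deviation from the measured value for the same product unit exceeds a predefined first threshold.

```python
# Hypothetical sketch of the claims 44-45 anomaly criterion: flag a
# product unit when |predicted - measured| exceeds a predefined threshold.

def detect_anomalies(predicted, measured, threshold):
    """predicted/measured: dicts mapping product_unit_id -> quality value.
    Returns the unit ids whose deviation exceeds the threshold."""
    anomalies = []
    for unit_id, pred in predicted.items():
        deviation = abs(pred - measured[unit_id])
        if deviation > threshold:   # predefined first threshold (claim 45)
            anomalies.append(unit_id)
    return anomalies


predicted = {"unit_1": 0.95, "unit_2": 0.80, "unit_3": 0.97}
measured  = {"unit_1": 0.94, "unit_2": 0.96, "unit_3": 0.96}
alerts = detect_anomalies(predicted, measured, threshold=0.05)
# Only unit_2 deviates by more than 0.05, so only it would trigger an alert.
```

Under claims 47-48, a non-empty alert list (or enough accumulated labeled units) would then serve as the precondition for additional training.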
Regarding claim 49, Blevins teaches selecting from the process parameters, for which the sensors of the production plant acquire process parameter values, a group of controllable process parameters which can be controlled by a central control system of the production plant; and identifying a subgroup of the controllable process parameters, the variation of which most strongly affects the product quality parameter values predicted using the trained machine learning module, wherein the identification comprises varying different subgroups of the controllable process parameters and comparing the resulting predicted product quality parameter values.

Regarding claim 50, Blevins teaches receiving a set of target product quality parameter values for one or more product quality parameters of the product to be produced by the production plant (see paragraphs [0052-0053, 0061]); determining process parameter values for the controllable process parameters of the subgroup for which a total deviation between the product quality parameter values predicted using the trained machine learning module and the received target product quality parameters falls below a predefined third threshold value, wherein the determination comprises a variation of the controllable process parameters of the subgroup using a non-linear minimization procedure (see paragraphs [0052-0053, 0061]); and outputting the determined process parameter values as a recommendation for adjusting the controllable process parameters using the control system for producing product units of the product to be produced which exhibit the target product quality parameter values (see paragraphs [0052-0053, 0061]).

Claim 51 is rejected under the same rationale as claim 27. Claim 52 is rejected under the same rationale as claim 42.

Claim Rejections - 35 USC § 103

This application currently names joint inventors.
In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim 41 is rejected under 35 U.S.C. 103 as being unpatentable over Blevins et al., U.S. Pub. No. US 20130069792 A1 (hereinafter "Blevins") in view of Ilani et al., U.S.
Pub. No. US 20210096518 A1 (hereinafter "Ilani").

Regarding claim 41, Blevins teaches a system used in chemical, petroleum or other processes (see paragraph [0003]), but fails to explicitly disclose the limitation below. Ilani discloses wherein the production plant is a polymer production plant for producing a polymer product (see paragraphs [0042, 0047]). It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the system of Blevins to include the missing limitation, as taught by Ilani, since doing so would allow the system to handle non-linear system dynamics by applying localized linearization (Ilani; paragraphs [0004, 0043-0047]).

Allowable Subject Matter

Claims 28 and 37-39 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MAHER N ALGIBHAH, whose telephone number is (571) 272-0718. The examiner can normally be reached on Monday-Thursday. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Aleksandr Kerzhner, can be reached on (571) 270-1760. The fax phone number for the organization where this application or proceeding is assigned is 571-273-1264. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR.
Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MAHER N ALGIBHAH/
Primary Examiner, Art Unit 2165