Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
Acknowledgment is made of the information disclosure statements filed on 12/08/2021; the U.S. patents and foreign patent documents cited therein have been considered.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1, 3-5, 7, and 8 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ypma et al. (United States Patent Application Publication US20180239851), hereinafter referred to as Ypma.
In regards to claim 1, (Ypma) shows a method comprising:
training a machine learning model with data input comprising one or more sets of historical recipe parameters associated with producing one or more substrates with substrate processing equipment and target data comprising historical performance data of the one or more substrates to generate a trained machine learning model; Ypma [0039] teaches using accumulated observations comprising sets of stack parameters xi as data inputs and corresponding empirical performance fitness values f(xi) as target data to transform the prior surrogate function into a posterior trained model through Bayes' rule of inference, directly mapping to training a machine learning model with historical recipe parameters and historical performance data. Ypma [0043] teaches that the surrogate function is trained with simulations across the parameter space to generate a trained surrogate model.
identifying one or more sets of additional recipe parameters associated with a level of uncertainty of the trained machine learning model; Ypma [0043] teaches selecting a candidate model based on where in the parameter space the approximation indicates it is likely to be fruitful to search according to both the uncertainty and the approximated result, directly mapping to identifying additional recipe parameters associated with a level of model uncertainty. Ypma [0065] teaches that the acquisition function assigns a respective score to each candidate point based on both fitness and uncertainty, and selects the highest-scoring point in the parameter space as the next candidate parameter set.
further training the machine learning model with additional data input comprising the one or more sets of additional recipe parameters and additional target data comprising additional performance data of one or more additional substrates produced based on the one or more sets of additional recipe parameters to update the trained machine learning model; Ypma [0043] teaches step 4 of the iterative calibration loop as updating the surrogate function based on the results of the simulation of each new candidate model, directly mapping to further training with additional recipe parameters and additional performance data to update the trained model. Ypma [0064] teaches that the measures of central tendency of the surrogate function are adjusted to match new simulation results and measures of uncertainty are reduced at areas where additional simulations are run, directly teaching updating the trained machine learning model with additional data.
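For clarity of record, the iterative train/identify/retrain loop mapped above from Ypma [0039], [0043], and [0064] may be sketched as follows. This is an illustrative sketch only, not part of Ypma's disclosure: all names and values are hypothetical, and a trivial distance-based surrogate stands in for Ypma's Gaussian Process, with distance to the nearest observation as the uncertainty proxy.

```python
def fitness(params):
    # Hypothetical stand-in for empirical substrate performance f(x).
    return -(params - 0.7) ** 2

class Surrogate:
    """Toy stand-in for the Gaussian Process surrogate of Ypma [0038]."""
    def __init__(self):
        self.observations = []          # (recipe_parameter, performance) pairs

    def train(self, param, perf):
        # "Training": accumulate observations, as in Bayes-rule updating.
        self.observations.append((param, perf))

    def predict(self, param):
        # Mean prediction: performance of the nearest observation.
        nearest = min(self.observations, key=lambda o: abs(o[0] - param))
        # Uncertainty proxy: distance to the nearest observation.
        return nearest[1], abs(nearest[0] - param)

# Step 1: train on historical recipe parameters and performance data.
model = Surrogate()
for x in (0.0, 1.0):
    model.train(x, fitness(x))

# Steps 2-4: identify the highest-uncertainty parameter set, produce and
# measure it, and retrain the surrogate with the additional data.
candidates = [i / 10 for i in range(11)]
for _ in range(3):
    x_next = max(candidates, key=lambda x: model.predict(x)[1])
    model.train(x_next, fitness(x_next))    # additional performance data
```

Each pass through the loop adds one observation where the surrogate is least certain, so uncertainty at the newly trained point drops to zero, mirroring Ypma [0064].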
In regards to claim 3, (Ypma) shows the method of claim 1:
wherein the identifying of the one or more sets of additional recipe parameters is based on local uncertainty reduction associated with the one or more sets of additional recipe parameters and global uncertainty reduction associated with the trained machine learning model; Ypma [0065] and [0066] teach that the acquisition function balances exploiting locally promising parameter regions with low current uncertainty against globally exploring regions of parameter space where little is known, directly implementing both local uncertainty reduction associated with the additional recipe parameters and global uncertainty reduction associated with the trained machine learning model as dual objectives of the parameter identification step.
In regards to claim 4, (Ypma) shows the method of claim 1:
wherein the historical performance data comprises one or more of thickness values, critical dimension (CD) values, shape parameter values, material property values, metrology measurement values, or sensor measurement values of one or more layers of the one or more substrates; Ypma [0044] and [0072] explicitly enumerate critical dimension values, film thickness values, etch depth values, and sidewall profile shape parameters as examples of the substrate measurement data; specifically, [0044] lists one or more critical dimensions, film thicknesses, etch depths, and sidewall profiles as stack model parameters, and [0072] enumerates critical dimension, etch depth, film thickness, sidewall angle, curvature, and surface roughness as model parameters collected from metrology tools, mapping directly to the thickness values, CD values, shape parameter values, and metrology measurement values recited in the claim.
In regards to claim 5, (Ypma) shows the method of claim 1:
wherein the level of uncertainty of the trained machine learning model is associated with target performance data to be obtained using the one or more sets of additional recipe parameters; Ypma [0042] and [0063] teach that the Gaussian Process surrogate outputs both a predicted performance value and a measure of uncertainty about that predicted performance value at each candidate parameter location, directly associating the model uncertainty with the target performance data to be obtained using those additional recipe parameters.
In regards to claim 7, (Ypma) shows the method of claim 1:
wherein the trained machine learning model uses one or more of Gaussian Process Regression (GPR), Gaussian Process Classification, Bayesian Linear Regression, Probabilistic Learning, Bayesian Neural Networks, or Neural Network Gaussian Processes; Ypma [0038] explicitly teaches using a Gaussian Process as the surrogate model, describing it as a distribution over functions specified by its mean function and covariance function. Ypma [0063] further teaches that the surrogate function is a probabilistic process, such as a Gaussian process, which yields at each evaluated point a statistical distribution, which is a Gaussian Process Regression (GPR) method expressly recited in the claim.
In regards to claim 8, (Ypma) shows the method of claim 1:
wherein the identifying of the one or more sets of additional recipe parameters comprises using one or more of: a space filling design (SFD); quantification and metrics of experiment design space coverage; grid expansion; numerical optimization; or Bayesian optimization; Ypma [0065] and [0067] teach identifying additional parameter sets by evaluating an acquisition function over the parameter space using a brute force grid search of candidate points and selecting the highest-scoring point, which constitutes grid expansion and numerical optimization over the design space, and further teaches Bayesian optimization as the overall framework for selecting additional parameters.
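The brute-force grid evaluation of an acquisition function described in Ypma [0065] and [0067] may be sketched as follows. All grid points and (mean, uncertainty) values are hypothetical, and an upper-confidence-bound score stands in for the acquisition function; Ypma notes a variety of acquisition functions may be used.

```python
def acquisition(mean, uncertainty, kappa=2.0):
    # Score each candidate by weighting predicted fitness against uncertainty.
    return mean + kappa * uncertainty

# Hypothetical (mean prediction, uncertainty) at each grid point of the
# recipe-parameter space.
grid = {
    0.1: (0.40, 0.05),
    0.3: (0.55, 0.10),
    0.5: (0.50, 0.30),   # little explored: high uncertainty
    0.7: (0.60, 0.02),
}

# Select the highest-scoring point as the next candidate parameter set.
best = max(grid, key=lambda x: acquisition(*grid[x]))
```

Because the score weights uncertainty as well as predicted fitness, the under-explored point at 0.5 outscores the point at 0.7 with the higher mean, illustrating the exploit/explore balance of Ypma [0065]-[0066].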
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains.
Claims 2, 9, and 10 are rejected under 35 U.S.C. 103 as being unpatentable over US20180239851 (Ypma) as applied to claim 1 above, and further in view of US20040148049 (Schwarm).
In regards to claim 2, (Ypma) does not show: wherein the one or more sets of historical recipe parameters are associated with processes of a recipe used by the substrate processing equipment to produce the one or more substrates; and the one or more sets of additional recipe parameters are associated with updated processes of an updated recipe used by the substrate processing equipment to produce the one or more additional substrates.
Schwarm teaches the one or more sets of historical recipe parameters are associated with processes of a recipe used by the substrate processing equipment to produce the one or more substrates; Schwarm [0034] teaches that the factors in a designed experiment plan are recipe parameters of the semiconductor processing tool used to process substrates, establishing that recipe parameters are directly associated with the processes of a recipe used by the substrate processing equipment to produce substrates.
Schwarm teaches the one or more sets of additional recipe parameters are associated with updated processes of an updated recipe used by the substrate processing equipment to produce the one or more additional substrates; Schwarm [0029] teaches that the integrated controller automatically generates one or more updated process recipes comprising updated recipe parameters to be used by the substrate processing equipment to produce the one or more additional substrates.
The motivation to combine Ypma and Schwarm at the effective filing date of the invention is that both references are directed to model-based optimization of semiconductor process parameters using experimental data collected from physical substrate runs. Ypma provides the Bayesian Gaussian Process uncertainty reduction framework while Schwarm provides the semiconductor manufacturing context where recipe parameters are the structured inputs optimized through physical wafer-by-wafer DOE execution. A person of ordinary skill in the art would combine Ypma's uncertainty-driven parameter selection with Schwarm's physical recipe execution loop to produce an adaptive recipe optimization system with reasonable expectation of success.
In regards to claim 9, (Ypma) does not show: causing the one or more additional substrates to be produced by the substrate processing equipment based on the one or more sets of additional recipe parameters; and receiving the additional performance data of the one or more additional substrates produced based on the one or more sets of additional recipe parameters.
Schwarm teaches causing the one or more additional substrates to be produced by the substrate processing equipment based on the one or more sets of additional recipe parameters; Schwarm [0047] teaches that the designed experiment plan is automatically executed on the semiconductor processing tool with experiments run on a wafer-by-wafer basis using the identified recipe parameters, causing physical substrates to be produced by the substrate processing equipment based on the one or more sets of additional recipe parameters.
Schwarm teaches receiving the additional performance data of the one or more additional substrates produced based on the one or more sets of additional recipe parameters; Schwarm [0063] teaches that when the trained model is inadequate, a new set of experiments is automatically run on the tool to physically produce additional substrates and the resulting performance data of those additional substrates is collected and received to augment the training dataset.
The motivation to combine Ypma and Schwarm at the effective filing date of the invention is the same as that set forth above in regards to claim 2.
In regards to claim 10, (Ypma) does not show: wherein the data input comprises one or more historical recipes comprising the one or more sets of historical recipe parameters; and the additional data input comprises one or more additional recipes comprising the one or more sets of additional recipe parameters.
Schwarm teaches the data input comprises one or more historical recipes comprising the one or more sets of historical recipe parameters; Schwarm [0034] and [0047] teach that the recipe parameters of the semiconductor processing tool are the factors organized into the designed experiment plan, and the historical DOE execution data organized by recipe parameter sets forms the model training data input, constituting one or more historical recipes comprising the one or more sets of historical recipe parameters as data input.
Schwarm teaches the additional data input comprises one or more additional recipes comprising the one or more sets of additional recipe parameters; Schwarm [0063] teaches that when the model requires improvement, additional recipe runs comprising additional recipe parameters are executed on the substrate processing equipment to produce additional substrate data, constituting one or more additional recipes comprising the one or more sets of additional recipe parameters as additional data input.
The motivation to combine Ypma and Schwarm at the effective filing date of the invention is the same as that set forth above in regards to claim 2.
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over US20180239851 (Ypma) as applied to claim 1 above, and further in view of US20050289487 (Li).
In regards to claim 6, (Ypma) does not show: the trained machine learning model being capable of generating, based on output of target performance data, one or more inputs indicative of predictive recipe parameters to be used by the substrate processing equipment to produce a plurality of substrates having the target performance data, wherein the predictive recipe parameters are to be used for recipe optimization.
Li teaches the trained machine learning model being capable of generating, based on output of target performance data; Li [0035] teaches that the trained spatial dynamic model receives desired target output values as input and generates predicted process parameter values as output, directly teaching a model capable of generating outputs based on target performance data.
Li teaches one or more inputs indicative of predictive recipe parameters to be used by the substrate processing equipment to produce a plurality of substrates having the target performance data; Li [0035] and [0071] teach that the model outputs optimized TEB and TEPO gas flow rate values constituting process recipe parameters that, when used by the substrate processing equipment, cause the produced substrates to have the target sidewall profile performance data.
Li teaches wherein the predictive recipe parameters are to be used for recipe optimization; Li [0071] teaches that the validated model is used to optimize process recipes by adjusting input values until the output values predicted by the model match the desired output values, directly teaching that the predictive recipe parameters are to be used for recipe optimization.
The motivation to combine Ypma and Li at the effective filing date of the invention is that both references are directed to model-based optimization of semiconductor process recipe parameters from experimental substrate data. Ypma provides the Bayesian Gaussian Process framework for probabilistic predictions over recipe parameter space but does not teach inverse prediction from target performance data to recipe parameters. Li fills this gap by teaching a trained model that derives recipe input parameters directly from a desired target substrate performance output. A person of ordinary skill in the art would incorporate Li's inverse prediction capability into Ypma's optimization framework with reasonable expectation of success.
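The inverse-prediction step mapped above from Li [0035] and [0071], adjusting recipe input values until the model's predicted output matches the desired target, may be sketched as follows. This is an illustrative sketch only: the linear forward model and all names are hypothetical, and simple bisection stands in for whatever adjustment procedure Li employs.

```python
def forward_model(flow_rate):
    # Hypothetical trained model: predicts a profile metric from a gas flow
    # rate (monotonically increasing for this sketch).
    return 0.8 * flow_rate + 0.1

def optimize_recipe(target, lo=0.0, hi=10.0, tol=1e-6):
    # Bisect on the recipe parameter until the predicted output matches the
    # desired target performance data.
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if forward_model(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Predictive recipe parameter derived from the target performance value.
flow = optimize_recipe(target=2.5)
```

The returned flow rate is the recipe input for which the model's prediction matches the target, i.e., an input derived from a desired output.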
Claims 11-13, 15, and 17-19 are rejected under 35 U.S.C. 103 as being unpatentable over US20180239851 (Ypma) in view of US20050289487 (Li).
In regards to claim 11, (Ypma) shows a method comprising:
providing the target performance data to a trained machine learning model that uses one or more of Gaussian Process Regression (GPR), Bayesian linear regression, Probabilistic Learning, Bayesian Neural Networks, or Neural Network Gaussian Processes; Ypma [0038] teaches using a Gaussian Process as the surrogate model, describing it as a distribution over functions that generates probabilistic predictions with associated uncertainty estimates, and the target performance data is provided to this GPR model to drive the parameter optimization. Ypma [0063] confirms that the surrogate function is a probabilistic process, such as a Gaussian process, which yields for each evaluated point both a measure of central tendency and a measure of uncertainty.
Ypma differs from the claimed invention in that it does not explicitly disclose identifying target performance data of a substrate to be produced by substrate processing equipment; obtaining, from the trained machine learning model, predictive data indicative of predictive recipe parameters to be used by the substrate processing equipment to produce one or more substrates having the target performance data;
Li teaches identifying target performance data of a substrate to be produced by substrate processing equipment; Li [0035] teaches identifying a desired target output profile for a substrate to be produced by the etch processing equipment, specifically defining the desired cylindrical sidewall profile as the target performance data for the substrate to be produced.
Li teaches obtaining, from the trained machine learning model, predictive data indicative of predictive recipe parameters to be used by the substrate processing equipment to produce one or more substrates having the target performance data; Li [0035] and [0071] teach that the trained model outputs optimized TEB and TEPO gas flow parameter values constituting predictive recipe parameters that, when used by the substrate processing equipment, cause the produced substrates to have the target sidewall profile performance data.
The motivation to combine Ypma and Li at the effective filing date of the invention is the same as that set forth above in regards to claim 6.
In regards to claim 12, (Ypma) does not show: wherein the predictive data is indicative of predictive recipes comprising the predictive recipe parameters;
Li teaches wherein the predictive data is indicative of predictive recipes comprising the predictive recipe parameters; Li [0035] and [0071] teach that the optimized TEB and TEPO gas flow parameter values output by the model are organized as process recipe inputs constituting predictive recipes comprising the predictive recipe parameters to be used by the substrate processing equipment.
The motivation to combine Ypma and Li at the effective filing date of the invention is the same as that set forth above in regards to claim 6.
In regards to claim 13, (Ypma) shows the method of claim 11:
trained based on one or more sets of historical recipe parameters and historical performance data; Ypma [0039] teaches using accumulated observations comprising sets of stack parameters xi as data inputs and corresponding empirical performance fitness values f(xi) as target data to transform the prior surrogate function into a posterior trained model through Bayes' rule of inference, directly mapping to training with historical recipe parameters and historical performance data. Ypma [0043] teaches that the surrogate function is trained with simulations across the parameter space to generate a trained surrogate model.
further trained based on one or more sets of additional recipe parameters identified based on model uncertainty and additional performance data of one or more additional substrates produced based on the one or more sets of additional recipe parameters; Ypma [0043] teaches selecting each new candidate model based on where in the parameter space both the uncertainty and the approximated result indicate it is likely to be fruitful to search, directly mapping to identifying additional recipe parameters based on model uncertainty. Ypma [0065] teaches that the acquisition function assigns scores based on both fitness and uncertainty and selects the highest-scoring point as the next candidate parameter set. Ypma [0043] and [0064] teach that the surrogate function is iteratively updated after each new simulation by adjusting measures of central tendency to match new results and reducing uncertainty at areas where additional simulations are run, directly mapping to further training with additional performance data to update the trained model.
In regards to claim 15, (Ypma) shows the method of claim 11:
wherein the target performance data comprises one or more of thickness values, critical dimension (CD) values, shape parameter values, shape description values, material property values, metrology measurement values, or sensor measurement values of one or more layers of the substrate; Ypma [0044] and [0072] explicitly enumerate critical dimension values, film thickness values, etch depth values, and sidewall profile shape parameters as examples of the substrate measurement data; specifically, [0044] lists one or more critical dimensions, film thicknesses, etch depths, and sidewall profiles as stack model parameters, and [0072] enumerates critical dimension, etch depth, film thickness, sidewall angle, curvature, and surface roughness as model parameters collected from metrology tools, mapping directly to the thickness values, CD values, shape parameter values, and metrology measurement values recited in the claim.
wherein the method further comprises obtaining, from the trained machine learning model, uncertainty distributions over parameter space, the parameter space comprising the predictive recipe parameters; Ypma [0042] and [0063] teach that the Gaussian Process surrogate outputs uncertainty distributions over the full parameter space, with the parameter space comprising the recipe parameters, directly teaching obtaining uncertainty distributions over a parameter space comprising the predictive recipe parameters.
In regards to claim 17, (Ypma) shows the method of claim 11:
wherein the obtaining of the predictive data indicative of predictive recipe parameters comprises using, based on the trained machine learning model, maximum a posteriori probability (MAP) optimization to determine optimal predictive recipe parameters associated with producing the one or more substrates having the target performance data; Ypma [0065] and [0067] teach using Bayesian optimization with an acquisition function to determine optimal recipe parameters, and maximum a posteriori probability (MAP) optimization is a standard Bayesian technique that a person of ordinary skill in the art would recognize as directly encompassed by Ypma's Bayesian optimization framework for determining optimal recipe parameters associated with producing substrates having the target performance data.
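The maximum a posteriori (MAP) selection recited in claim 17 may be sketched over a discrete recipe-parameter grid as follows. This is an illustrative sketch only: the prior and likelihood values are hypothetical, and MAP optimization is rendered simply as the argmax of prior times likelihood, which is unaffected by normalization.

```python
# Hypothetical prior belief over candidate recipe parameters.
prior = {0.2: 0.2, 0.4: 0.5, 0.6: 0.3}

# Hypothetical likelihood of each parameter producing the target performance.
likelihood = {0.2: 0.9, 0.4: 0.3, 0.6: 0.8}

# Unnormalized posterior; normalization does not change the argmax.
posterior = {x: prior[x] * likelihood[x] for x in prior}

# MAP estimate: the recipe parameter maximizing the posterior.
map_estimate = max(posterior, key=posterior.get)
```

Note that the parameter with the largest prior (0.4) is not selected; the MAP estimate weighs the prior against the fit to the target performance data.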
In regards to claim 18, (Ypma) shows a system comprising a memory; and a processing device coupled to the memory, the processing device to:
train a machine learning model with data input comprising one or more sets of historical recipe parameters associated with producing one or more substrates with substrate processing equipment and target data comprising historical performance data of the one or more substrates to generate a trained machine learning model; Ypma [0099] through [0106] teach a computer system comprising a processor and main memory storing instructions that execute the Gaussian Process surrogate model training step using sets of process stack parameters as inputs and empirical substrate performance measurements as training target data.
identify one or more sets of additional recipe parameters associated with a level of uncertainty of the trained machine learning model; and Ypma [0099] through [0106] teach that the processor executes instructions to apply the acquisition function to identify candidate parameter sets associated with regions of high model uncertainty.
further train the machine learning model with additional data input comprising the one or more sets of additional recipe parameters and additional target data comprising additional performance data of one or more additional substrates produced based on the one or more sets of additional recipe parameters to update the trained machine learning model; Ypma [0099] through [0106] teach that the processor executes instructions to iteratively retrain the surrogate model with the additional parameter sets and resulting performance measurements to update the trained machine learning model.
In regards to claim 19, (Ypma) shows the system of claim 18:
wherein the level of uncertainty is evaluated over an acquisition function of the trained machine learning model; Ypma [0065] and [0067] teach that uncertainty is evaluated by computing an acquisition function over the parameter space, with the acquisition function scoring each candidate parameter point by weighting the model uncertainty and the predicted performance value, and teach that a variety of different acquisition functions may be used to evaluate and rank candidate parameter points by uncertainty.
Claims 14, 16, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over US20180239851 (Ypma) in view of US20050289487 (Li) as applied to claim 11 above, respectively, and further in view of US20040148049 (Schwarm).
In regards to claim 14, (Ypma modified by Li) does not show: wherein the predictive recipe parameters are associated with processes of a recipe to be used by the substrate processing equipment to produce the one or more substrates.
Schwarm teaches the predictive recipe parameters are associated with processes of a recipe to be used by the substrate processing equipment to produce the one or more substrates; Schwarm [0034] teaches that recipe parameters are the factors associated with the processes of a recipe used by the substrate processing equipment to produce substrates.
The motivation to combine Ypma and Li at the effective filing date of the invention is the same as that set forth above in regards to claim 6.
The motivation to combine Ypma, Li, and Schwarm at the effective filing date of the invention is that all three references address model-based optimization of semiconductor process recipe parameters from experimental substrate data. Ypma and Schwarm together provide the adaptive Bayesian ML training loop and physical wafer execution, while Li provides the inverse prediction capability of deriving recipe input parameters directly from a desired target performance output. A person of ordinary skill in the art would incorporate Li's inverse prediction approach into the Ypma-Schwarm framework to enable direct recipe generation from target substrate performance specifications with reasonable expectation of success.
In regards to claim 16, (Ypma) does not show: receiving a recipe to produce the one or more substrates having the target performance data; responsive to obtaining the predictive data indicative of the predictive recipe parameters, optimizing the recipe based on the predictive recipe parameters;
Li teaches responsive to obtaining the predictive data indicative of the predictive recipe parameters, optimizing the recipe based on the predictive recipe parameters; Li [0071] teaches that responsive to obtaining the predictive recipe parameters from the model, the process recipe is optimized by adjusting recipe input values based on the predictive recipe parameters until the predicted output matches the desired target performance data.
Li differs from the claimed invention in that it does not explicitly disclose receiving a recipe to produce the one or more substrates having the target performance data;
Schwarm teaches receiving a recipe to produce the one or more substrates having the target performance data; Schwarm [0029] teaches that the integrated controller generates and receives a process recipe to produce substrates having the target performance data.
The motivation to combine Ypma and Li at the effective filing date of the invention is the same as that set forth above in regards to claim 6.
The motivation to combine Ypma, Li, and Schwarm at the effective filing date of the invention is that all three references address model-based optimization of semiconductor process recipe parameters from experimental substrate data. Ypma and Schwarm together provide the adaptive Bayesian ML training loop and physical wafer execution, while Li provides the inverse prediction capability of deriving recipe input parameters directly from a desired target performance output. A person of ordinary skill in the art would incorporate Li's inverse prediction approach into the Ypma-Schwarm framework to enable direct recipe generation from target substrate performance specifications with reasonable expectation of success.
In regards to claim 20, (Ypma modified by Li) does not show wherein: the one or more sets of historical recipe parameters are associated with processes of a recipe used by the substrate processing equipment to produce the one or more substrates; and the one or more sets of additional recipe parameters are associated with updated processes of an updated recipe used by the substrate processing equipment to produce the one or more additional substrates.
Schwarm teaches the one or more sets of historical recipe parameters are associated with processes of a recipe used by the substrate processing equipment to produce the one or more substrates; Schwarm [0034] teaches that the factors in a designed experiment plan are recipe parameters of the semiconductor processing tool used to process substrates, establishing that recipe parameters are directly associated with the processes of a recipe used by the substrate processing equipment to produce substrates.
Schwarm teaches the one or more sets of additional recipe parameters are associated with updated processes of an updated recipe used by the substrate processing equipment to produce the one or more additional substrates; Schwarm [0029] teaches that the integrated controller automatically generates one or more updated process recipes comprising updated recipe parameters to be used by the substrate processing equipment to produce the one or more additional substrates.
The motivation to combine Ypma, Li, and Schwarm at the effective filing date of the invention is that all three references address model-based optimization of semiconductor process recipe parameters from experimental substrate data. Ypma and Schwarm together provide the adaptive Bayesian ML training loop and physical wafer execution, while Li provides the inverse prediction capability of deriving recipe input parameters directly from a desired target performance output. A person of ordinary skill in the art would incorporate Li's inverse prediction approach into the Ypma-Schwarm framework to enable direct recipe generation from target substrate performance specifications with reasonable expectation of success.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANWER AHMED ALAWDI whose telephone number is (703)756-1018. The examiner can normally be reached Monday - Friday 8:00 am - 5:30 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jack Chiang, can be reached on (571)272-7483. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ANWER AHMED ALAWDI/
Examiner, Art Unit 2851

/JACK CHIANG/
Supervisory Patent Examiner, Art Unit 2851