Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
1. Claims 1-20 are presented for examination.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
2. The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
The claims recite “analytical module” (Claims 9 and 11), “simulation module” (Claims 16 and 19), and “data collection module” (Claim 19).
A review of the specification shows that the following appears to be the corresponding structure described in the specification for the 35 U.S.C. 112(f) or 35 U.S.C. 112 (pre-AIA), sixth paragraph, limitation:
Specification paragraph [0041] of PG PUB states:
[0041] As used in this description, the terms “component,” “database,” “module,” “system,” and the like are intended to broadly capture a computer-related entity, either hardware, firmware, a combination of hardware and software, software, or software in execution. For example, a component, in some instances, is a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. An example of a database pertinent to the present disclosure includes but is not limited to a Relational Database System.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
3. Claims 1-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
As per Claims 1 and 19-20, they recite the limitation "generating, using the data set, a plurality of simulation curves of the hydrocarbon reservoir, wherein each parameter of the plurality of parameters has a range, and the range is adjustable, and
performing a simulation, based on the range of each said parameter of the plurality of parameters, of the hydrocarbon reservoir to generate a plurality of simulation curves;
downloading the plurality of simulation curves into a local server to prepare training data for training the neural network model;
…
defining each of the plurality of simulation curves as output features," and it is unclear what these limitations refer to. In particular, “a plurality of simulation curves” is generated in the earlier “generating” step, and “a simulation” is then performed to generate “a plurality of simulation curves”. It is unclear how these two recitations of “a plurality of simulation curves” differ from each other. It is also unclear which curves “the plurality of simulation curves” in the “downloading” step refers to: the “plurality of simulation curves” generated in the earlier “generating” step, or the “plurality of simulation curves” generated by performing the simulation? Further, it is unclear what “the plurality of simulation curves” in the “defining” step refers to, as the claims fail to define which “plurality of simulation curves” is downloaded.
Clarification is respectfully requested.
As per Claims 1 and 19-20, they recite the limitation "key factors" which is a relative term that renders the claims indefinite. The term “key” is not defined by the claims, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention.
As per claim 6, it recites the limitation “major side surface” which is a relative term that renders the claim indefinite. The term “major” is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention.
As per claim 10, the limitation “matching simulation production data and simulation pressure data” is vague and indefinite, since “matching” does not set forth a range of similarity or a measurement of what constitutes a match.
As per claim 13, the limitation “matching an outcome of the plurality of probability simulation curve…” is vague and indefinite, since “matching” does not set forth a range of similarity or a measurement of what constitutes a match.
As per claim 14, the limitation “achieving an optimal economic result” is vague and indefinite, since “optimal” does not set forth a range.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
4. Claims 1-2, 6-8, 15-17, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Garcia et al. (US 20130338985 A1) in view of Amr et al. (US 20190284910 A1).
As per Claims 1 and 19-20, Garcia et al. teaches a method/computer device of predicting an output of oil and gas production in a hydrocarbon reservoir using a neural network model (Abstract, Fig. 1-3), comprising:
(Claim 20) a non-transitory computer readable medium configured to store computer executable instructions; at least one processor, wherein in response to executing the computer executable instructions (Fig. 3), the processor is configured to:
(Examiner Note: As per Claims 19-20, Fig. 3 of Garcia et al. teaches a “data collection module”, a “simulation module”, and a “graphic user interface (GUI)”.)
receiving a data set comprising a plurality of parameters of the hydrocarbon reservoir at a wellsite, wherein the wellsite comprises a wellbore penetrating a subterranean formation to extract reserves from the hydrocarbon reservoir (Fig. 1 element 102 [0017] “The reservoir simulator model input data is represented by multiple cases wherein each case represents a value for all decision variables (e.g. well chokes, etc. that may be manipulated during the asset management decision support workflow) at every time step.”; Fig 2 element 202-203);
generating, using the data set, a plurality of simulation curves of the hydrocarbon reservoir, wherein each parameter of the plurality of parameters has a range, and the range is adjustable (Fig. 1 element 102, Fig. 2 element 209-210, [0024]-[0025] “reservoir simulator model input data is calculated using a well known stochastic simulation algorithm such as, for example, Monte Carlo, Orthogonal Array and Latin Hypercube, and the results from steps 206 and 208…constrain the reservoir simulator model input data by limiting the number of cases… The reservoir simulator model input data is represented by (n.times.m) values for (x) number of cases”: simulator input cases generated via stochastic methods), and
performing a simulation, based on the range of each said parameter of the plurality of parameters, of the hydrocarbon reservoir to generate a plurality of simulation curves (Fig. 1 element 104, [0024]-[0025], “proxy model data are calculated using the reservoir simulator model input data from step 102 and the reservoir simulator model. In other words, the reservoir simulator model algorithm processes only the constrained runs determined by step 210 in a batch mode to calculate the proxy model data. The proxy model data comprises data that may be used for training (training data), testing (testing data) and validation (validation data).”: proxy model data calculated via a simulator);
downloading the plurality of simulation curves into a local server to prepare training data for training the neural network model (Fig. 3 [0033]-[0038]);
… and
defining each of the plurality of simulation curves as output features of the neural network model (Fig. 1 element 112-> 116, [0029]-[0031] “the proxy model output data”, “asset performance is forecasted using the trained and validated proxy model”).
Garcia et al. fails to teach explicitly calculating a plurality of key factors from a neighboring well for at least two wellsites from the simulation;
combining the plurality of key factors with the plurality of parameters to define input features of the neural network model.
Amr et al. teaches calculating a plurality of key factors from a neighboring well for at least two wellsites from the simulation ([0023]-[0024], [0040], [0047]-[0050], [0060]-[0061] “neighboring wells of the NPL well under consideration… the set of the neighboring wells, a type curve (TC) well whose production is representative”, “well density 418 at the location may include information about the number of wells in an area associated with the Well-Considered”, “predictor parameters 302 may include… production of similarly situated neighboring wells 416, well density information 418, gas to oil ratio 419A, water to oil ratio 419B, nearest well distance information 420, nearest well angle information 422, etc.”, “the particular predictor parameter 302 used by the model generator 120 to train each of the qi model 702 and the di model 704,”);
combining the plurality of key factors with the plurality of parameters to define input features of the neural network model ([0023]-[0024], [0040], [0047]-[0048], [0058], [0060]-[0061], “predictor parameters 302 may include… production of similarly situated neighboring wells 416, well density information 418, gas to oil ratio 419A, water to oil ratio 419B, nearest well distance information 420, nearest well angle information 422, etc.”, “the particular predictor parameter 302 used by the model generator 120 to train each of the qi model 702 and the di model 704,”). In particular, Amr et al. teaches determining neighboring wells and aligning production histories/curves, including neighboring-well-derived factors such as well density (corresponding to the claimed “key factors from a neighboring well”), used to model production for multiple wells ([0023], [0040], [0050]), wherein neighbor well production and related parameters are used as predictor input for machine learning models ([0055], [0058]: corresponding to the claimed “combining the plurality of key factors with the plurality of parameters…”).
Garcia et al. and Amr et al. are analogous art because they are both related to a machine learning based reservoir/well simulation method.
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of the cited references. Thus, one of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to incorporate Amr et al. into Garcia et al.’s invention to provide an improved method that allows for more accurate predictions (Amr et al.: [0070]).
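(Examiner Note, illustrative only: the stochastic generation of simulator input cases over adjustable parameter ranges, as cited above from Garcia et al. (Monte Carlo, Orthogonal Array, Latin Hypercube), can be sketched as follows. The parameter names and range values below are hypothetical and are not drawn from the claims or the cited references.)

```python
import random

def latin_hypercube_cases(param_ranges, n_cases, seed=0):
    """Latin Hypercube sampling: divide each parameter's adjustable range
    into n_cases equal strata and draw one value per stratum, so the
    resulting simulator input cases cover every range evenly."""
    rng = random.Random(seed)
    columns = {}
    for name, (lo, hi) in param_ranges.items():
        width = (hi - lo) / n_cases
        strata = [lo + (i + rng.random()) * width for i in range(n_cases)]
        rng.shuffle(strata)  # decorrelate strata across parameters
        columns[name] = strata
    return [{name: columns[name][i] for name in param_ranges}
            for i in range(n_cases)]

# Hypothetical reservoir parameter ranges (illustrative values only)
ranges = {"porosity": (0.05, 0.30), "permeability_md": (0.1, 500.0)}
cases = latin_hypercube_cases(ranges, n_cases=10)
```

Each resulting case is one simulator input; running the simulator over all cases would yield the plurality of simulation curves.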
As per Claim 2, Garcia et al. teaches comprising tuning the neural network model using a set of hidden layers between the input features and the output features, wherein the tuning comprises a plurality of tunings ([0027]-[0028] “The neural network is a parallel mathematical structure composed of nodes, which calculate an individual result to be passed on to the other nodes for further processing.”, “back propagation or an optimization solver subroutine”).
As per Claim 6, Garcia et al. fails to teach explicitly further comprising processing each said tuning of the plurality of tunings and calculating an average error of the neural network model, and saving the average error as a result.
Amr et al. teaches further comprising processing each said tuning of the plurality of tunings and calculating an average error of the neural network model, and saving the average error as a result ([0058] “these machine learning algorithms were fine tuned to enhance accuracy. The term “accuracy” of the model 306, as used herein, connotes: [1—mean of absolute relative difference of the forecasted target variables”: a mean-based metric).
As per Claim 7, Garcia et al. fails to teach explicitly comprising searching for a plurality of hyperparameters by fitting the scaled data in each said tuning of the plurality of tunings, and comparing a result and selecting an optimal set of hyperparameters of the plurality of hyperparameters belonging to the neural network model having a lowest validation error.
Amr et al. teaches comprising searching for a plurality of hyperparameters by fitting the scaled data in each said tuning of the plurality of tunings, and comparing a result and selecting an optimal set of hyperparameters of the plurality of hyperparameters belonging to the neural network model having a lowest validation error ([0058]-[0061]: “fine tuned to enhance accuracy” and “with the… tuning parameters… models 702 and 704 may be trained and tested using cross validation techniques” provides a performance “score” “in terms of accuracy”).
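(Examiner Note, illustrative only: the claimed selection criterion, choosing the hyperparameter whose fitted model has the lowest validation error, can be sketched as follows. The toy model, data, and hyperparameter grid are hypothetical and are not the technique of Amr et al.)

```python
import random

def fit(lr, train, steps=200):
    """Stochastic gradient descent fit of y = w*x; the learning rate lr
    is the hyperparameter being searched."""
    w = 0.0
    for _ in range(steps):
        for x, y in train:
            w += lr * (y - w * x) * x
    return w

def mse(w, data):
    """Validation error of the fitted model on held-out data."""
    return sum((y - w * x) ** 2 for x, y in data) / len(data)

random.seed(0)
data = [(i / 10, 2.0 * (i / 10) + random.gauss(0, 0.05)) for i in range(1, 21)]
train, val = data[:14], data[14:]
grid = [0.0001, 0.01, 0.1]             # hypothetical hyperparameter grid
errors = {lr: mse(fit(lr, train), val) for lr in grid}
best_lr = min(errors, key=errors.get)  # lowest validation error wins
```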
As per Claim 8, Garcia et al. fails to teach explicitly further comprising: further training the neural network model having the lowest validation error with the optimal set of hyperparameters to obtain an optimized neural network model; and uploading the optimized neural network model to a virtual server or a virtual private cloud.
Amr et al. teaches further comprising: further training the neural network model having the lowest validation error with the optimal set of hyperparameters to obtain an optimized neural network model ([0058]-[0061]: “fine tuned to enhance accuracy” and “with the… tuning parameters… models 702 and 704 may be trained and tested using cross validation techniques” provides a performance “score” “in terms of accuracy”); and
uploading the optimized neural network model to a virtual server or a virtual private cloud ([0029], [0033] “the structure 102 may be a server (e.g., a web server) and a user may interact with the structure 102 via a computer 134 in communication therewith.”).
As per Claim 15, Garcia et al. teaches wherein the range has a low variable and a high variable ([0022], [0024] “triangular distribution…typically requires three values: a minimum value and a maximum value”, “stochastic simulation algorithm such as, for example, Monte Carlo, Orthogonal Array and Latin Hypercube”).
As per Claim 16, Garcia et al. teaches further comprising using a simulation module user interface (Fig. 3) to: adjust each said range of the plurality of parameters for the simulation to obtain an outcome of the base case simulation ([0031] “daily forecasting and daily optimization … The operation's team can then determine what needs to be done to modify the status quo using the proxy model in a trial and error manner. In the optimization workflow, the desired end state of the asset is set by a user and iteratively uses the proxy model to determine where to set the asset decision variables (e.g. chokes) to achieve a predefined objective”).
Garcia et al. fails to teach explicitly display a plurality of hydrocarbon producing wells and the reserve based on the adjusted range of the plurality of parameters and the outcome of the simulation; display an outcome of the simulation in the plurality of simulation curves on a display in the simulation module; and export and store the plurality of simulation curves into a database in a virtual server or a virtual private cloud.
Amr et al. teaches display a plurality of hydrocarbon producing wells and the reserve based on the adjusted range of the plurality of parameters and the outcome of the simulation (Fig. 2 & 9, [0063], [0069], “the production optimizer 122 may continually vary one or more predictor parameters”, “production forecasting”);
display an outcome of the simulation in the plurality of simulation curves on a display in the simulation module (Fig. 2 & 9, [0030], [0063], [0069], “the production optimizer 122 may continually vary one or more predictor parameters”, “production forecasting”); and
export and store the plurality of simulation curves into a database in a virtual server or a virtual private cloud ([0029], [0033] “the structure 102 may be a server (e.g., a web server) and a user may interact with the structure 102 via a computer 134 in communication therewith.”).
As per Claim 17, Garcia et al. fails to teach explicitly wherein the plurality of key factors comprises: neighboring well quantities and influence, spacing differences, timing differences, or FDI factors.
Amr et al. teaches wherein the plurality of key factors comprises: neighboring well quantities and influence, spacing differences, timing differences, or FDI factors ([0023]-[0024], [0040], [0047]-[0050], [0060]-[0061] “neighboring wells of the NPL well under consideration… the set of the neighboring wells, a type curve (TC) well whose production is representative”, “well density 418 at the location may include information about the number of wells in an area associated with the Well-Considered”, “predictor parameters 302 may include… production of similarly situated neighboring wells 416, well density information 418, gas to oil ratio 419A, water to oil ratio 419B, nearest well distance information 420, nearest well angle information 422, etc.”, “the particular predictor parameter 302 used by the model generator 120 to train each of the qi model 702 and the di model 704,”).
5. Claims 3-5 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Garcia et al. (US 20130338985 A1), in view of Amr et al. (US 20190284910 A1), and further in view of Paolo Dell’Aversana (“Cross-disciplinary Machine Learning”).
Garcia et al. as modified by Amr et al. teaches most of the limitations of the instant invention as applied to claims 1-2, 6-8, 15-17, and 19-20 above.
As per Claim 3, Garcia et al. as modified by Amr et al. teaches further comprising retrieving, for each said tuning of the plurality of tunings, …with … cross validation, and scaling the selected data using a scaler to obtain scaled data (Amr et al.: [0056]-[0057], [0061] “trained using the xgbTree algorithm with the following tuning parameters: nrounds=100, max_depth=10, eta=0.03, gamma=0.1, colsample_bytree=0.4, min_child_weight=1 and subsample=1… trained and tested using cross validation techniques”).
Garcia et al. as modified by Amr et al. fails to teach explicitly selected data shuffling and splitting the selected data with K-fold cross validation.
Paolo Dell’Aversana teaches selected data shuffling and splitting the selected data with K-fold cross validation (Figure 1.1, Pg 18 & 20).
Garcia et al., Amr et al., and Paolo Dell’Aversana are analogous art because they are all related to a machine learning based reservoir/well simulation method.
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of the cited references. Thus, one of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to incorporate Paolo Dell’Aversana into Garcia et al. as modified by Amr et al.’s invention to provide an improved method that allows for more accurate predictions (Amr et al.: [0070]) and to provide efficient training and robust classification of multiple well data for well logs analysis (Paolo Dell’Aversana: Abstract on Pg 42).
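(Examiner Note, illustrative only: the claimed data shuffling and splitting with K-fold cross validation can be sketched as follows; the fold count and sample count are hypothetical.)

```python
import random

def k_fold_splits(n_samples, k=5, seed=0):
    """Shuffle the selected data indices, then split them into k disjoint
    folds; each fold serves once as the validation set while the
    remaining folds form the training set (K-fold cross validation)."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)       # data shuffling
    folds = [idx[i::k] for i in range(k)]  # k disjoint folds
    splits = []
    for i in range(k):
        val = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        splits.append((train, val))
    return splits

# Hypothetical example: 20 samples, 5 folds
splits = k_fold_splits(20, k=5)
```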
As per Claim 4, Garcia et al. as modified by Amr et al. teaches further comprising searching for a plurality of hyperparameters by fitting the scaled data in each said tuning of the plurality of tunings (Amr et al.: [0061] “trained using the xgbTree algorithm with the following tuning parameters”).
As per Claim 5, Garcia et al. as modified by Amr et al. teaches further comprising applying an early stop function to prevent the training from overfitting the selected data to the neural network model (Amr et al.: [0058]- [0059] “overfitting was avoided or at least minimized….decreasing the number of trees in a random forest algorithm may be employed to minimize overfitting.”).
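(Examiner Note, illustrative only: an early stop function of the kind claimed, halting training when validation error stops improving so as to prevent overfitting, can be sketched as follows; the patience value and error sequence are hypothetical.)

```python
def train_with_early_stop(val_errors, patience=2):
    """Monitor validation error each epoch and stop once it fails to
    improve for `patience` consecutive epochs, keeping the best epoch
    seen so far (a common guard against overfitting)."""
    best_err, best_epoch, waited = float("inf"), -1, 0
    for epoch, err in enumerate(val_errors):
        if err < best_err:
            best_err, best_epoch, waited = err, epoch, 0
        else:
            waited += 1
            if waited >= patience:
                break  # early stop: validation error no longer improving
    return best_epoch, best_err

# Hypothetical errors: they fall, then rise as the model begins to overfit
stop = train_with_early_stop([0.9, 0.5, 0.3, 0.35, 0.4, 0.45])
```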
As per Claim 18, Garcia et al. as modified by Amr et al. teaches wherein the tuning further comprises using: a number of nodes, activation functions, optimizer functions, … (Garcia et al.: [0027]-[0028] “The neural network is a parallel mathematical structure composed of nodes, which calculate an individual result to be passed on to the other nodes for further processing. The proxy model architecture, therefore, represents the total number of required nodes, … Each node requires a value as a coefficient, which is determined by a training activity.”, “the proxy model architecture coefficient values are calculated for each node in the proxy model architecture using techniques well known in the art such as, for example, back propagation or an optimization solver subroutine”).
Garcia et al. as modified by Amr et al. fails to teach explicitly the tuning using learning rates, dropout rates, and regularization.
Paolo Dell’Aversana teaches wherein the tuning further comprises using: a number of nodes, activation functions, optimizer functions, learning rates, dropout rates, and regularization (Figure 1.1, Pg 17-18 & 32-33).
6. Claims 9-14 are rejected under 35 U.S.C. 103 as being unpatentable over Garcia et al. (US 20130338985 A1), in view of Amr et al. (US 20190284910 A1), and further in view of Mohaghegh (US 20160042272 A1).
Garcia et al. as modified by Amr et al. teaches most of the limitations of the instant invention as applied to claims 1-2, 6-8, 15-17, and 19-20 above.
As per Claim 9, Garcia et al. as modified by Amr et al. teaches further comprising: uploading from a client firewall, by an analytical module, (Amr et al.: [0029], [0033] “the structure 102 may be a server (e.g., a web server) and a user may interact with the structure 102 via a computer 134 in communication therewith.”) actual wellsite data comprising actual wellsite production data, …, and actual wellsite parameter data (Amr et al.: Fig. 4A, [0053], [0056], [0063], [0069] “the data in the database 150 regarding known wells”, “actual production information… in the oil and gas database”); and
inputting the wellsite parameter data and selecting the range of each said parameter of the plurality of parameters to display the plurality of simulation curves generated from the neural network model in a virtual server or a virtual private cloud (Garcia et al.: [0031] “to forecast asset performance… daily forecasting and daily optimization… iteratively uses the proxy model ”; Amr et al.: Fig. 9, [0053], [0063] “the predictor parameters 302 for a statistically significant number of horizontal wells may be stored in the oil and gas well database. … the production optimizer 122 may determine physical parameter changes 504 for optimizing well production of the Well-Considered… automatically (or with user intervention) bring about the physical parameter changes 504 in the Well-Considered 502 to optimize production.”, “the production optimizer 122 may continually vary one or more predictor parameters, such as the proppant amount per lateral segment, the amount of water per lateral segment, the lateral length of the Well-Considered, the well density, etc., to evaluate whether such a physical predictor parameter change 504 boosts EUR.”).
Garcia et al. as modified by Amr et al. fails to teach explicitly actual wellsite pressure data.
Mohaghegh teaches actual wellsite pressure data ([0009] “pressure… Hard Data” are measured and recorded, [0028] “producing a model comprising an artificial neural network (ANN) trained with the first part of the set of measurement or objective data parameters …, wherein the model is useful for producing a group of optimized settings for one or more of the parameters”).
Garcia et al., Amr et al., and Mohaghegh are analogous art because they are all related to a machine learning based reservoir/well simulation method.
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of the cited references. Thus, one of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to incorporate Mohaghegh into Garcia et al. as modified by Amr et al.’s invention to provide an improved method that allows for more accurate predictions (Amr et al.: [0070]) and to provide an accurate predictive model (Mohaghegh: [0097]).
As per Claim 10, Garcia et al. as modified by Amr et al. teaches further comprising:
matching simulation production data and simulation pressure data from the plurality of simulation curves generated from the neural network model in the virtual server or the virtual private cloud with the actual wellsite production data and the actual wellsite pressure data to obtain a plurality of matching simulation curves (Amr et al.: [0028]-[0029], [0033] “the structure 102 may be a server (e.g., a web server) and a user may interact with the structure 102 via a computer 134 in communication therewith.”, [0058] “these machine learning algorithms were fine tuned to enhance accuracy. The term “accuracy” of the model 306, as used herein, connotes: [1—mean of absolute relative difference of the forecasted target variables 304 with respect to the actual target variables 304].”);
displaying an outcome of the plurality of matching simulation curves on a display (Amr et al.: Fig. 2, [0030] “The output device 112 may include one or more visual indicators (e.g., a display, touch screen),”); and
storing the plurality of matching simulation curves and the plurality of parameters (Amr et al.: Fig. 3 & 9, [0028]-[0029], [0069]).
As per Claim 11, Garcia et al. as modified by Amr et al. teaches further comprising:
creating a plurality of hydrocarbon development scenarios in the analytical module user interface for drilling operation in an area of interest (Garcia et al.: [0031] “In the optimization workflow, the desired end state of the asset is set by a user and iteratively uses the proxy model to determine where to set the asset decision variables (e.g. chokes) to achieve a predefined objective.”; Amr et al. [0063] “the production optimizer 122 may continually vary one or more predictor parameters”);
assigning the stored plurality of parameters to the wellsite in a hydrocarbon development scenarios of the plurality of hydrocarbon development scenarios (Garcia et al.:[0026] “proxy model input variables are selected”; Amr et al.: [0063] “the production optimizer 122 may continually vary one or more predictor parameters”); and
displaying the plurality of simulation curves generated from the neural network model in the virtual server or the virtual private cloud using the plurality of hydrocarbon development scenarios (Amr et al.: Fig. 2 & 9, [0063], [0069], “the production optimizer 122 may continually vary one or more predictor parameters”, “production forecasting”).
As per Claim 12, Garcia et al. as modified by Amr et al. teaches further comprising calculating a probability distribution for an outcome of the plurality of simulation curves (Garcia et al.: [0022], [0024] “triangular distribution”, “stochastic simulation algorithm such as, for example, Monte Carlo, Orthogonal Array and Latin Hypercube”).
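As an illustrative sketch only (not drawn from the record), the stochastic simulation described in Garcia et al. [0024] — sampling from a triangular distribution via a Monte Carlo algorithm to build a probability distribution of outcomes — might be outlined as follows; the parameter values and the placeholder outcome model are assumptions:

```python
import random
import statistics

def monte_carlo_outcomes(n_trials=10_000, low=50.0, mode=100.0, high=200.0, seed=42):
    """Sample an input parameter from a triangular distribution and
    propagate it through a placeholder outcome model, yielding a
    probability distribution of simulated outcomes."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_trials):
        rate = rng.triangular(low, high, mode)  # note: mode is the third argument
        outcomes.append(rate * 0.9)  # placeholder "simulation" step (assumed)
    return outcomes

outcomes = monte_carlo_outcomes()
print(statistics.mean(outcomes))  # mean of the simulated outcome distribution
```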
As per Claim 13, Garcia et al. as modified by Amr et al. teaches further comprising:
creating a plurality of decline curve models with an outcome of calculated probability distribution (Amr et al.: [0060]-[0062] “well production curve may be parameterized by qi, di, and b”, “once the machine learning model 306 (i.e., the qi model 702 and the di model 704) is generated by the model generator 120, it may be used to forecast one or more target variables 304 (e.g., the EUR 438) of the Well-Considered 502.”);
matching an outcome of the plurality of probability simulation curves to the plurality of decline curve models by adjusting a plurality of decline curve parameters (Amr et al.: [0060]-[0062] “well production curve may be parameterized by qi, di, and b”, “models… may be trained tuning parameters” to fit production data); and
exporting the adjusted plurality of decline curve models for the current and the future producing wells into a user format for economic analysis (Garcia et al.: [0031] “In the optimization workflow, the desired end state of the asset is set by a user and iteratively uses the proxy model to determine where to set the asset decision variables (e.g. chokes) to achieve a predefined objective.”; Amr et al.: [0060]-[0062], [0070] “EUR”, “economic forecasts of the well value over its lifetime”).
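The decline-curve parameterization quoted from Amr et al. [0060]-[0062] (qi, di, and b) corresponds to the well-known Arps hyperbolic decline model. A minimal illustrative sketch of evaluating such a curve follows; the parameter values are assumed for the example and are not taken from the references:

```python
def arps_hyperbolic(t, qi, di, b):
    """Arps hyperbolic decline: q(t) = qi / (1 + b*di*t)^(1/b),
    where qi is the initial rate, di the initial decline rate,
    and b the hyperbolic exponent."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

# Illustrative (assumed) parameters: initial rate, initial decline, b-factor
qi, di, b = 1000.0, 0.1, 0.5

# Production rate at a few time steps (months); rates decline monotonically
rates = [arps_hyperbolic(t, qi, di, b) for t in range(0, 25, 6)]
```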
As per Claim 14, Garcia et al. as modified by Amr et al. teaches further comprising:
re-selecting the range of each said parameter of the plurality of parameters and re-adjusting the hydrocarbon development scenarios for adjusting the probability distribution for the current and the future producing wells until achieving an optimal economic result (Garcia et al.: [0031] “daily forecasting and daily optimization … The operation's team can then determine what needs to be done to modify the status quo using the proxy model in a trial and error manner. In the optimization workflow, the desired end state of the asset is set by a user and iteratively uses the proxy model to determine where to set the asset decision variables (e.g. chokes) to achieve a predefined objective”; Amr et al.: [0060]-[0062], [0070] “EUR”, “economic forecasts of the well value over its lifetime”).
Garcia et al. as modified by Amr et al. fails to teach explicitly using the adjusted probability distribution to select a location to perform a drilling operation to drill another wellbore at the hydrocarbon reservoir.
Mohaghegh teaches using the adjusted probability distribution to select a location to perform a drilling operation to drill another wellbore at the hydrocarbon reservoir (Fig. 17, [0022], [0106], [0113]-[0114] “the probability distribution function of output for each of the one or more locations in the well field to select the drill site”).
Conclusion
7. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Yeten et al. (US 20100161300 A1)
Usadi et al. (US 20130118736 A1)
Wilkinson et al. (US 20080082469 A1)
8. Any inquiry concerning this communication or earlier communications from the examiner should be directed to EUNHEE KIM whose telephone number is (571)272-2164. The examiner can normally be reached Monday-Friday 9am-5pm ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ryan Pitaro can be reached at (571)272-4071. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/EUNHEE KIM/Primary Examiner, Art Unit 2188