DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114.
Applicant's submission filed on 1/28/2026 has been entered and considered. Rejections and/or objections not reiterated from the previous Office action mailed 10/23/2025 are hereby withdrawn. The following rejections and/or objections are either newly applied or are reiterated and are the only rejections and/or objections presently applied to the instant application.
The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.
Status of Claims
Claims 1-8, 21-29, and 33-35 are pending.
Claims 9-20 and 30-32 are cancelled.
Claims 33-35 are newly added.
Claims 1-8, 21-29, and 33-35 are examined on the merits.
Priority
The instant application claims no benefit of priority. Thus, the effective filing date of the claims is 8/10/2021.
The applicant is reminded that amendments to the claims and specification must comply with 35 U.S.C. § 120 and 37 C.F.R. § 1.121 to maintain priority to an earlier-filed application. Claim amendments may impact the effective filing date if new subject matter is introduced that lacks support in the originally filed disclosure. If an amendment adds limitations that were not adequately described in the parent application, the claim may no longer be entitled to the priority date of the earlier filing.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 35 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 35 recites "an edge of the predicted probability distribution is two times a largest observation of the 1-10 phenotypic output observations" on lines 1-2. It is not clear what an "edge" of a predicted probability distribution is intended to mean, or what should happen when an observation falls outside of it, as the claim recites no active, positive steps for this method limitation. Is the edge a boundary or a threshold that defines noise or outliers? Is it only a maximum value? What is the procedure for handling negative values of an observation? While para. 0036 of the instant specification mentions a distribution edge, it does not resolve these ambiguities. To further prosecution, claim 35 is interpreted as not further limiting claim 2.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-8, 21-29, and 33-35 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea of a mental process, a mathematical concept, organizing human activity, or a law of nature or natural phenomenon without significantly more. In accordance with MPEP § 2106, claims found to recite statutory subject matter (a process, Step 1: YES) are then analyzed to determine if the claims recite any concepts that equate to an abstract idea, law of nature or natural phenomenon (Step 2A, Prong 1). In the instant application, the claims recite the following limitations that equate to an abstract idea:
Claims 1, 21, and 29: “training, by using the set of training data, the recurrent neural network by minimizing the loss function with respect to the limited data set of the output observations and with the input parameter data set in the supervised training to form a trained network” provides a mathematical calculation (minimizing a loss function involves selecting and using a mathematical expression, the negative log-likelihood expression in this case) that is considered a mathematical concept, which is an abstract idea.
Claims 5 and 25: “the biological system is intrinsically noisy and the trained loss function calculates whether readings outside of a mean range of the input parameter data set are inherent to the biological system or outlier readings” provides mathematical calculations (calculating whether readings are inside or outside a range) that are considered a mathematical concept, which is an abstract idea.
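The mathematical character of the loss-minimization recitation can be illustrated with a generic sketch; the observation values and the Gaussian form of the likelihood below are assumptions for illustration only, not features of the claims or of the record:

```python
import math

# Toy phenotypic observations (assumed values, for illustration only).
obs = [1.0, 1.2, 0.9]

def nll(mu, var):
    """Negative log-likelihood of the observations under a Gaussian N(mu, var)."""
    return sum(0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)
               for x in obs)

# Minimizing the loss is a purely numerical exercise: a brute-force search
# over candidate means shows the minimizer is simply the sample mean.
candidates = [i / 1000 for i in range(500, 1500)]
best_mu = min(candidates, key=lambda m: nll(m, 0.02))
print(best_mu)  # ≈ 1.033, the sample mean of the three observations
```

As the sketch shows, selecting parameters that minimize a negative log-likelihood reduces to evaluating and comparing values of a mathematical expression.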
These recitations are similar to the concepts of collecting information, analyzing it, and displaying certain results of the collection and analysis in Electric Power Group, LLC, v. Alstom (830 F.3d 1350, 119 USPQ2d 1739 (Fed. Cir. 2016)), organizing and manipulating information through mathematical correlations in Digitech Image Techs., LLC v Electronics for Imaging, Inc. (758 F.3d 1344, 111 U.S.P.Q.2d 1717 (Fed. Cir. 2014)) and comparing information regarding a sample or test to a control or target data in Univ. of Utah Research Found. v. Ambry Genetics Corp. (774 F.3d 755, 113 U.S.P.Q.2d 1241 (Fed. Cir. 2014)) and Association for Molecular Pathology v. USPTO (689 F.3d 1303, 103 U.S.P.Q.2d 1681 (Fed. Cir. 2012)) that the courts have identified as concepts that can be practically performed in the human mind or are mathematical relationships. Therefore, these limitations fall under the “Mental process” and “Mathematical concepts” groupings of abstract ideas. Additionally, while claims 21-29 recite performing some aspects of the analysis on “A computer program product comprising: one or more computer-readable storage media; and program instructions stored on the one or more computer-readable storage media to perform operations” (claim 21) and “A computer system comprising: a processor set; one or more computer-readable storage media; and program instructions stored on the one or more computer-readable storage media to cause the processor set to perform operations” (claim 29), there are no additional limitations that indicate that this requires anything other than carrying out the recited mental processes or mathematical concepts in a generic computer environment. Merely reciting that a mental process is being performed in a generic computer environment does not preclude the steps from being performed practically in the human mind or with pen and paper as claimed. 
If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components, then it falls within the “Mental processes” grouping of abstract ideas. As such, claims 1-8, 21-29, and 33-35 recite an abstract idea (Step 2A, Prong 1: YES).
Claims found to recite a judicial exception under Step 2A, Prong 1 are then further analyzed to determine if the claims as a whole integrate the recited judicial exception into a practical application or not (Step 2A, Prong 2). The judicial exceptions listed above are not integrated into a practical application because the claims do not recite an additional element or elements that reflect an improvement to technology. Specifically, the claims recite the following additional elements:
Claims 1, 21, and 29: “gathering a data set comprising at least 3000 input parameters for a biological system; generating a limited data set consisting of 1-10 phenotypic output observations through experimentation and/or simulation of the input parameter data set” provides insignificant extra-solution activities (gathering and generating data through experimentation or simulation are pre-solution activities involving data gathering steps) that do not serve to integrate the judicial exceptions into a practical application.
“building a deep learning neural network comprising a recurrent neural network and a loss function that is a negative log-likelihood function” provides insignificant extra-solution activities (building a NN model is a pre-solution activity involving data gathering and manipulation steps) that do not serve to integrate the judicial exceptions into a practical application.
“inputting a condition about the biological system into the trained network such that, in response, the trained network produces, as output, a new predicted probability distribution of a biological phenotype associated with the biological system” provides insignificant extra-solution activities (inputting and outputting data to and from a model are pre- and post-solution activities involving data manipulation and gathering steps) that do not serve to integrate the judicial exceptions into a practical application.
Claim 21: “A computer program product comprising: one or more computer-readable storage media; and program instructions stored on the one or more computer-readable storage media to perform operations” provides insignificant extra-solution activities (running instructions on generic computer components) that do not serve to integrate the judicial exceptions into a practical application.
Claim 29: “A computer system comprising: a processor set; one or more computer-readable storage media; and program instructions stored on the one or more computer-readable storage media to cause the processor set to perform operations” provides insignificant extra-solution activities (running instructions on generic computer components) that do not serve to integrate the judicial exceptions into a practical application.
The steps for gathering, generating, inputting, and outputting data, and building a model are insignificant extra-solution activities that do not serve to integrate the recited judicial exceptions into a practical application because they are pre- and post-solution activities involving data gathering, data manipulation, and sample manipulation steps (see MPEP 2106.04(d)(2)). Furthermore, the limitations regarding implementing program instructions do not indicate that they require anything other than mere instructions to implement the abstract idea in a generic way or in a generic computing environment. As such, this limitation equates to mere instructions to implement the abstract idea on a generic computer that the courts have stated does not render an abstract idea eligible in Alice Corp., 573 U.S. at 223, 110 USPQ2d at 1983. See also 573 U.S. at 224, 110 USPQ2d at 1984. Therefore, claims 1-8, 21-29, and 33-35 are directed to an abstract idea (Step 2A, Prong 2: NO).
Claims found to be directed to a judicial exception are then further evaluated to determine if the claims recite an inventive concept that provides significantly more than the judicial exception itself (Step 2B). The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the claims recite additional elements that are insignificant extra-solution activities that do not serve to integrate the recited judicial exceptions into a practical application, or equate to mere instructions to apply the recited exception in a generic way or in a generic computing environment.
As discussed above, there are no additional elements to indicate that the claimed “A computer program product comprising: one or more computer-readable storage media; and program instructions stored on the one or more computer-readable storage media to perform operations” (claim 21) and “A computer system comprising: a processor set; one or more computer-readable storage media; and program instructions stored on the one or more computer-readable storage media to cause the processor set to perform operations” (claim 29), requires anything other than generic computer components in order to carry out the recited abstract idea in the claims. Claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible. MPEP 2106.05(f) discloses that mere instructions to apply the judicial exception cannot provide an inventive concept to the claims. Additionally, the limitations for gathering, generating, inputting, and outputting data, and building a model are insignificant extra-solution activities that do not serve to integrate the recited judicial exceptions into a practical application. Furthermore, no inventive concept is claimed by these limitations as they are well-understood, routine, and conventional.
The additional elements do not comprise an inventive concept when considered individually or as an ordered combination that transforms the claimed judicial exception into a patent-eligible application of the judicial exception. Therefore, the claims do not amount to significantly more than the judicial exception itself (Step 2B: NO). As such, claims 1-8, 21-29, and 33-35 are not patent eligible.
Response to Arguments under 35 USC 101
Applicant’s arguments filed 1/28/2026 are fully considered but they are not persuasive.
Applicant asserts that independent claim 1 is now integrated into a practical application in view of the amendments to the claims because the recited method achieves improvements in a technical field and is similar to the limitations of Patent Eligibility Example 39 (Remarks 1/28/2026 pages 3-4). Applicant also asserts that independent claims 21 and 29 are "directed to a manufacture which is patent-eligible category of subject matter" (Remarks 1/28/2026 pages 4-5). Examiner notes, as indicated above in section "Claim Rejections - 35 USC 101", that independent claims 1, 21, and 29 still recite a judicial exception in the form of a mathematical calculation because the claimed training explicitly recites minimizing a loss function, which is using a mathematical function. Furthermore, this minimization step may also be considered an evaluation: to minimize a function one necessarily has to compare values, which may be performed in the human mind and is therefore considered a mental process, which is an abstract idea. The above section also addresses Applicant's assertion regarding claims 21 and 29 as being "a manufacture". It is noted that the generic computer components merely provide insignificant extra-solution activities (running instructions on generic computer components) that do not serve to integrate the judicial exceptions into a practical application.
Applicant further asserts that the rejection of claims 6-8 and 26-28 erroneously relied on Markush group language as reciting a mental process (Remarks 1/28/2026 page 5). Examiner notes this for clarity of the record and finds Applicant's argument persuasive; however, because claims 6-8 and 26-28 depend from claims 1, 21, or 29, they remain rejected under 35 USC 101 as being directed to abstract ideas of mathematical concepts and mental processes without being integrated into a practical application and without reciting significantly more than the judicial exceptions.
Therefore, the rejection of claims 1, 21, and 29 is maintained. All other claims depend from these independent claims and are therefore likewise rejected.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3-5, 21, 23-25, 29, and 33-34 are rejected under 35 U.S.C. 103 as being unpatentable over Rolfe et al. (US-20190244680) in view of Cope et al. (US-20140214391).
Regarding claims 1, 21, and 29, Rolfe teaches gathering a data set comprising at least 3000 input parameters for a biological system (Para.0068 "The transcriptome can be used to analyze genes that are differentially expressed based upon the phenotype of the organism", as there are thousands of genes measured in a transcriptome, each representing an input parameter).
Rolfe also teaches building a deep learning neural network comprising a recurrent neural network and a loss function that is a negative log-likelihood function (Para.0034 "A conditional likelihood p(x|z) can be selected to be a product of independent Gaussian distributions with means and variances defined by a neural network applied to a latent representation z" and para.0097 "Existing approaches such as those described above can use a negative binomial distribution instead of a Poisson distribution").
Rolfe also teaches training, by using the set of training data, the recurrent neural network by minimizing the loss function with respect to the limited data set of the output observations and with the input parameter data set in the supervised training to form a trained network, wherein the minimizing of the loss function maximizes a probability that the 1-10 phenotypic output observations are within a predicted probability distribution of the recurrent neural network (Para.0032 "Denoting the observed data by x and the latent variables by z, variational autoencoders are generative models p(x, z) paired with variational approximations to the posterior probability distribution of the latent variables q(z|x), trained by maximizing the evidence lower bound (ELBO) on the log-likelihood").
Rolfe also teaches inputting a condition about the biological system into the trained network such that, in response, the trained network produces, as output, a new predicted probability distribution of a biological phenotype associated with the biological system (Abstract "ML may realize an efficient estimation of multi-disease genetic and environmental correlation matrices", para.0179 "the present disclosure may be applied to plants to facilitate plant breeding by isolating genetic causes of particular phenotypes or to predict which offspring (from the space of all possible offspring produced by pairs of plants in the dataset) would produce desired phenotypes across environments and/or in a given environment" implies inputting a desired phenotype to produce gene expression distributions for that input phenotype, and para.0107 "One use of the technology described in the present application is to test whether expression of a single gene can behave differently depending on disease condition").
Rolfe does not explicitly teach generating a limited data set consisting of 1-10 phenotypic output observations through experimentation and/or simulation of the input parameter data set, wherein the data set and the limited data set together form a set of training data for supervised training for a model to train the model to predict a new probability distribution of output observations in response to receiving a new set of input parameters.
However, Cope teaches using a set of observations to which a model is fitted and built (Para.0066 ""Training set" refers to a set of sequence-activity data or observations that one or more models are fitted to and built upon. For instance, for a protein sequence-activity model, a training set comprises residue sequences for an initial or improved protein variant library. Typically, these data include complete or partial residue sequence information, together with an activity value for each protein in the library. In some cases, multiple types of activities (e.g., rate constant data and thermal stability data) are provided together in the training set. The activity is sometimes a beneficial property"). Coupled with the k-fold cross-validation of Rolfe, this teaches the full limitation (Para.0044 "The quality of a model relative to conventional approaches can be assessed using an k-fold cross-validation of the log-likelihood. In k-fold cross-validation, a sample is randomly partitioned into k equally sized subsamples. Of the k subsamples, one subsample is retained as validation data for testing the model. The remaining (k−1) subsamples can be used as training data. The cross-validation process can be repeated k times, with each of the k subsamples used once as validation data. The k results can then be averaged to provide an estimation of the log-likelihood").
Therefore, it would have been obvious to one of ordinary skill in the art as of the effective filing date of the claimed invention to modify the methods of Rolfe as taught by Cope in order to generate a training set for generating a model (para.0067 "The term "observation" is information about protein or other biological entity that may be used in a training set for generating a model such as a sequence activity model. The term "observation" may refer to any sequenced and assayed biological molecules, including protein variants. In certain embodiments, each observation is an activity value and an associated sequence for a variant in a library"). One skilled in the art would have a reasonable expectation of success because both methods are concerned with building neural network models for predicting a phenotype.
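The k-fold cross-validation procedure quoted from Rolfe (para. 0044) can be sketched generically; the sample values and fold count below are placeholders for illustration, not values taken from either reference:

```python
import random

def k_fold_splits(samples, k):
    """Partition samples into k roughly equal subsamples; each subsample serves
    once as validation data while the remaining k-1 folds form the training data."""
    shuffled = samples[:]
    random.Random(0).shuffle(shuffled)          # random partition, fixed seed
    folds = [shuffled[i::k] for i in range(k)]
    for i in range(k):
        validation = folds[i]
        training = [s for j, fold in enumerate(folds) if j != i for s in fold]
        yield training, validation

data = list(range(10))
for training, validation in k_fold_splits(data, 5):
    assert len(validation) == 2 and len(training) == 8
    assert sorted(training + validation) == data  # every sample used exactly once
print("ok")
```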
Regarding claims 3 and 23, Rolfe in view of Cope teaches the methods of claims 1 and 21 on which these claims respectively depend. Rolfe also teaches that the limited data set consists of 1 phenotypic output observation (Para.0044 "The quality of a model relative to conventional approaches can be assessed using an k-fold cross-validation of the log-likelihood. In k-fold cross-validation, a sample is randomly partitioned into k equally sized subsamples. Of the k subsamples, one subsample is retained as validation data for testing the model. The remaining (k−1) subsamples can be used as training data. The cross-validation process can be repeated k times, with each of the k subsamples used once as validation data. The k results can then be averaged to provide an estimation of the log-likelihood" also encompasses a situation where k=2 and the sample contains two observations, leaving a single observation as training data. Furthermore, while this approach has been tried by many others, it generally causes overfitting of the resulting model (i.e. having too many parameters [genes] relative to the number of observations) as evidenced by Cope et al. in para.0089 ""Overfitting" refers to a condition that occurs when a statistical model describes random error or noise instead of the underlying relationship. Overfitting generally occurs when a model is excessively complex, such as having too many parameters relative to the number of observations. A model which has been overfit will generally have poor predictive performance, as it can exaggerate minor fluctuations in the data" (US-20140214391)).
Regarding claims 4 and 24, Rolfe in view of Cope teaches the methods of claims 1 and 21 on which these claims respectively depend. Rolfe also teaches the predicted probability distribution is a continuous probability distribution of each input parameter for the biological system (Para.0082 "The surrogate variables can be continuous and adjustable, in contrast to the fixed, discrete observed covariates, and so can more effectively represent the observed gene expression data").
Regarding claims 5 and 25, Rolfe in view of Cope teaches the methods of claims 1 and 21 on which these claims respectively depend. Rolfe also teaches the biological system is intrinsically noisy and the trained loss function calculates whether readings outside of the mean range of the input parameter data set are inherent to the biological system or outlier readings (Para.0190 "Unsupervised learning of probabilistic models can facilitate tasks such as denoising to extract a signal from a mixture of signal and noise, and inpainting to reconstruct lost or corrupted parts of an image").
Regarding claim 33, Rolfe in view of Cope teaches the method of claim 1 on which this claim depends. Rolfe also teaches the 1-10 phenotypic output observations are discrete variables, the trained network predicts respective probability distribution values for all possible discrete numbers, and the trained network comprises one or more additional neural network layers to ensure that a sum of the predicted probability distribution values for an input condition equals one (Para.0201 "This approach is possible whenever the approximating posteriors for each hidden variable, q.sub.i(z.sub.i|x, ϕ), are independent given x and ϕ; the cumulative distribution function (CDF) of each q.sub.i is invertible; and the inverse CDF each q.sub.i, is differentiable. Specifically, choose [D] to be the uniform distribution between 0 and 1, and ƒ.sub.i to be the inverse CDF of q.sub.i").
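A normalization layer ensuring that predicted probability values over all discrete outcomes sum to one is conventionally implemented as a softmax; the sketch below illustrates that conventional normalization only (the raw score values are assumed, and this is not a characterization of Rolfe's inverse-CDF approach):

```python
import math

def softmax(logits):
    """Final normalization layer: exponentiate and divide by the total so the
    predicted probabilities over all discrete outcomes sum to one."""
    m = max(logits)                      # subtract max for numerical stability
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Raw network scores for possible discrete outcomes (assumed values).
probs = softmax([2.0, -1.0, 0.5, 0.1])
print(round(sum(probs), 10))  # 1.0
```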
Regarding claim 34, Rolfe in view of Cope teaches the method of claim 1 on which this claim depends. Rolfe also teaches the 1-10 phenotypic output observations are continuous variables, the trained network provides the predicted probability distribution by interpolating discrete observations where a last layer of the trained network is normalized with a normalization factor to ensure that a cumulative trapezoidal numerical integration of the predicted probability distribution equals one (Para.0206 "The approach can run into challenges with discrete distributions, such as, for example, Restricted Boltzmann Machines (RBMs). An approximating posterior that only assigns non-zero probability to a discrete domain corresponds to a CDF that is piecewise-constant. That is, the range of the CDF is a proper subset of the interval [0, 1]. The domain of the inverse CDF is thus also a proper subset of the interval [0, 1] and its derivative is generally not defined").
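A normalization factor that makes a cumulative trapezoidal numerical integration of a sampled density equal one can be sketched as follows; the grid and the triangular density shape are assumptions for illustration only:

```python
# Grid over [0, 10] and an unnormalized triangular density (assumed shapes).
xs = [i * 0.1 for i in range(101)]
raw = [max(0.0, 5.0 - abs(x - 5.0)) for x in xs]

def trapz(ys, xs):
    """Cumulative trapezoidal numerical integration of ys sampled at xs."""
    return sum((ys[i] + ys[i + 1]) * (xs[i + 1] - xs[i]) / 2
               for i in range(len(xs) - 1))

z = trapz(raw, xs)                 # normalization factor from the raw outputs
density = [y / z for y in raw]     # normalized predicted probability density
print(round(trapz(density, xs), 10))  # 1.0
```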
Claims 2 and 22 are rejected under 35 U.S.C. 103 as being unpatentable over Rolfe et al. (US-20190244680) in view of Cope et al. (US-20140214391) as applied to claims 1, 3-5, 21, 23-25, 29, and 33-34 above, and further in view of Isayev et al. (US-20200168302).
Rolfe et al. in view of Cope et al. are applied to claims 1, 3-5, 21, 23-25, 29, and 33-34.
Regarding claims 2 and 22, Rolfe in view of Cope teaches the methods of claims 1 and 21 on which these claims respectively depend.
Neither Rolfe nor Cope explicitly teaches the deep learning neural network is selected from a recurrent neural network and a long short-term memory network and the loss function is a negative log-likelihood function.
However, Isayev teaches the deep learning neural network is selected from a recurrent neural network and a long short-term memory network and the loss function is a negative log-likelihood function (Para.0033 "FIG. 2 illustrates, on the left hand side, the scheme of a generative stack-augmented RNN time step. This model has two modes—training and generating. During training, the input token is a character of the currently processed SMILES string from the training set. The model outputs probability vector [p] of the next character given the prefix. A vector of parameters [theta] is optimized by a cross-entropy loss function minimization.", which reads on the limitation because an LSTM network is a type of RNN, and a "cross-entropy" loss function is equivalent to a negative log-likelihood function for categorical outputs).
Therefore, it would have been obvious to one of ordinary skill in the art as of the effective filing date of the claimed invention to modify the methods of Rolfe and Cope as taught by Isayev to use an LSTM-type RNN with a negative log-likelihood function in order to more properly model sequence-dependent data types (Isayev, Para. 0031 "Another weakness of regular recurrent neural networks is their inability to capture long term dependencies which leads to difficulties in generalizing to longer sequences. [] Therefore, Stack RNNs are a proper choice for modeling such sequence dependencies."). One skilled in the art would have a reasonable expectation of success because both methods are concerned with building neural network models for predicting desired properties of a biological system.
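The correspondence between a cross-entropy loss and a negative log-likelihood can be checked for the categorical (one-hot target) case; the probability values below are assumed for illustration:

```python
import math

def cross_entropy(p, target_onehot):
    """Cross-entropy between a one-hot target and predicted probabilities."""
    return -sum(t * math.log(q) for t, q in zip(target_onehot, p))

def nll(p, target_index):
    """Negative log-likelihood of the observed class under the prediction."""
    return -math.log(p[target_index])

p = [0.7, 0.2, 0.1]                 # predicted next-token probabilities (assumed)
assert math.isclose(cross_entropy(p, [0, 1, 0]), nll(p, 1))
print("equal")
```

For a one-hot target, every term of the cross-entropy vanishes except the observed class, leaving exactly the negative log-likelihood of that class.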
Claims 6-8 and 26-28 are rejected under 35 U.S.C. 103 as being unpatentable over Rolfe et al. (US-20190244680) in view of Cope et al. (US-20140214391) as applied to claims 1, 3-5, 21, 23-25, 29, and 33-34 above, and further in view of Chait et al. (US-20200286580).
Rolfe et al. in view of Cope et al. are applied to claims 1, 3-5, 21, 23-25, 29, and 33-34.
Regarding claims 6 and 26, Rolfe in view of Cope teaches the methods of claims 1 and 21 on which these claims respectively depend.
Neither Rolfe nor Cope explicitly teaches the biological system is selected from the group consisting of a cellular system, a biological collective, a synthetic gene circuit, and combinations thereof.
However, Chait teaches the biological system is selected from the group consisting of a cellular system, a biological collective, a synthetic gene circuit, and combinations thereof (Para.0004 "correct functioning of individual units in the context of the whole (synthetic) biomolecular network can be tested and its dynamic functionality assessed even before all parts of the network are implemented" and para.0010 "the output from the simulation provides input into the cellular part of the biological circuit via the controlling means; and cell state or environmental parameter measurements taken from the cell(s) provide input into the simulated part of the circuit").
Therefore, it would have been obvious to one of ordinary skill in the art as of the effective filing date of the claimed invention to modify the methods of Rolfe and Cope as taught by Chait in order to enable faster, cheaper, and better predictions of how a biological unit will perform if used in a larger network (Para.0004 "this enables better predictions of how a unit will perform if it is implemented into a larger network in situ and faster and more economical laboratory development cycle"). One skilled in the art would have a reasonable expectation of success because both methods are concerned with predicting phenotypes of biological systems.
Regarding claims 7-8 and 27-28, Rolfe in view of Cope teaches the methods of claims 1 and 21 on which these claims respectively depend. Chait also teaches: the input parameters are selected from the group consisting of cell growth rate, cell lysis rate, cell motility, gene expression, nutrient concentration, temperature, pH, activation rate, transcription rate, agar density, and combinations thereof; and the biological phenotype is selected from the group consisting of number of mRNA produced, number of amino acids, number of proteins, cellular growth, cellular adhesion, cellular sensing, fluorescence strength, optical density, chemical concentration, and combinations thereof (Para.0063 "Environmental parameters that may be measured include pH, temperature, light or fluorescence emission or wavelength frequency, oxygen saturation, or the presence or concentration of any secreted or excreted molecules or metabolites as described above."; Para.0061 "In some cases the presence or concentration of one or more metabolites or molecule produced and optionally secreted or excreted by the cells is measured. Such molecules might include nucleic acids, (for example DNA, mRNA, microRNA, and small interfering RNAs), proteins, antibodies, receptors, ligands, signaling molecules, protein complexes and toxins."
and para.0062 "Cell state parameters that may be measured include cell growth, cell division, reproduction, rate of cell growth, division, or reproduction, cell number, cell density, cell confluence, viability, respiration, cell morphology, cell shape, cell adhesion, spatial organization of tissues, metabolic condition, cell motility, cell movement, cytoskeletal arrangement, cytoplasmic movement, intracellular trafficking, electrophysiological state, firing times of neurons, degree of differentiation, expression of specific molecules such as ligands or receptors on the cell surface, receptor activation, pH and temperature", as cell lysis rate can be calculated from cell/optical density measurements over time, and transcription rate can be calculated from measurement of mRNA over time).
Response to Arguments under 35 USC § 103
Applicant's arguments filed 1/28/2026 have been fully considered but they are not persuasive.
Applicant asserts that claim 1 is patentable over the combination of the cited references (Araya, Isayev, and Barron) because they "fail to disclose or suggest the features recited in the amended claim 1" (Remarks 1/28/2026, pages 5-7). Examiner notes that new references are now cited as grounds for rejection under 35 USC § 103, rendering these arguments moot. However, Examiner also notes Applicant's clarifying remarks regarding the distinction between inputs and outputs (Remarks 1/28/2026, page 7), which have been taken into account during application of the newly cited art. Claims 21 and 29 are amended similarly and are also rejected over the newly cited art. Likewise, all other claims depend from claims 1, 21, or 29, and therefore are also rejected (see the section "Claim Rejections - 35 USC § 103" above for details).
Citation of Pertinent Prior Art
The prior art made of record and not relied upon is considered pertinent to Applicant's disclosure:
US-20160371432 teaches utilization of limited data sets for training statistical models.
Conclusion
No claims are allowed.
Inquiries
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Robert A. Player whose telephone number is 571-272-6350. The examiner can normally be reached Mon-Fri, 8am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Larry D. Riggs, can be reached at 571-270-3062. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/R.A.P./Examiner, Art Unit 1686
/LARRY D RIGGS II/Supervisory Patent Examiner, Art Unit 1686