DETAILED ACTION
Applicant’s response filed 09/25/2025 has been fully considered. The following rejections and/or objections are either reiterated or newly applied.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Status
Claims 21-22 are newly added by Applicant.
Claims 1-22 are currently pending and are herein under examination.
Claims 1-22 are rejected.
Claims 7 and 15 are objected to.
Priority
The instant application claims domestic benefit as a 371 national stage filing of PCT Patent Application No. PCT/US2020/017477, filed 02/10/2020, which claims domestic benefit of U.S. Provisional Application No. 62/803,092, filed 02/08/2019. The claims to domestic benefit are acknowledged for claims 1-22. As such, the effective filing date for claims 1-22 is 02/08/2019.
Information Disclosure Statement
The IDS filed 09/25/2025 follows the provisions of 37 CFR 1.97 and has been considered in full. A signed copy of the list of references cited from this IDS is included with this Office Action.
Drawings
The objections to the drawings are withdrawn in view of the drawings filed 09/25/2025.
The drawings filed 08/06/2021 are accepted.
Claim Objections
The objection to claim 14 is withdrawn in view of Applicant’s claim amendments.
Claims 7 and 15 are objected to because of the following informalities:
Claim 7, line 2, should have a comma after “devices”.
Claim 15, line 4, recites the phrase “chemical structure” which should have either “a” or “the” in front of the phrase.
Appropriate correction is required.
Withdrawn Rejections
35 USC 112(b)
The rejection of claims 3 and 15 is withdrawn in view of Applicant’s claim amendments.
Claim Rejections - 35 USC § 112
35 USC 112(b)
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-22 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
This rejection is newly recited and is necessitated by claim amendment.
Claim 1, line 8, recites the phrase “the chemical structure of the selected molecule” which lacks antecedent basis. To overcome this rejection, it is suggested to amend the phrase to “a chemical structure of a selected molecule”.
Claim 1, lines 10-11, recites the phrase “the prediction data indicative of the one or more predicted olfactory properties of the selected molecule” which lacks antecedent basis. For examination purposes, claim 1 is being interpreted to mean that the GNN actively predicts olfactory properties of molecules, wherein prediction data indicative of the predicted olfactory properties of a selected molecule comprises numerical embeddings. To overcome this rejection, it is suggested to move this phrase to the end of line 15, because the phrase appears intended to further limit the received prediction data, or alternatively to provide antecedent basis for the phrase.
Furthermore, claims 2-11 and 21 are also rejected because they depend on claim 1, which is rejected, and because they do not resolve the issue of indefiniteness.
Claim 11, line 6, recites the phrase “the numerical embedding” which renders the claim indefinite. It is unclear if this phrase refers to “a numerical embedding” in claim 11, line 3, or if it refers to “a numerical embedding expressed in a numerical odor embedding space” in claim 1, line 12. To overcome this rejection, it is suggested to specify to which the phrase refers.
Claim 12, line 10, recites the phrase “the chemical structure” which lacks antecedent basis. To overcome this rejection, it is suggested to amend the phrase to “a chemical structure”.
Claim 12, line 13, recites the phrase “the prediction data indicative of the one or more predicted olfactory properties of the selected molecule” which lacks antecedent basis. For examination purposes, claim 12 is being interpreted to mean that the GNN actively predicts olfactory properties of molecules, wherein prediction data indicative of the predicted olfactory properties of a selected molecule comprises numerical embeddings. To overcome this rejection, it is suggested to move this phrase to the end of line 17, because the phrase appears intended to further limit the received prediction data, or alternatively to provide antecedent basis for the phrase.
Furthermore, claims 13-20 and 22 are also rejected because they depend on claim 12, which is rejected, and because they do not resolve the issue of indefiniteness.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-22 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea and a natural phenomenon without significantly more.
Any newly recited portions herein are necessitated by claim amendments.
Step 2A, Prong 1:
In accordance with MPEP § 2106, claims found to recite statutory subject matter (Step 1: YES) are then analyzed to determine if the claims recite any concepts that equate to an abstract idea, law of nature or natural phenomena (Step 2A, Prong 1). In the instant application, claims 1-11 and 21 recite a method and claims 12-20 and 22 recite a system. The instant claims recite the following limitations that equate to one or more categories of judicial exception:
Claim 1 recites “providing … a graph that graphically describes the chemical structure of the selected molecule … , wherein the prediction data indicative of the one or more predicted olfactory properties of the selected molecule comprises a numerical embedding expressed in a numerical odor embedding space;”
Claims 2 and 13 recite “wherein the machine-learned graph neural network has been trained by: … training, by the one or more computing devices, the machine-learned graph neural network to predict olfactory properties of molecules based in part on the obtained training data.”
Claims 3 and 15 recite “generating … visualization data descriptive of a relative importance of one or more structural units of chemical structure of the selected molecule to the predicted olfactory properties associated with the selected molecule, relative to importance of one or more other structural units of the chemical structure of the selected molecule to the predicted olfactory properties associated with the selected molecule”
Claims 4 and 14 recite “generating … data indicative of how a structural change to the chemical structure of the selected molecule affects the predicted olfactory properties associated with the selected molecule.”
Claims 5 and 16 recite “wherein the prediction data indicative of the one or more olfactory properties of the selected molecule comprises an intensity of a particular olfactory property”.
Claims 6 and 17 recite “determining … one or more olfactory differences between the selected molecule and the second selected molecule based on a comparison of the prediction data for the selected molecule with the second prediction data for the second selected molecule.”
Claim 7 recites “determining . . . data indicative of one or more of: optical properties of the selected molecule; gustatory properties of the selected molecule; biodegradability of the selected molecule; stability of the selected molecule; or toxicity of the selected molecule.”
Claim 8 recites “wherein the graph that graphically describes the chemical structure of the selected molecule comprises a two dimensional graph structure indicative of a two-dimensional representation of the chemical structure of the selected molecule.”
Claims 9 and 20 recite “wherein the method further comprises performing … one or more quantum chemical calculations to identify the three-dimensional representation of the chemical structure of the selected molecule.”
Claim 10 recites “performing … an iterative search process to identify an additional molecule that exhibits one or more desired olfactory properties, wherein the iterative search process comprises, for each of a plurality of iterations: generating … a candidate molecule graph that graphically describes a candidate chemical structure of a candidate molecule; comparing … the one or more predicted olfactory properties of the candidate molecule to the one or more desired olfactory properties.”
Claim 11 recites “wherein the prediction data indicative of the one or more predicted olfactory properties of the selected molecules comprises a numerical embedding; and the method further comprises identifying … other molecules that have olfactory properties that are similar to the predicted olfactory properties of the selected molecule by comparing the numerical embedding with other numerical embeddings output for the other molecules ...”
Claim 12 recites “providing graph data representative of the chemical structure … , wherein the prediction data indicative of the one or more predicted olfactory properties of the selected molecule comprises a numerical embedding expressed in a numerical odor embedding space;”
Claim 18 recites “determining, based at least in part on graph data representative of the chemical structure, data indicative of one or more of: optical properties of the selected molecule; gustatory properties of the selected molecule; biodegradability of the selected molecule; stability of the selected molecule; or toxicity of the selected molecule.”
Claim 19 recites “wherein the graph data representative of the chemical structure of the selected molecule comprises a graph structure indicative of a two-dimensional structure of the selected molecule.”
Claims 21 and 22 recite “determining … a measure of distance between the numerical embedding predicted for the selected molecule and one or more other numerical embeddings in the numerical odor embedding space that have been output by the machine-learned graph neural network for one or more other molecules; and identifying at least one of the other molecules that has olfactory properties that are similar to the predicted olfactory properties of the selected molecule based on the determined measure of distance between the numerical embedding predicted for the selected molecule and the one or more other numerical embeddings output for the one or more other molecules by the machine-learned graph neural network.”
Limitations reciting a mental process.
The above cited limitations in claims 1, 3-4, 6-7, 10-12, 14-15, 17-18 and 21-22 are recited at such a high level of generality that they equate to a mental process because they are similar to the concepts of collecting information, analyzing it, and displaying certain results of the collection and analysis in Electric Power Group, LLC, v. Alstom (830 F.3d 1350, 119 USPQ2d 1739 (Fed. Cir. 2016)), which the courts have identified as concepts that can be practically performed in the human mind. The paragraph below describes the limitations in these claims that recite a mental process under their broadest reasonable interpretation (BRI).
The BRI of using a graphical chemical structure of a molecule to predict olfactory properties includes using a 2D acyclic graph of a chemical structure to identify functional groups in the molecule that have known associated odors. A human could practically generate visualization data describing the importance of structural units of a chemical structure, as its BRI includes analyzing the chemical structure and determining which units are important for olfactory properties by searching the literature for structural units that correspond to olfactory properties. Similarly, a human could generate data indicative of how a structural change affects a predicted olfactory property, as its BRI includes altering the structure of a chemical and then re-evaluating the structure by comparing the altered structure to known structural units that have an olfactory property. A human could practically compare predicted olfactory properties between molecules, as its BRI includes observing that one molecule is labeled as having a citrus odor whereas another molecule has a vanilla odor. A human could practically determine the toxicity of a molecule, as its BRI includes analyzing the structure of the chemical and comparing its structure to molecules that have known toxic structural units. A human could practically iteratively search a list of molecules, select one, create a molecular graph, and compare its predicted properties to desired olfactory properties, as its BRI includes selecting molecules, drawing a graph representing the molecule, and making comparisons. A human could also practically compare numerical embeddings of a molecule, as its BRI includes comparing numbers and identifying numbers that match or are similar. As such, these limitations equate to a mental process.
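As a purely illustrative aside (not drawn from the instant application or claims, with hypothetical function names and example values), the comparison of numerical embeddings characterized above as a mental process reduces to elementary arithmetic on short lists of numbers:

```python
import math

def euclidean_distance(a, b):
    """Elementary distance between two equal-length numerical embeddings."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical embeddings for a query molecule and two candidates
# (all values are illustrative only, not taken from the application).
query = [0.9, 0.1, 0.4]
candidates = {
    "molecule_A": [0.8, 0.2, 0.5],
    "molecule_B": [0.1, 0.9, 0.0],
}

# "Identifying" the most similar molecule is simply selecting the
# smallest of a handful of computed numbers.
nearest = min(candidates, key=lambda name: euclidean_distance(query, candidates[name]))
```

Each step here (squaring differences, summing, taking a square root, picking the smallest result) is arithmetic of the kind that could in principle be carried out with pencil and paper.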
Limitations reciting a mathematical concept.
The above cited limitations in claims 1-2, 9, 12-13 and 20-22 equate to a mathematical concept because they are similar to the concepts of organizing and manipulating information through mathematical correlations in Digitech Image Techs., LLC v. Electronics for Imaging, Inc. (758 F.3d 1344, 111 U.S.P.Q.2d 1717 (Fed. Cir. 2014)), which the courts have identified as mathematical concepts. The BRI of performing quantum chemical calculations to determine a 3D chemical structure includes solving the Schrödinger equation or applying the Born-Oppenheimer approximation, density functional theory, Hartree-Fock theory, or post-Hartree-Fock methods, all of which are equations and necessitate calculations. The BRI of training a machine-learned graph neural network includes performing calculations using equations such as backpropagation, which equates to a mathematical equation/calculation. This interpretation is reinforced by the specification in para. [47]. The BRI of numerical embeddings in an embedding space equates to a mathematical relationship because it equates the structure of a molecule to a numerical representation in the form of vectors. As such, these limitations equate to a mathematical calculation/equation.
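As a purely illustrative aside (not drawn from the instant application), the characterization above of training as a series of calculations can be sketched with a minimal, hypothetical one-parameter example, in which each gradient/backpropagation step is an ordinary algebraic expression followed by an update:

```python
# Hypothetical one-parameter model y = w * x with squared-error loss
# L = (w*x - y)**2.  The gradient dL/dw = 2 * x * (w*x - y) is an
# ordinary algebraic expression; "training" is this calculation
# repeated, followed by an update each time.
def train_step(w, x, y, lr=0.1):
    grad = 2 * x * (w * x - y)  # analytic derivative of the loss
    return w - lr * grad        # gradient-descent update

w = 0.0
for _ in range(50):
    w = train_step(w, x=1.0, y=2.0)  # fit y = 2x from a single example
# w converges toward 2.0
```

The sketch shows only that each training iteration is an equation evaluated and applied; it is not a representation of the claimed graph neural network.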
Limitations reciting a law of nature.
Regarding the above cited limitations in claims 1-2, 7, 12-13 and 18, these limitations equate to a law of nature because they describe naturally occurring principles/relations (i.e., olfactory/gustatory properties of a molecule derived from its chemical structure).
Limitations included in the recited judicial exception.
The above cited limitations in claims 5, 8, 11, 16 and 19 further limit the prediction data, the graph, or the graph data, which are part of the judicial exception; such further limitations do not change the fact that these elements remain part of the judicial exception.
As such, claims 1-22 recite an abstract idea and a natural phenomenon (Step 2A, Prong 1: Yes).
Step 2A, Prong 2:
Claims found to recite a judicial exception under Step 2A, Prong 1 are then further analyzed to determine if the claims as a whole integrate the recited judicial exception into a practical application or not (Step 2A, Prong 2). The judicial exception is not integrated into a practical application because the claims do not recite additional elements that reflect an improvement to a computer, technology, or technical field (MPEP § 2106.04(d)(1) and 2106.05(a)), require a particular treatment or prophylaxis for a disease or medical condition (MPEP § 2106.04(d)(2)), implement the recited judicial exception with a particular machine that is integral to the claim (MPEP § 2106.05(b)), effect a transformation or reduction of a particular article to a different state or thing (MPEP § 2106.05(c)), nor provide some other meaningful limitation (MPEP § 2106.05(e)). Rather, the claims include limitations that equate to an equivalent of the words “apply it” and/or to instructions to implement an abstract idea on a computer (MPEP § 2106.05(f)), insignificant extra-solution activity (MPEP § 2106.05(g)), and field of use limitations (MPEP § 2106.05(h)). The instant claims recite the following additional elements:
Claim 1 recites “A computer-implemented method, the method comprising … , by one or more computing devices, … as input to a machine-learned graph neural network that has been trained to predict olfactory properties of molecules based at least in part on chemical structure data associated with the molecules … ; receiving, by the one or more computing devices, prediction data descriptive of one or more predicted olfactory properties of the selected molecule as an output of the machine-learned graph neural network; and providing, by the one or more computing devices, the prediction data descriptive of the one or more predicted olfactory properties of the selected molecule as an output.”
Claims 2-11 and 21 recite “the computer-implemented method of 1,”
Claims 2-4, 6-7, 9-11 and 21 recite “by the one or more computing devices”
Claims 2 and 13 recite “wherein the machine-learned graph neural network has been trained by: obtaining, by the one or more computing devices, training data comprising a plurality of example chemical structures, each example chemical structure labeled with one or more olfactory property labels that describe olfactory properties of the example chemical structure;”
Claims 3 and 15 recite “providing, by the one or more computing devices, the visualization data in association with the prediction data indicative of the one or more olfactory properties.”
Claims 6 and 17 recite “obtaining, by the one or more computing devices, a second graph that graphically describes a second chemical structure of a second selected molecule; providing, by the one or more computing devices, the second graph that graphically describes the second chemical structure of the second selected molecule as input to the machine-learned graph neural network; receiving, by the one or more computing devices, second prediction data descriptive of one or more second olfactory properties associated with the second selected molecule as an output of the machine-learned graph neural network;”
Claim 7 recites “… by the one or more computing devices through input of the graph that graphically describes the chemical structure of the selected molecule into the machine-learned graph neural network or an additional machine-learned graph neural network …”
Claims 9 and 20 recite “wherein the graph that graphically describes the chemical structure of the selected molecule comprises a three-dimensional graph structure indicative of a three-dimensional representation of the chemical structure of the selected molecule, and …”
Claim 10 recites “providing, by the one or more computing devices, the candidate molecule graph that graphically describes the candidate chemical structure of the candidate molecule as input to the machine-learned graph neural network; receiving, by the one or more computing devices, prediction data descriptive of one or more predicted olfactory properties of the candidate molecule as an output of the machine-learned graph neural network; and”
Claim 11 recites “… by the machine-learned graph neural network”
Claim 12 recites “A computing device, comprising: one or more processors; and one or more non-transitory computer-readable media that store instructions that, when executed by the one or more processors, cause the computing device to perform operations, the operations comprising: … as input to a machine-learned graph neural network that has been trained to predict olfactory properties of molecules based at least in part on chemical structure data associated with the molecules … receiving prediction data descriptive of one or more olfactory properties associated with the selected molecule as an output of the machine-learned graph neural network; and providing the prediction data descriptive of the one or more predicted olfactory properties of the selected molecule as an output.”
Claims 13-20 and 22 recite “the computing device of claim 12”
Regarding the above cited limitations in claims 1-22 of a computer-implemented method performed by one or more computing devices, and of a computing device comprising a processor and non-transitory computer-readable media storing instructions: there are no limitations requiring that the one or more computing devices, or the computing device comprising a processor and non-transitory computer-readable media, be anything other than a generic computer and/or generic computing system. Therefore, these limitations equate to mere instructions to implement an abstract idea on a generic computer, which the courts have established does not render an abstract idea eligible in Alice Corp., 573 U.S. at 223, 110 USPQ2d at 1983.
Regarding the above cited limitations in claims 1-3, 6, 10, 12-13, 15 and 17 of obtaining, providing and receiving, these limitations equate to insignificant extra-solution activity of mere data gathering and outputting, which does not provide a practical application.
Regarding the above cited limitations in claims 1, 7 and 11-12 of “as input to a machine-learned graph neural network that has been trained to predict olfactory properties of molecules based at least in part on chemical structure data associated with the molecules”, “by the machine-learned graph neural network” and “through input of the graph that graphically describes the chemical structure of the selected molecule into the machine-learned graph neural network or an additional machine-learned graph neural network”: these limitations provide nothing more than mere instructions to implement an abstract idea on a generic computer (see MPEP 2106.05(f)). MPEP 2106.05(f) provides the following considerations for determining whether a claim simply recites a judicial exception with the words “apply it” (or an equivalent), such as mere instructions to implement an abstract idea on a computer: (1) whether the claim recites only the idea of a solution or outcome, i.e., the claim fails to recite details of how a solution to a problem is accomplished; (2) whether the claim invokes computers or other machinery merely as a tool to perform an existing process; and (3) the particularity or generality of the application of the judicial exception. The paragraphs below analyze these limitations with respect to these considerations.
The machine-learned graph neural network performs the abstract ideas of generating “prediction data indicative of the one or more predicted olfactory properties of the selected molecule” in claims 1 and 12, “determining … data indicative of one or more of …” in claim 7, and “identifying … other molecules that have olfactory properties that are similar to the predicted olfactory properties of the selected molecule by comparing the numerical embedding with other numerical embeddings output for the other molecules” in claim 11. The machine-learned graph neural network is used to generally apply the abstract idea without placing any limits on how the machine-learned graph neural network functions. Rather, performing these abstract ideas with the machine-learned graph neural network only recites the outcome of generating “prediction data”, of “determining … data indicative of one or more of …”, and of “identifying … other molecules that have olfactory properties that are similar to the predicted olfactory properties of the selected molecule by comparing the numerical embedding with other numerical embeddings output for the other molecules”, but does not include any details about how the generating, determining, and comparing are accomplished. See MPEP 2106.05(f).
Furthermore, the recitations of “as input to a machine-learned graph neural network that has been trained to predict olfactory properties of molecules based at least in part on chemical structure data associated with the molecules”, “by the machine-learned graph neural network”, and “through input of the graph that graphically describes the chemical structure of the selected molecule into the machine-learned graph neural network or an additional machine-learned graph neural network” merely indicate a field of use or technological environment in which the judicial exception is performed. Although these additional elements limit the identified judicial exceptions, these limitations merely confine the use of the abstract idea to a particular technological environment (graph neural networks) and thus fail to add an inventive concept to the claims. See MPEP 2106.05(h).
As such, claims 1-22 are directed to an abstract idea and a natural phenomenon (Step 2A, Prong 2: No).
Step 2B:
Claims found to be directed to a judicial exception are then further evaluated to determine if the claims recite an inventive concept that provides significantly more than the judicial exception itself (Step 2B). These claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because these claims recite additional elements that equate to instructions to apply the recited exception in a generic way and/or in a generic computing environment (MPEP § 2106.05(f)) and to well-understood, routine and conventional (WURC) limitations (MPEP § 2106.05(d)). The instant claims recite the following additional elements:
Claim 1 recites “A computer-implemented method, the method comprising … , by one or more computing devices, … as input to a machine-learned graph neural network that has been trained to predict olfactory properties of molecules based at least in part on chemical structure data associated with the molecules … receiving, by the one or more computing devices, prediction data descriptive of one or more predicted olfactory properties of the selected molecule as an output of the machine-learned graph neural network; and providing, by the one or more computing devices, the prediction data descriptive of the one or more predicted olfactory properties of the selected molecule as an output.”
Claims 2-11 and 21 recite “the computer-implemented method of 1,”
Claims 2-4, 6-7, 9-11 and 21 recite “by the one or more computing devices”
Claims 2 and 13 recite “wherein the machine-learned graph neural network has been trained by: obtaining, by the one or more computing devices, training data comprising a plurality of example chemical structures, each example chemical structure labeled with one or more olfactory property labels that describe olfactory properties of the example chemical structure;”
Claims 3 and 15 recite “providing, by the one or more computing devices, the visualization data in association with the prediction data indicative of the one or more olfactory properties.”
Claims 6 and 17 recite “obtaining, by the one or more computing devices, a second graph that graphically describes a second chemical structure of a second selected molecule; providing, by the one or more computing devices, the second graph that graphically describes the second chemical structure of the second selected molecule as input to the machine-learned graph neural network; receiving, by the one or more computing devices, second prediction data descriptive of one or more second olfactory properties associated with the second selected molecule as an output of the machine-learned graph neural network;”
Claim 7 recites “… by the one or more computing devices through input of the graph that graphically describes the chemical structure of the selected molecule into the machine-learned graph neural network or an additional machine-learned graph neural network …”
Claims 9 and 20 recite “wherein the graph that graphically describes the chemical structure of the selected molecule comprises a three-dimensional graph structure indicative of a three-dimensional representation of the chemical structure of the selected molecule, and …”
Claim 10 recites “providing, by the one or more computing devices, the candidate molecule graph that graphically describes the candidate chemical structure of the candidate molecule as input to the machine-learned graph neural network; receiving, by the one or more computing devices, prediction data descriptive of one or more predicted olfactory properties of the candidate molecule as an output of the machine-learned graph neural network; and”
Claim 11 recites “… by the machine-learned graph neural network”
Claim 12 recites “A computing device, comprising: one or more processors; and one or more non-transitory computer-readable media that store instructions that, when executed by the one or more processors, cause the computing device to perform operations, the operations comprising: … as input to a machine-learned graph neural network that has been trained to predict olfactory properties of molecules based at least in part on chemical structure data associated with the molecules … receiving prediction data descriptive of one or more olfactory properties associated with the selected molecule as an output of the machine-learned graph neural network; and providing the prediction data descriptive of the one or more predicted olfactory properties of the selected molecule as an output.”
Claims 13-20 and 22 recite “the computing device of claim 12”.
Regarding the above cited limitations in claims 1-22 of a computer-implemented method performed by one or more computing devices, and of a computing device comprising a processor and non-transitory computer-readable media storing instructions: these limitations equate to instructions to implement an abstract idea in a generic computing environment, which the courts have established does not provide an inventive concept in Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015).
Regarding the above cited limitations in claims 1-3, 6, 10, 12-13, 15 and 17 of obtaining, providing and receiving, these limitations equate to receiving/transmitting data over a network, which the courts have established as a WURC limitation of a generic computer in buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014). Regarding the above cited limitation in claim 11 of wherein the prediction data comprises a numerical embedding, this limitation also equates to transmitting/receiving data over a network because it limits the type of data but does not change the fact that data is being transmitted/received.
Regarding the above cited limitations in claims 1, 7 and 11-12 of “as input to a machine-learned graph neural network that has been trained to predict olfactory properties of molecules based at least in part on chemical structure data associated with the molecules”, “by the machine-learned graph neural network”, and “through input of the graph that graphically describes the chemical structure of the selected molecule into the machine-learned graph neural network or an additional machine-learned graph neural network”: as discussed in the Step 2A, Prong 2 section above, these limitations equate to instructions to “apply” the abstract idea, which cannot provide an inventive concept. See MPEP 2106.05(f).
When these additional elements are considered individually and in combination, they do not provide an inventive concept because they all equate to WURC functions/components of a generic computer and/or generic computing system, as well as mere instructions to apply the abstract idea. Therefore, these additional elements do not transform the claimed judicial exception into a patent-eligible application of the judicial exception and do not amount to significantly more than the judicial exception itself (Step 2B: No).
As such, claims 1-22 are not patent eligible.
Response to Arguments under 35 USC 101
Applicant's arguments filed 09/25/2025 have been fully considered but they are not persuasive.
Applicant argues that, because independent claims 1 and 12 were found to be patent-eligible in the previous Office action, the dependent claims should also be patent-eligible (pg. 12, para. 2 of Applicant’s remarks). Applicant’s remarks are not persuasive for the following reasons:
Claims 1 and 12 now recite a judicial exception. The amendments to claims 1 and 12 appear to require actively predicting olfactory properties of molecules using a graphical representation of chemical structure, which has been identified above as reciting an abstract idea. Regardless, even if an independent claim is eligible, a dependent claim may be found ineligible. See MPEP 2106.07, which recites: “even if an independent claim is determined to be eligible, a dependent claim may be ineligible because it adds a judicial exception without also adding limitations that integrate the judicial exception or provide significantly more. Thus, each claim in an application should be considered separately based on the particular elements recited therein.”
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-2, 6-8, 11-14 and 17-19 are rejected under 35 U.S.C. 103 as being unpatentable over Tran et al. (“Tran”; DeepNose: Using Artificial Neural Networks to Represent the Space of Odorants; published 11/07/2018; previously cited on PTO892 mailed 06/27/2025) in view of Duprat et al. (“Duprat”; ref. on IDS filed 08/06/2021; In 2014 IEEE International Workshop on Machine Learning for Signal Processing (MLSP), pp. 1-6. IEEE, 2014; previously cited).
Any newly recited portions herein are necessitated by claim amendment.
The bold and italicized text below sets forth the limitations of the instant claims, and the plain text serves to map the prior art onto the instant claims.
Claims 1 and 12:
A computer-implemented method, the method comprising:
A computing device, comprising: one or more processors; and one or more non-transitory computer-readable media that store instructions that, when executed by the one or more processors, cause the computing device to perform operations, the operations comprising:
Tran trains artificial neural networks to represent the chemical space of odorants and uses that representation to predict human olfactory percepts (abstract). Tran’s method is computer-implemented, as evidenced by its accessing of databases such as PubChem3D and ESOL (Table 1).
providing, by one or more computing devices, a graph that graphically describes the chemical structure of the selected molecule as input to a machine-learned graph neural network that has been trained to predict olfactory properties of molecules based at least in part on chemical structure data associated with the molecules,
Tran discloses a DeepNose autoencoder that contains two convolutional neural networks (pg. 7, col. 1, para. 5). DeepNose was trained on the molecular structure of chemicals, which are represented as 3D objects in a grid (pg. 7, col. 1, para. 4 – col. 2, para. 1). Figure 1 shows the input into the model. Figures 4 and 5 show that DeepNose predicts olfactory properties of the input molecules (pg. 7, col. 2, para. 2) (abstract).
However, Tran does not disclose using a graphical representation of the chemical structure as input into a graph neural network (GNN).
Duprat discloses that GNNs can predict an activity or property of interest of a molecule by using the structure of the molecule described as a graph (pg. 1, col. 1, para. 2). Duprat’s GNN is capable of predicting physical and chemical properties of a molecule (pg. 2, col. 1, para. 1; Table 2). Duprat states that their GNN achieves reasonable and excellent performance for prediction of properties (pg. 7, col. 1, para. 2).
wherein the prediction data indicative of the one or more predicted olfactory properties of the selected molecule comprises a numerical embedding expressed in a numerical odor embedding space;
Tran teaches representing the space of molecules using the autoencoder. Tran recites: “Autoencoder consists of two convolutional neural networks: an encoder, which [compresses] molecular structure into a feature vector[,] and a decoder to recover the structure[. We] restricted the feature vector to have a lower dimensionality than the input, and therefore induced the network to learn a latent representation [that] captures the most statistically salient information in structures. We refer to this latent representation as ‘DeepNose features’” (pg. 2, col. 2, last para.) (pg. 7, col. 1, last para.) (Figure 2A). These DeepNose features represent variables relevant to human olfaction (pg. 6, col. 2, para. 1) and were used to predict odorant percepts using the Flavornet and Good Scents datasets (pg. 6, col. 2, para. 1-2).
receiving, by the one or more computing devices, prediction data descriptive of one or more predicted olfactory properties of the selected molecule as an output of the machine-learned graph neural network; and providing, by the one or more computing devices, the prediction data descriptive of the one or more predicted olfactory properties of the selected molecule as an output.
Tran states that the DeepNose classifier was used to predict bioactivities of molecules such as perceptual coordinates (pg. 7, col. 2, para. 2). Figure 3A shows the DeepNose classifier structure, which uses 3D molecular structure as input and outputs olfactory properties of the molecules.
However, Tran does not teach that DeepNose is a GNN.
As stated above, Duprat discloses a GNN capable of predicting properties of molecules using graphical representations of the molecules (abstract).
Claims 2 and 13:
obtaining, by the one or more computing devices, training data comprising a plurality of example chemical structures, each example chemical structure labeled with one or more olfactory property labels that describe olfactory properties of the example chemical structure;
Tran discloses accessing the Flavornet and Good Scents databases, which contain information on human responses to a large number of molecules that are represented by semantic descriptors (pg. 3, col. 2, last para.).
training, by the one or more computing devices, the machine-learned graph neural network to predict olfactory properties of molecules based in part on the obtained training data.
Tran trained the encoder layers and consolidation layers of DeepNose with the Flavornet and Good Scents datasets (pg. 4, col. 2, last para.; Figure 3A). It was found that the DeepNose autoencoder is able to automatically discover useful chemical features from 3D images of molecules alone for predicting human olfactory percepts (pg. 5, col. 1, para. 1). Tran also states that the DeepNose classifier was trained to minimize the difference between the target bioactivity and the network output using backpropagation (pg. 7, col. 2, last para.).
However, Tran does not disclose that DeepNose is a GNN.
Duprat discloses that GNNs can predict an activity or property of interest of a molecule by using the structure of the molecule described as a graph (pg. 1, col. 1, para. 2). Duprat’s GNN is capable of predicting physical and chemical properties of a molecule (pg. 2, col. 1, para. 1; Table 2). Duprat states that their GNN achieves reasonable and excellent performance for prediction of properties (pg. 7, col. 1, para. 2).
Regarding claims 6 and 17, these claims perform the same operations as disclosed in claim 1, which have been taught by Tran and Duprat; the only difference is using a second molecule and then comparing the olfactory predictions between the first and second molecules. Since Tran discloses performing the operations of claim 1 on many molecules, as shown in Figures 4 and 5, the teachings applied to claim 1 also apply to claims 6 and 17. Regarding determining an olfactory difference between molecules by comparing their olfactory predictions, this comparison is inherent when producing olfactory predictions for molecules that have different olfactory predictions.
Prima facie case of obviousness for claims 1-2, 6, 12-13 and 17:
An invention would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the instant invention if there was (i) a finding that the prior art contained a "base" method upon which the claimed invention can be seen as an improvement, (ii) a finding that the prior art contained a comparable method that was improved in the same way as the claimed invention, and (iii) a finding that one of ordinary skill could have applied the known "improvement" technique in the same way to the "base" method and that the results would have been predictable.
One of ordinary skill could have modified the base method of Tran for predicting odor percepts using chemical structure information and chemical descriptors with a convolutional autoencoder by applying the known improvement technique of using a GNN and a graphical representation of a chemical structure to predict physical properties of the molecule, as taught by Duprat. Tran uses DeepNose features, which are chemical descriptors, to predict odor percepts (abstract), and Duprat recites “in order to circumvent the problems related to descriptor design and selection, methods for performing prediction or classification directly from a graph representation were developed” (pg. 1, col. 2, para. 3). Therefore, the known improvement of using graphs and a GNN to perform predictions and classifications could have been applied to Tran.
Furthermore, one of ordinary skill in the art would have been able to apply a GNN to chemical structures to predict olfactory properties because a GNN is capable of predicting physical properties of a molecule based on its structure, as taught by Duprat, wherein the modification would have resulted in the GNN predicting other physical properties of a molecule such as olfactory properties, which was achieved by the neural network of Tran.
Claims 7 and 18:
Regarding claims 7 and 18, Tran discloses predicting olfactory properties of molecules using an autoencoder (abstract). However, Tran does not teach predicting stability of a molecule using a graph that represents the molecule’s structure as input into a trained GNN. Duprat discloses using a trained GNN (machine-learned graph neural network) to predict the stability constant of chelates as contrast agents in MRI (determining data indicative of stability of the selected molecule) (sections 4-4.1). The chemical structures of the molecules are described as graphs (abstract).
It would have been prima facie obvious to one of ordinary skill in the art to have substituted the physical property prediction of water solubility of Tran with the prediction of molecule stability as taught by Duprat (section 4.1) because stability is another type of physico-chemical property of a molecule. The result of substituting these components would have yielded predictable results because a GNN is capable of predicting physical properties of a molecule based on its structure, as taught by Duprat, wherein the substitution would have reasonably resulted in the GNN predicting other physical properties of a molecule such as molecule stability.
Claims 8 and 19:
Regarding claims 8 and 19, Tran uses 3D grids of molecules to represent chemical structure (abstract), but Tran does not teach using a 2D graph. Duprat shows in Figure 1 the steps used to create the structure of a graph machine, where a Simplified Molecular Input Line Entry Specification (SMILES) string is used as a planar representation, which is then turned into an undirected, cyclic, or acyclic graph with appropriately labeled nodes (pg. 2, col. 1, para. 2). Thus, the input to the GNN is the 2-D graph shown in Figure 1.
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the instant invention to have modified the base method of Tran for using the DeepNose autoencoder and 3D grid of chemical structure to predict olfactory properties by using the improved techniques of the GNN and 2-D graph of Duprat for predicting properties of molecules using chemical structure, because Duprat states that a GNN overcomes problems associated with using chemical descriptors (pg. 1, col. 2, para. 3), wherein the DeepNose autoencoder of Tran uses chemical descriptors defined as DeepNose features (abstract). The result of using the improved technique would have yielded predictable results because a GNN is capable of predicting physical properties of a molecule based on a 2-D graphical representation of the molecule’s structure, as taught by Duprat, wherein the improved technique would have reasonably resulted in the GNN predicting other physical properties of a molecule such as olfactory properties. There would have been a reasonable expectation for the GNN of Duprat to predict olfactory properties because it already can predict properties of the molecule, wherein using the dataset of Tran would have enabled the GNN to also predict olfactory properties.
Claim 11:
Tran discloses using olfactory datasets to embed the space of olfactory perceptual objects, wherein each molecule in the dataset contains a number of semantic descriptors (pg. 3, col. 2, last para.). Random subsets were selected from each dataset and low-dimensional embedding was performed for each subset, wherein molecules that share semantic descriptors are spatially grouped together (Figures 4A and 5A; pg. 4, col. 2, para. 2).
Claims 4 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Tran et al. (“Tran”; DeepNose: Using Artificial Neural Networks to Represent the Space of Odorants; published 11/07/2018; previously cited on PTO892 mailed 06/27/2025) in view of Duprat et al. (“Duprat”; ref. on IDS filed 08/06/2021; In 2014 IEEE International Workshop on Machine Learning for Signal Processing (MLSP), pp. 1-6. IEEE, 2014), as applied above to claims 1 and 12, and in further view of Schmuker et al. (“Schmuker”; Chemistry Central Journal 1 (2007): 1-10; previously cited on the PTO892 mailed 06/27/2025).
This rejection is maintained from the previous office action.
The limitations of claims 1 and 12 have been taught in the rejection above by Tran and Duprat.
Regarding claims 4 and 14, Tran discloses predicting olfactory properties of molecules using a 3D grid representative of chemical structure (abstract). However, neither Tran nor Duprat discloses data indicative of how a structural change to a chemical affects a predicted olfactory property.
Schmuker discloses predicting olfactory neuron response by using physicochemical molecule descriptors and artificial neural networks (abstract). Schmuker discloses identifying strongly varying descriptors by generating multiple conformers of all odorants using MOE’s stochastic conformer generation functionality, resulting in nine conformers per molecule. For each descriptor, the variance over all conformers of an odorant was calculated and scaled (pg. 7, col. 1, para. 5 – col. 2, para. 1).
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the instant invention to have modified the method of Tran and Duprat for predicting olfactory properties of molecules by using 3D chemical structure and descriptors with the method of Schmuker for pruning unsuitable descriptors that vary among conformers of a molecule. The motivation for doing so is provided by Schmuker, who states that some descriptors depend on the 3-D conformation of a molecule, which can lead to inconsistent modeling results for different conformations (pg. 7, col. 1, para. 5). One of ordinary skill in the art would have had a reasonable expectation of success for this combination because all three references use neural networks, 3D representations of chemical structure, and molecular descriptors of odorants.
Claims 3, 5 and 15-16 are rejected under 35 U.S.C. 103 as being unpatentable over