DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1, 4-8, and 11-14 have been examined.
Legibility of Amendment
Applicant is reminded of 37 CFR 1.52(a):
(a) Papers that are to become a part of the permanent United States Patent and Trademark Office records in the file of a patent application, or a reexamination or supplemental examination proceeding.
(1) All papers, other than drawings, that are submitted on paper or by facsimile transmission, and are to become a part of the permanent United States Patent and Trademark Office records in the file of a patent application or reexamination or supplemental examination proceeding, must be on sheets of paper that are the same size, not permanently bound together, and:
(i) Flexible, strong, smooth, non-shiny, durable, and white;
(ii) Either 21.0 cm by 29.7 cm (DIN size A4) or 21.6 cm by 27.9 cm (8 1/2 by 11 inches), with each sheet including a top margin of at least 2.0 cm (3/4 inch), a left side margin of at least 2.5 cm (1 inch), a right side margin of at least 2.0 cm (3/4 inch), and a bottom margin of at least 2.0 cm (3/4 inch);
(iii) Written on only one side in portrait orientation;
(iv) Plainly and legibly written either by a typewriter or machine printer in permanent dark ink or its equivalent; and
(v) Presented in a form having sufficient clarity and contrast between the paper and the writing thereon to permit the direct reproduction of readily legible copies in any number by use of photographic, electrostatic, photo-offset, and microfilming processes and electronic capture by use of digital imaging and optical character recognition.
Contrary to paragraphs (a)(1)(iv) and (v) above, the amendments submitted on February 10, 2026 appear faint and pixelated. Any future amendments or other papers should be submitted in a dark, solid, readily legible font.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1, 4-8, and 11-14 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (abstract idea) without significantly more. The claims recite an abstract idea. The judicial exception is not integrated into a practical application. The claims do not recite additional elements that are sufficient to amount to significantly more than the judicial exception.
The following 35 U.S.C. 101 analysis is performed in accordance with section 2106 of the Manual of Patent Examining Procedure (concerning Patent Subject Matter Eligibility Guidance). First, it is determined that the claims are directed to a statutory category of invention. See MPEP 2106.03 (II). In the instant case, claims 1 and 4-7 are directed to a method, in the statutory category of process. Claims 8 and 11-14 are directed to a system comprising a memory and at least one processor, and therefore fall within the statutory category of machine. Therefore, claims 1, 4-8, and 11-14 are directed to statutory subject matter under Step 1 of the Alice/Mayo test. (Step 1: YES).
The claims are then analyzed to determine whether the claims are directed to a judicial exception. See MPEP 2106.04. The claims are analyzed to evaluate whether they recite a judicial exception (Step 2A, Prong One) as well as analyzed to evaluate whether the claims recite additional elements that integrate the judicial exception into a practical application of the judicial exception (Step 2A, Prong Two). See MPEP 2106.04. First applying Step 2A, Prong One: Claim 1 recites the following abstract idea (bolded):
A method for use in a computing system, comprising:
training a first neural network based on a training data set, the training data set including a plurality of entries, each entry including a respective product configuration signature and a label identifying one or more product configurations that are similar to the product configuration signature, each of the identified product configurations corresponding to a different record in an electronic database;
executing the first neural network based on a current signature and receiving from the first neural network a list of past product configurations that is generated by the first neural network as a result of the execution, the past product configurations being contained in records stored in the database, the records corresponding to a current product configuration that is specified via a graphical user interface of the computing system, the current signature encoding the current product configuration, each of the past product configurations including one or more strings, each string corresponding to a different product configuration component; and
displaying the list in the graphical user interface of the computing system, the list result being displayed alongside a respective winnability score and/or a respective salability score for each of the past product configurations.
Claim 1 is directed to an abstract idea, specifically to mental processes, because claim 1 recites observations, evaluations, and arguably judgments. A human being could perform a search of a database, perhaps in written form as a set of pages or file cards, to identify one or more past product configurations meeting certain criteria, as a mental process. Displaying a result of the search need not in itself involve any significant technology. (Step 2A, Prong One: YES)
Claim 8 is a system claim parallel to method claim 1, and likewise directed to an abstract idea (mental process). The system of claim 8 comprises a memory; and at least one processor that is operatively connected to the memory, the at least one processor being configured to perform operations corresponding to those of claim 1. In accordance with the Supreme Court’s decision in Alice Corporation v. CLS Bank, having an essentially generic computer apply an abstract idea does not make a claimed computer system patent-eligible. (Step 2A, Prong One: YES)
Proceeding to Step 2A, Prong Two: Claim 1 and parallel claim 8 do not recite any of the specific limitations that are indicative of integration into a practical application, or otherwise apply or use the judicial exception in a meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception. Claim 1 recites a computing system, a neural network, an electronic database, and a graphical user interface of the computing system; these are not pure abstractions, but they are described at a high level of abstraction, without details of their structure, and an abstract idea does not become non-abstract merely because a computer is involved in applying it, as the Supreme Court ruled in Alice Corporation v. CLS Bank. Claim 8, which is parallel to claim 1, additionally recites a system comprising a memory and at least one processor operatively coupled to the memory, and configured to perform operations. The system is recited at a high level of abstraction, and the description in claim 8 would fit almost any generic computer. Mere instructions to implement an abstract idea on a computer, or to use a computer as a tool to perform an abstract idea, are not indicative of integration into a practical application, nor is linking the use of the judicial exception to a particular technological environment or field of use (Alice/Mayo test, Step 2A, Prong Two). Adding insignificant extra-solution activity to a judicial exception is also not indicative of integration into a practical application. (Step 2A, Prong Two: NO)
Next, under Step 2B of the Alice/Mayo test, the claims are analyzed to determine whether there are additional claim limitations that, individually or as an ordered combination, ensure that the claim amounts to significantly more than the abstract idea. See MPEP 2106.05. The instant claims do not include additional elements that are sufficient to amount to significantly more than the abstract idea for at least the following reasons: the instant claims do not recite any of the specific limitations that are indicative of integration into a practical application, or otherwise apply or use the judicial exception in a meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception.
Claim 1 recites training a first neural network based on a training data set. Chen et al. (U.S. Patent Application Publication 2023/0084164) discloses (paragraph 68, emphasis added), “Modern deep neural networks are routinely trained with computing resources that are thousands of times greater than what was available to a typical researcher just fifteen years ago.” Hence, training neural networks required only well-understood, routine, and conventional technology prior to the inventors’ filing date; it follows that neural networks themselves were also well-understood, routine, and conventional, so executing the first neural network likewise requires only the use of well-understood, routine, and conventional technology. Claim 1 further recites an electronic database. McGuire et al. (U.S. Patent Application Publication 2004/0131997) discloses (paragraph 22, emphasis added), “The data center 106 may be any computer 210 or electronic database that can store data. Databases and the software and electronics that form databases are well known in the art and will not be explained further.” Hence, the electronic database need involve only well-understood, routine, and conventional technology.
Uebuchi (U.S. Patent Application Publication 2020/0064993) discloses (paragraph 3, emphasis added), “As well known, a software keyboard receives a character input operation performed by a user using a touchscreen input device through a graphical user interface (GUI).” Hagerty et al. (U.S. Patent Application Publication 2020/0064993) discloses (paragraph 23, emphasis added), “Any conventional method for displaying a set of images within a GUI may be utilized.” Hence, having a current product configuration specified via a graphical user interface of the computing system, and of displaying in the graphical user interface of the computing system a list of past product configurations generated by the neural network, require only the use of well-understood, routine, and conventional technology.
The courts have recognized the following computer functions as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity: Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc., v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network). Hence, receiving from the first neural network a list of past product configurations generated by the first neural network would have required only the use of well-understood, routine, and conventional functions and technology.
Avidan et al. (U.S. Patent Application Publication 2017/0193592) discloses (paragraph 25, emphasis added), “Although not illustrated, it should be appreciated that the ecommerce server 110, the merchant computer 120, and the customer computer 130 each include conventional components, such as a processor and a memory medium storing computer-readable instructions that are executable on the processor to perform various operations including those described herein.” For such components to be conventional, computing systems themselves must have been conventional; hence, the computing system of claim 1 requires only the use of well-understood, routine, and conventional technology. The limitations of claim 1, whether considered separately or in combination with each other, do not raise the claimed method to significantly more than an abstract idea.
Claim 4, which depends from claim 1, recites that the method further comprises: obtaining a plurality of weights by executing a second neural network based on the current product configuration or a species of the current product configuration; calculating an estimated salability score for the current product configuration based on the plurality of weights, the estimated salability score being indicative of a likelihood of the current product configuration being satisfactory to a customer [with further description of the means of calculation]; and outputting an indication of the estimated salability score. Obtaining a plurality of weights (in the mathematical sense) is not in itself technological, nor is calculating an estimated salability score. As set forth above with regard to claim 1, based on Chen et al. (U.S. Patent Application Publication 2023/0084164) (paragraph 68), neural networks require only the use of well-understood, routine, and conventional technology. Claim 4 also recites outputting an indication of the estimated salability score. The courts have recognized the following computer functions as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity: Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc., v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network).
Hence, the outputting step requires only well-understood, routine, and conventional functions and technology, for example, to transmit an outputted indication over a network. Alternatively, a local computing system could output an indication of the estimated salability score to a graphical user interface, established by Hagerty et al. (U.S. Patent Application Publication 2020/0064993) as well-understood, routine, and conventional (see the Step 2B analysis of claim 1 above). The particular mathematical operations and conditions set forth in claim 4 are not in themselves technological. Claim 5, which depends from claim 4, recites that the salability score is calculated in accordance with a specific equation, namely that the salability score is equal to a summation of the products of the weights with the numerical values of components of the current product configuration or the species of the current product configuration. A mathematical relation standing alone is not patent-eligible (using mathematics to, for example, control the curing of rubber, as in Diamond v. Diehr, can be patent-eligible, but the present case is not parallel). The limitations of claims 4 and 5, whether considered separately or in combination with each other and with the limitations of claim 1, do not raise the claimed method to significantly more than an abstract idea.
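For clarity, the summation recited in claim 5 may be sketched in general form (the symbols below are shorthand for purposes of illustration, not claim language; w_i denotes a weight and v_i the numerical value of the i-th component of the current product configuration or its species):

```latex
S_{\text{salability}} = \sum_{i=1}^{n} w_i \, v_i
```

That is, a weighted sum, which falls within the mathematical-concepts grouping of abstract ideas. See MPEP 2106.04(a)(2).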
Claim 6, which depends from claim 1, recites:
receiving an input identifying a competitor or competing product;
obtaining a plurality of weights by executing a second neural network based on the current product configuration or a species of the current product configuration;
calculating an estimated winnability score for the current product configuration vis-a-vis the competitor or competing product, the estimated winnability score being calculated by multiplying each of the plurality of weights by a different one of a plurality of numerical values, the plurality of numerical values being associated with either components of the current product configuration or components of the species of the current product configuration, depending on whether the second neural network is executed based on the current product configuration or the species of the current product configuration; and
outputting an indication of the estimated winnability score.
The step of receiving an input requires only well-understood, routine, and conventional functions and technology, for example, to receive an input identifying a competitor or competing product over a network, based on the judicial precedents set forth above with regard to claim 4. Calculating an estimated winnability score is a mathematical operation, not primarily technological. The step of outputting an indication of the estimated winnability score likewise requires only well-understood, routine, and conventional functions and technology, for example, to transmit an outputted indication over a network, based on the judicial precedents set forth above with regard to claim 4. Claim 6 also recites executing a second neural network. As set forth above with regard to claim 1, based on Chen et al. (U.S. Patent Application Publication 2023/0084164) (paragraph 68), neural networks require only the use of well-understood, routine, and conventional technology. Hence, the limitations of claim 6, whether considered separately or in combination with each other and with the limitations of claim 1, do not raise the claimed method to significantly more than an abstract idea.
Claim 7, which depends from claim 1, recites that the current product configuration includes a storage system configuration. Biskeborn et al. (U.S. Patent 11,532,330) discloses (column 1, lines 7-11, emphasis added), “Conventional tape drive storage systems comprise a magnetic tape wound around a dual reel (reel-to-reel cartridge) or a single reel (endless tape cartridge), wherein the reel(s) are rotated in order to move the magnetic tape over one or more transducer heads during write/read operations.” Hence, the storage system configuration would require only the use of well-understood, routine, and conventional technology. (This is redundant, since the nature of the current product configuration need not affect the status of the technology used in the method.) The limitation of claim 7, whether considered separately or in combination with the limitations of claim 1, does not raise the claimed method to significantly more than an abstract idea. (Claims 1 and 4-7: Step 2B: NO)
Independent claim 8 recites, “A system, comprising: a memory; and at least one processor operatively coupled to the memory, the at least one processor being configured to perform the operations” [corresponding to the steps of method claim 1]. Analysis under Step 2B of the Alice/Mayo test therefore leads to the conclusion that the operations of claim 8 do not make the claimed system significantly more than an abstract idea, based on the same references applied to claim 1 above (Chen, McGuire, Uebuchi, Hagerty, and Avidan). Further, Avidan et al. (U.S. Patent Application Publication 2017/0193592) discloses (paragraph 25, emphasis added), “Although not illustrated, it should be appreciated that the ecommerce server 110, the merchant computer 120, and the customer computer 130 each include conventional components, such as a processor and a memory medium storing computer-readable instructions that are executable on the processor to perform various operations including those described herein.” Hence, the memory and at least one processor as recited in claim 8 require only well-understood, routine, and conventional technology. The limitations of claim 8, whether considered separately or in combination with each other, do not raise the claimed system to significantly more than an abstract idea.
Claim 11, which depends from claim 8, and claim 12, which depends from claim 11, are directly parallel to claims 4 and 5, respectively. Therefore, based on the reasoning, prior art references, and judicial precedents set forth above with respect to claims 4 and 5, claims 11 and 12 are likewise found to require only well-understood, routine, and conventional functions and technology; the limitations of claims 11 and 12, whether considered separately or in combination with each other and with the limitations of claim 8, do not raise the claimed system to significantly more than an abstract idea.
Claim 13, which depends from claim 8, is directly parallel to claim 6. Therefore, based on the reasoning, prior art references, and judicial precedents set forth above with respect to claim 6, claim 13 is likewise found to require only well-understood, routine, and conventional functions and technology; the limitations of claim 13, whether considered separately or in combination with each other and with the limitations of claim 8, do not raise the claimed system to significantly more than an abstract idea.
Claim 14, which depends from claim 8, is directly parallel to claim 7. Therefore, based on the reasoning and the prior art reference to Biskeborn set forth above with respect to claim 7, claim 14 is likewise found to require at most well-understood, routine, and conventional functions and technology. The limitation of claim 14, whether considered separately or in combination with the limitations of claim 8, does not raise the claimed system to significantly more than an abstract idea. (Claims 8 and 11-14: Step 2B: NO)
Non-Obvious Subject Matter
Claims 1 and 4-7 are rejected under 35 U.S.C. 101, but recite non-obvious subject matter.
Claims 8 and 11-14 are rejected under 35 U.S.C. 101, but recite non-obvious subject matter.
The following is a statement of reasons for the indication of non-obvious subject matter: The closest prior art of record, Lagerling et al. (U.S. Patent Application Publication 2020/0104866) discloses a method for use in a computing system, the computing system being shown in Figure 5, and disclosed in paragraph 17, emphasis added, “FIG. 5 is an example computer system useful for implementing various embodiments.” Lagerling discloses generating sellability scores for selling objects on an online marketplace (Abstract, emphasis added), “Disclosed herein are system, method, and computer program product embodiments for generating sellability and cancellability scores for selling objects on an electronic marketplace.” (The word “sellability” in Lagerling is regarded and treated as having the same meaning as “salability” in the instant application and claims.) Lagerling does not disclose generating or displaying winnability scores, but Smith et al. (U.S. Patent Application Publication 2022/0122100) teaches winnability reports (paragraph 17, emphasis added), “The method further describes generating a winnability report descriptive of a probability of winning a winnable search term associated with the target product and an estimated search cost to win the winnable search term.”
Lagerling discloses neural networks (e.g., paragraph 45, emphasis added), “According to some embodiments, deep learning models such as deep neural network models can be used for generating and updating the model and the weights (e.g., scales) used in generating sellability score 104.” Smith teaches neural networks being trained (paragraph 124, emphasis added), “The neural network model (including designation of node values in the input layer, and number of layers), along with the weight matrices associated with each layer in an embodiment may form a trained machine learning classifier, algorithm, or mathematical model to be used in generating any winnability report 804 as described herein.” Training a first neural network based on a training data set would therefore have been obvious as of the inventors’ filing date, and a set, by its nature, normally contains a plurality of entries or items. Neither Lagerling nor Smith discloses training a first neural network based on a training data set including a plurality of entries, each entry including a respective product configuration signature and a label identifying one or more product configurations that are similar to the product configuration signature, each of the identified product configurations corresponding to a different record in an electronic database. Sinha et al. (U.S. Patent Application Publication 2024/0296375) teaches (paragraph 124, emphasis added), “As will be discussed, the machine learning models can be used with training data that includes needs information and uses configuration information associated with various solutions as ‘labels’ to train the respective machine learning model. Links can be established between solution selection and configuration information and information about particular needs that led to selection of a solution/configuration in a set of training data.” This is not quite what is recited in claims 1 and 8.
Searching databases is well known, as taught, for example, by Smith (paragraph 98, emphasis added), “Although specific search query websites are contemplated herein, the present specification also contemplates that other search query databases may be accessed by the processor 810 whether those databases are accessible by a user via a website or not.” Neural networks are also known, as set forth above, and graphical user interfaces are well known (e.g., Smith, paragraph 49, emphasis added: “In an embodiment, the user outputs 270 may present to a user a graphical user interface by which the user may interact with the smartphone 126 in order to affect the methods and processes described herein”). Further, Gilani et al. (U.S. Patent Application Publication 2023/0127626) teaches generating a hardware configuration signature that is generated based on a system specification (paragraph 51, emphasis added), “Although in the example of FIG. 6, a generic system specification is classified, alternative implementations are possible in which the precise system specification is classified instead. The system specification that is classified by the neural network 113, whether be it a precise system specification or a generic system specification, may be referred to as an initial system specification. The phrase ‘classifying a precise system specification’ may refer to one of ‘classifying the precise system specification’, ‘classifying a generic system specification that is generated based on the precise system specification’, ‘classifying a hardware configuration signature that is generated based on the precise system specification’, or ‘classifying a hardware configuration signature that is generated based on the generic system specification’.”
However, Lagerling does not disclose performing a search of the database by executing the first neural network based on a current signature and receiving from the first neural network a list of past product configurations being contained in records stored in the database, the records corresponding to a current product configuration that is specified via a graphical user interface of the computing system, the current signature encoding the current product configuration, each of the past product configurations including one or more strings, each string corresponding to a different product configuration component. Furthermore, Smith, Sinha, Gilani, and the other prior art references of record do not disclose, teach, or reasonably suggest this.
As claims 1 and 8 and their respective dependents are parallel to each other, the above statement applies to both claim 1 and claim 8.
Response to Arguments
Applicant's arguments filed February 10, 2026 have been fully considered but they are not persuasive. The claims, as amended, are non-obvious, but remain rejected under 35 U.S.C. 101. Applicant argues that claim 1 should not be rejected under 35 U.S.C. 101 because it does not recite any judicial exceptions, and specifically that it recites the operations of training and executing a neural network, and displaying the list (Applicant writes “the search results”, but that no longer corresponds to claim 1 as amended) in a graphical user interface. Displaying something, such as a list of past product configurations, in a graphical user interface is a matter of insignificant extra-solution activity. While what is formally claimed includes training and executing a neural network, the goal is to generate a list of past product configurations meeting certain criteria, which could be done as a matter of human thought and judgment, at least in relatively simple cases. Having a neural network do it is a matter of using a computer to apply an essentially abstract idea, which is not sufficient for patent eligibility, as the Supreme Court ruled in Alice Corp. v. CLS Bank International (vide infra). Chen is cited above in the Step 2B analysis as evidence that neural networks and the training thereof were well-understood, routine, and conventional prior to Applicant’s priority date.
Examiner disagrees that the claims are rooted in technology in a strong sense; having an essentially generic processor implement an abstract idea does not root a claimed method, or a parallel system or computer-readable medium claim, in technology in such a way as to establish patent-eligibility. To quote from the Supreme Court decision in Alice Corp. v. CLS Bank International:
These cases demonstrate that the mere recitation of a generic computer cannot transform a patent-ineligible abstract idea into a patent-eligible invention. Stating an abstract idea “while adding the words ‘apply it’” is not enough for patent eligibility. Mayo, supra, at ____ (slip op., at 3). Nor is limiting the use of an abstract idea “‘to a particular technological environment.’” Bilski, supra, at 610-611. Stating an abstract idea while adding the words “apply it with a computer” simply combines those two steps, with the same deficient result. Thus, if a patent’s recitation of a computer amounts to a mere instruction to “implemen[t]” an abstract idea “on … a computer,” Mayo, supra, at ____ (slip op., at 16), that addition cannot impart patent eligibility. This conclusion accords with the pre-emption concern that undergirds our §101 jurisprudence. Given the ubiquity of computers, see 717 F.3d, at 1286 (Lourie, J., concurring), wholly generic computer implementation is not generally the sort of “additional featur[e]” that provides any “practical assurance that the process is more than a drafting effort designed to monopolize the [abstract idea] itself.” Mayo, 566 U.S., at ____ (slip op., at 8-9).
To quote another paragraph from the same decision:
Put another way, the system claims are no different from the method claims in substance. The method claims recite the abstract idea implemented on a generic computer; the system claims recite a handful of generic computer components configured to implement the same idea. This Court has long “warn[ed] … against” interpreting §101 “in ways that make patent eligibility ‘depend simply on the draftsman’s art.’” Mayo, supra, at ____ (slip op., at 3) (quoting Flook, 437 U.S., at 593); see id., at 590 (“The concept of patentable subject matter under §101 is not ‘like a nose of wax which may be turned and twisted in any direction … ’”). Holding that the system claims are patent eligible would have exactly that result.
As training a neural network, or using one, has by now become well-understood, routine, and conventional, as prior art references of record document, training and using a neural network has become a form of “apply it with a computer”.
Applicant refers to PEG Example 39 as teaching that training a neural network cannot be performed in the human mind. As set forth above, Examiner maintains that claim 1 (and the other claims) is drawn to a judicial exception (mental process), and replies that PEG Example 39 provides an example of some relevance, but does not teach that reciting the training and use of a neural network necessarily makes a claim eligible under 35 U.S.C. 101. PEG Example 39 involves a specific application of a neural network for facial detection that does not parallel what a human being does in recognizing the faces of his or her acquaintances. Claims 1 and 8 of the instant application essentially involve using a neural network to do what other computer technology and programs could do, if not necessarily as well, or what a human being could do, at least in a relatively simple case.
Furthermore, PEG Example 47, Anomaly Detection, includes an example claim 2, a “method of using an artificial neural network (ANN)”, which comprises, inter alia, “(c) training, by the computer, the ANN based on the input data and a selected training algorithm to generate a trained ANN, wherein the selected training algorithm includes a backpropagation algorithm and a gradient descent algorithm”, “(e) analyzing the one or more detected anomalies using the trained ANN to generate anomaly data; and (f) outputting the anomaly data from the trained ANN.” The Analysis section sets forth that example claim 2 is not eligible under 35 U.S.C. 101 because, briefly, it is directed to a combination of mental processes and mathematical concepts, without significantly more. This applies even though no human being could perform the recited method without a computer and an artificial neural network. In particular, the Analysis section states, “The limitations in (d) and (e) reciting ‘using the trained ANN’ provide nothing more than mere instructions to implement an abstract idea on a generic computer.”
Further in response to Applicant’s arguments (page 9 of the Amendments and Remarks of February 10, 2026), regarding “observations, evaluations, and arguably judgments”, Examiner replies that claim 1, as now amended, implicitly involves observing and evaluating past product configurations in the course of generating a list of such past product configurations to determine whether particular product configurations qualify to be on the list. (This is somewhat changed from the observations and evaluations which would have been made in the previous iteration of claim 1.) Further in reply to Applicant’s argument, Examiner does not dispute that a graphical user interface is technological, but does respectfully reiterate that a GUI is well-understood, routine, and conventional technology, and therefore that the use of a graphical user interface does not raise claim 1 (or the other currently pending claims) to significantly more than an abstract idea.
In response to Applicant’s further arguments (page 10 of the Amendments and Remarks of February 10, 2026), Examiner does not assert that the step of “executing the first neural network based on a current signature and receiving from the first neural network a list of one or more past product configurations that is generated” can be practically performed in the human mind; instead, Examiner’s position is that executing the first neural network is a matter of “apply it”, used to implement a fundamentally abstract idea.
Applicant then argues that claim 1 is compliant with 35 U.S.C. 101 under Step 2A, Prong Two, as being directed to a practical application, specifically for searching a database. Examiner respectfully replies that no technological improvement in database searching is recited. There may be some advantage in using a neural network to perform operations (although claim 1 and parallel claim 8 have been amended so that they no longer recite “performing a search of the database”), but it is not clear whether there is a real advantage, or how such an advantage relates to what is actually recited in the claims. There were presumably some advantages to using an unspecified generic computer, instead of manual performance, to exchange obligations between parties in the Shepherd patents at issue in Alice Corp. v. CLS Bank International, but the Supreme Court did not find these presumed advantages to be reasons for patentability.
Applicant also refers to “PEG Example 42, which has indicated that data sharing constitutes a valid practical application under the Alice/Mayo test, and MPEP 2106.04(D)(III) which suggests that claims that improve the technical capture of information are subject matter eligible.” Examiner respectfully replies (again) that PEG Example 42 does not teach that data sharing as such is sufficient to constitute a valid practical application, but that a particular method (Example claim 1) comprising multiple steps to assure that “each user has immediate access to up-to-date patient information” qualifies as a practical application. Example claim 2 of PEG Example 42, which is broader than Example claim 1, does not qualify as “integrated into a practical application” for purposes of Step 2A, Prong Two, although practicing Example claim 2 would be of some use.
For these reasons, Examiner maintains that rejecting the instant claims under 35 U.S.C. 101 is in accordance with precedent and approved procedure.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NICHOLAS D ROSEN whose telephone number is (571)272-6762. The examiner can normally be reached 9:00 AM-5:30 PM, M-F. Non-official/draft communications may be faxed to the examiner at 571-273-6762, or emailed to Nichol.Rosen@uspto.gov (in the body of an email, please, not as an attachment).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Marissa Thein, can be reached at 571-272-6764. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/NICHOLAS D ROSEN/ Primary Examiner, Art Unit 3689 March 11, 2026