DETAILED ACTION
This action is in reply to the Amendments filed on 01/14/2026.
Claims 8-13 are cancelled.
Claims 1-7 and 14-20 are currently pending and have been examined.
Claims 1-7 and 14-20 are rejected.
Response to Amendment
Applicant’s amendment, filed 01/14/2026, has been entered. Claims 1 and 14-20 have been amended.
Claim Objections
The claim objections from the prior Office Action have been withdrawn pursuant to Applicant’s amendments.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Objections
Claims 14-20 are objected to because of the following informalities:
-Claim 14 reads “wherein the candidate terms are displayed in a graphical user interface below a user interface element in which the user input the prefix” but should likely read “wherein the candidate terms are displayed in the graphical user interface below a user interface element in which the user input the prefix”
Claims 15-20 inherit the deficiencies noted in claim 14, and are therefore objected to on the same basis.
Appropriate correction is required.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-7 and 14-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., law of nature, a natural phenomenon, or an abstract idea) without significantly more.
Under Step 1 of the Subject Matter Eligibility Test for Products and Processes, the claims must be directed to one of the four statutory categories (see MPEP 2106.03). All the claims are directed to one of the four statutory categories (YES).
Under Step 2A of the Subject Matter Eligibility Test, it is determined whether the claims are directed to a judicially recognized exception (see MPEP 2106.04). Step 2A is a two-prong inquiry.
Under Prong 1, it is determined whether the claim recites a judicial exception (YES). Taking Claim 1 as representative, the claim recites limitations that fall within the certain methods of organizing human activity groupings of abstract ideas, including:
-obtaining training data comprising a plurality of examples, each example comprising:
-a set of input features for the conversion model, wherein the set of input features comprises features for a warehouse, a term corresponding to a category of items within a taxonomy of items, a prefix received in a prior search, and a feature describing a number of orders for items from the warehouse that included an item within the category of the taxonomy of items corresponding to the term, wherein a category of items within the taxonomy of items represents a set of items that are related within the taxonomy of items, and
-a label indicating whether an item within the category corresponding to the term was included in a prior order received by an online concierge system corresponding to the prior search,
-wherein obtaining the training data comprises generating the plurality of examples, wherein at least one example of the plurality of examples is generated by:
-receiving a prefix for a search query from a client device associated with a user;
-generating a candidate term for each category of a set of categories within the taxonomy of items;
-transmitting the generated candidate terms to the client device for display to the user through a graphical user interface;
-receiving, from the client device, a selection by the user of a candidate term of the generated candidate terms;
-responsive to receiving the selection of the candidate term, selecting a set of candidate items based on the selected candidate term;
-transmitting the selected set of candidate items to the client device for display to the user;
-receiving an order from the client device, wherein the order comprises a set of items selected by the user;
-determining that no item in the set of items of the received order is associated with the category associated with the selected candidate term; and
-generating the example for the selected candidate term, wherein the example comprises a label indicating that no item of the category corresponding to the selected candidate term was included in the order;
-initializing a neural network for the conversion model that comprises a plurality of layers, where the neural network is configured to receive, as an input a set of features of a received prefix from a search, a candidate term corresponding to a candidate category within the taxonomy of items, and a warehouse, and is configured to generate a predicted probability of an item within the candidate category of the taxonomy of items corresponding to the candidate term being included in an order received by the online concierge system for the warehouse based on the prefix; and
-training the conversion model to, for a set of candidate terms input to the conversion model, generat[ing] a predicted probability for each candidate term that an item within a candidate category within the taxonomy corresponding to each candidate term would be included in an order if displayed as a candidate suggestion for a prefix to a search input by a user by, for each of the plurality of the examples of the training data:
-applying the [conversion model] neural network to the combination of the warehouse, the candidate term, the prefix received in the prior search, and the corresponding set of features of the combination to generate a predicted probability of an item within the candidate category corresponding to the candidate term being included in an order received by the online concierge system, wherein the candidate category comprises a plurality of items that are related in a taxonomy;
-backpropagating one or more error terms obtained from one or more loss functions to update a set of parameters of the neural network, the backpropagating performed through the neural network and the one or more of the error terms based on a difference between the label applied to an example and the generated predicted probability;
-stopping the backpropagation after the one or more loss functions satisfy one or more criteria; and
-storing the set of parameters of the layers of the network on the computer-readable medium as parameters of the conversion model
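For context only: the training procedure recited in the limitations above follows the standard supervised-learning pattern of applying the model, computing a loss against the label, backpropagating an error term to update parameters, stopping once the loss satisfies a criterion, and storing the resulting parameters. The following is a minimal illustrative sketch of that pattern, not a reproduction of the claimed system; the feature encodings and example values are hypothetical, and a single-layer logistic model stands in for the claimed multi-layer neural network.

```python
import math

def sigmoid(z):
    """Logistic function mapping a real-valued score to a probability."""
    return 1.0 / (1.0 + math.exp(-z))

def train_conversion_model(examples, lr=0.5, loss_threshold=0.3, max_epochs=5000):
    """Train a one-layer model by backpropagating log-loss error terms.

    examples: list of (feature_vector, label) pairs, with label in {0, 1}.
    Training stops once the mean loss satisfies the threshold criterion,
    mirroring the recited "stopping the backpropagation" step.
    """
    n_features = len(examples[0][0])
    weights = [0.0] * n_features
    bias = 0.0
    for _ in range(max_epochs):
        total_loss = 0.0
        for features, label in examples:
            # Forward pass: predicted probability of conversion.
            p = sigmoid(sum(w * x for w, x in zip(weights, features)) + bias)
            # Log loss for this example.
            total_loss += -(label * math.log(p + 1e-12)
                            + (1 - label) * math.log(1 - p + 1e-12))
            # Backpropagate the error term (prediction minus label)
            # to update the parameters.
            err = p - label
            weights = [w - lr * err * x for w, x in zip(weights, features)]
            bias -= lr * err
        # Stop once the loss function satisfies the criterion.
        if total_loss / len(examples) < loss_threshold:
            break
    # The returned values are stored as the parameters of the model.
    return weights, bias

# Hypothetical feature vectors: [warehouse indicator, prior-order count,
# prefix-match score], with a binary label indicating whether an item in
# the candidate category was included in the resulting order.
training_data = [
    ([1.0, 3.0, 0.9], 1),  # positive example
    ([1.0, 0.0, 0.1], 0),  # negative example (no category item in the order)
    ([0.0, 2.0, 0.8], 1),
    ([0.0, 0.0, 0.2], 0),
]
weights, bias = train_conversion_model(training_data)
```

The sketch preserves only the step structure recited in the claim (apply, compute loss, backpropagate, stop on a loss criterion, store parameters); the actual claimed model comprises a plurality of layers and additional inputs.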
The above limitations recite the concept of determining and providing candidate term suggestions for searching an item. The above limitations fall within the “Certain Methods of Organizing Human Activity” and the “Mathematical Concepts” groupings of abstract ideas enumerated in MPEP 2106.04(a)(2).
Certain methods of organizing human activity include:
fundamental economic principles or practices (including hedging, insurance, and mitigating risk)
commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; and business relations)
managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions)
Mathematical concepts include:
mathematical relationships
mathematical formulas or equations
mathematical calculations
The limitations of obtaining training data comprising a plurality of examples, each example comprising: a set of input features for the conversion model, wherein the set of input features comprises features for a warehouse, a term corresponding to a category of items within a taxonomy of items, a prefix received in a prior search, and a feature describing a number of orders for items from the warehouse that included an item within the category of the taxonomy of items corresponding to the term, wherein a category of items within the taxonomy of items represents a set of items that are related within the taxonomy of items, and wherein obtaining the training data comprises generating the plurality of examples, wherein at least one example of the plurality of examples is generated by: generating a candidate term for each category of a set of categories within the taxonomy of items; responsive to receiving the selection of the candidate term, selecting a set of candidate items based on the selected candidate term; determining that no item in the set of items of the received order is associated with the category associated with the selected candidate term; and generating the example for the selected candidate term, wherein the example comprises a label indicating that no item of the category corresponding to the selected candidate term was included in the order, are processes that, under their broadest reasonable interpretation, cover a commercial interaction. For example, “obtaining,” “obtaining,” “generating,” “generating,” “selecting,” “determining,” and “generating” in the context of this claim encompass advertising, marketing, or sales activities.
Similarly, the limitations of a label indicating whether an item within the category corresponding to the term was included in a prior order received by an online concierge system corresponding to the prior search, receiving a prefix for a search query from a client device associated with a user; transmitting the generated candidate terms to the client device for display to the user through a graphical user interface; receiving, from the client device, a selection by the user of a candidate term of the generated candidate terms; transmitting the selected set of candidate items to the client device for display to the user; receiving an order from the client device, wherein the order comprises a set of items selected by the user; initializing a neural network for the conversion model that comprises a plurality of layers, where the neural network is configured to receive, as an input a set of features of a received prefix from a search, a candidate term corresponding to a candidate category within the taxonomy of items, and a warehouse, and is configured to generate a predicted probability of an item within the candidate category of the taxonomy of items corresponding to the candidate term being included in an order received by the online concierge system for the warehouse based on the prefix; and training the conversion model to, for a set of candidate terms input to the conversion model, generat[ing] a predicted probability for each candidate term that an item within a candidate category within the taxonomy corresponding to each candidate term would be included in an order if displayed as a candidate suggestion for a prefix to a search input by a user by, for each of the plurality of the examples of the training data: applying the [conversion model] neural network to the combination of the warehouse, the candidate term, the prefix received in the prior search, and the corresponding set of features of the combination to generate a predicted probability of an item within the candidate category corresponding to the candidate term being included in an order received by the online concierge system, wherein the candidate category comprises a plurality of items that are related in a taxonomy; backpropagating one or more error terms obtained from one or more loss functions to update a set of parameters of the neural network, the backpropagating performed through the neural network and the one or more of the error terms based on a difference between the label applied to an example and the generated predicted probability; stopping the backpropagation after the one or more loss functions satisfy one or more criteria; and storing the set of parameters of the layers of the network on the computer-readable medium as parameters of the conversion model are processes that, under their broadest reasonable interpretation, cover a commercial interaction, as well as mathematical relationships, formulas, and calculations. That is, other than reciting that the concierge system is an online concierge system, that the prefix is received from a client device associated with a user, that the transmitting is to the client device and displayed through a graphical user interface, that the receiving is from the client device, that the transmitting is to the client device, that the receiving is from the client device, initializing a neural network for the conversion model that comprises a plurality of layers, that the input is received by the neural network, that the conversion model is trained, that the conversion model is a neural network, that the one or more error terms are backpropagated, that the set of the parameters are of the neural network, that the backpropagating is performed through the neural network, that the backpropagation is stopped after the one or more loss functions satisfy one or more criteria, and that the set of parameters are of the layers of the network on the computer readable storage medium, nothing in the claim element
precludes the step from practically being performed by people. For example, but for the “an online concierge system,” “a client device associated with a user,” “a graphical user interface,” “initializing a neural network for the conversion model that comprises a plurality of layers,” “the neural network,” “training,” “backpropagating,” “the backpropagating performed through the neural network,” “stopping the backpropagation,” “layers of the network,” and “the computer readable storage medium” language, “receiving,” “transmitting,” “receiving,” “transmitting,” “receiving,” “received,” “initializing,” “receive,” “training,” “applying,” “update,” “stopping,” and “storing” in the context of this claim encompass advertising, marketing, or sales activities, as well as mathematical relationships, formulas, and calculations.
Under Prong 2, it is determined whether the claim recites additional elements that integrate the exception into a practical application of the exception. This judicial exception is not integrated into a practical application (NO).
-obtaining training data comprising a plurality of examples, each example comprising:
-a set of input features for the conversion model, wherein the set of input features comprises features for a warehouse, a term corresponding to a category of items within a taxonomy of items, a prefix received in a prior search, and a feature describing a number of orders for items from the warehouse that included an item within the category of the taxonomy of items corresponding to the term, wherein a category of items within the taxonomy of items represents a set of items that are related within the taxonomy of items, and
-a label indicating whether an item within the category corresponding to the term was included in a prior order received by an online concierge system corresponding to the prior search,
-wherein obtaining the training data comprises generating the plurality of examples, wherein at least one example of the plurality of examples is generated by:
-receiving a prefix for a search query from a client device associated with a user;
-generating a candidate term for each category of a set of categories within the taxonomy of items;
-transmitting the generated candidate terms to the client device for display to the user through a graphical user interface;
-receiving, from the client device, a selection by the user of a candidate term of the generated candidate terms;
-responsive to receiving the selection of the candidate term, selecting a set of candidate items based on the selected candidate term;
-transmitting the selected set of candidate items to the client device for display to the user;
-receiving an order from the client device, wherein the order comprises a set of items selected by the user;
-determining that no item in the set of items of the received order is associated with the category associated with the selected candidate term; and
-generating the example for the selected candidate term, wherein the example comprises a label indicating that no item of the category corresponding to the selected candidate term was included in the order;
-initializing a neural network for the conversion model that comprises a plurality of layers, where the neural network is configured to receive, as an input a set of features of a received prefix from a search, a candidate term corresponding to a candidate category within the taxonomy of items, and a warehouse, and is configured to generate a predicted probability of an item within the candidate category of the taxonomy of items corresponding to the candidate term being included in an order received by the online concierge system for the warehouse based on the prefix; and
-training the conversion model to, for a set of candidate terms input to the conversion model, generate a predicted probability for each candidate term that an item within a candidate category within the taxonomy corresponding to each candidate term would be included in an order if displayed as a candidate suggestion for a prefix to a search input by a user by, for each of the plurality of the examples of the training data:
-applying the neural network to the combination of the warehouse, the candidate term, the prefix received in the prior search, and the corresponding set of features of the combination to generate a predicted probability of an item within the candidate category corresponding to the candidate term being included in an order received by the online concierge system, wherein the candidate category comprises a plurality of items that are related in a taxonomy;
-backpropagating one or more error terms obtained from one or more loss functions to update a set of parameters of the neural network, the backpropagating performed through the neural network and the one or more of the error terms based on a difference between the label applied to an example and the generated predicted probability;
-stopping the backpropagation after the one or more loss functions satisfy one or more criteria; and
-storing the set of parameters of the layers of the network on the computer-readable medium as parameters of the conversion model
The additional elements of claim 1 are recited at a high level of generality (i.e., as generic computing hardware) such that they amount to nothing more than mere instructions to implement or apply the abstract idea on generic computing hardware (or, merely to use a computer as a tool to perform an abstract idea), as supported by paragraph [0071] of Applicant’s specification – “Embodiments of the invention may also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, and/or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer.” Specifically, the additional elements of a non-transitory computer readable storage medium, an online concierge system, a client device associated with a user, a graphical user interface, initializing a neural network for the conversion model that comprises a plurality of layers, the neural network, training, backpropagating, the backpropagating performed through the neural network, stopping the backpropagation, and layers of the network are recited at a high level of generality (i.e., as a generic processor performing the generic computer functions of obtaining data, receiving data, generating data, transmitting data, selecting data, determining data, training a model, applying a model to generate data, updating data, and storing data) such that they amount to no more than mere instructions to apply the exception using generic computer components. Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is therefore directed to an abstract idea. Further, the additional elements do no more than generally link the use of the judicial exception to a particular technological environment or field of use (such as computers or computing networks).
Employing well-known computer functions to execute an abstract idea, even when limiting the use of the idea to one particular environment, does not integrate the exception into a practical application.
Additionally, the additional elements are insufficient to integrate the abstract idea into a practical application because the claim fails to i) reflect an improvement in the functioning of a computer or an improvement to another technology or technical field, ii) apply the judicial exception with, or use the judicial exception in conjunction with, a particular machine or manufacture that is integral to the claim, iii) effect a transformation or reduction of a particular article to a different state or thing, or iv) apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment.
Accordingly, the judicial exception is not integrated into a practical application.
Under Step 2B, it is determined whether the claims recite additional elements that amount to significantly more than the judicial exception. The claims of the present application do not include additional elements that are sufficient to amount to significantly more than the judicial exception (NO).
In the case of claim 1, taken individually or as a whole, the additional elements of claim 1 do not provide an inventive concept. As discussed above under Step 2A (Prong 2) with respect to the integration of the abstract idea into a practical application, the additional elements used to perform the claimed functions amount to no more than a general link to a technological environment.
Even considered as an ordered combination (as a whole), the additional elements do not add anything significantly more than when considered individually.
Claim 14 is a computer program product reciting functions similar to those of claim 1. The Examiner notes that claim 14 recites the additional elements of a non-transitory computer readable storage medium, a processor, an online concierge system, a client device associated with a user, a graphical user interface, initializing a neural network for the conversion model that comprises a plurality of layers, the neural network, training, backpropagating, the backpropagating performed through the neural network, stopping the backpropagation, layers of the network, and a user interface element; however, claim 14 does not qualify as eligible subject matter for reasons similar to those set forth above for claim 1.
Therefore, claims 1 and 14 do not provide an inventive concept and do not qualify as eligible subject matter.
Dependent claims 2-7 and 15-20, when analyzed as a whole, are held to be patent ineligible under 35 U.S.C. § 101 because they do not add “significantly more” to the abstract idea. More specifically, dependent claims 2-7 and 15-20 further fall within the “Certain Methods of Organizing Human Activity” grouping of abstract ideas in that they recite commercial interactions. Dependent claims 5-7, 15-16, and 20 do not recite any further additional elements, and as such are not indicative of integration into a practical application for at least similar reasons discussed above. Dependent claims 2-4 and 17-19 recite the additional element of the online concierge system, but, similar to the analysis under prong two of Step 2A, these additional elements are used as a tool to perform the abstract idea. As such, under prong two of Step 2A, claims 2-7 and 15-20 are not indicative of integration into a practical application for at least similar reasons as discussed above. Thus, dependent claims 2-7 and 15-20 are “directed to” an abstract idea. Next, under Step 2B, similar to the analysis of claims 1 and 14, dependent claims 2-7 and 15-20, when analyzed individually and as an ordered combination, merely further define the commonplace business method (i.e., determining and providing candidate term suggestions for searching an item) being applied on a general-purpose computer and, therefore, do not amount to significantly more than the abstract idea itself. Accordingly, the Examiner concludes that there are no meaningful limitations in the claims that transform the judicial exception into a patent-eligible application such that the claims amount to significantly more than the judicial exception itself. The analysis above applies to all statutory categories of invention.
Allowable Subject Matter
In the present application, claims 1-7 and 14-20 would be allowable if rewritten or amended to overcome the rejections under 35 U.S.C. § 101 set forth in this Office action. The following is the Examiner’s statement of reasons for the indication of allowable subject matter:
Regarding 35 U.S.C. § 103, upon review of the evidence at hand, it is concluded that the totality of the evidence, alone or in combination, neither anticipates, reasonably teaches, nor renders obvious the below-noted features of Applicant’s invention. Claims 1-7 and 14-20 are allowable over the prior art under 35 U.S.C. § 103 as follows:
The most relevant prior art made of record includes previously cited Sen et al. (US 11,126,660 B1), previously cited Hinegardner et al. (US 11,016,964 B1), previously cited Cheng et al. (US 10,929,392 B1), and newly cited Achan et al. (US 2021/0034687 A1). Sen teaches obtaining training data comprising a plurality of examples, each example comprising: a set of input features for the conversion model, wherein the set of input features comprises a term corresponding to a category of items within a taxonomy of items, a prefix received in a prior search, and a feature describing a number of orders for items that included an item within the category of the taxonomy of items corresponding to the term (Sen, see at least: Col. 9 Ln. 28-66, Col. 13 Ln. 58-61, Col. 3 Ln. 52-66, and Col. 18 Ln. 23-26); a label indicating whether an item within the category corresponding to the term was included in a prior order received by an online concierge system corresponding to the prior search (Sen, see at least: Col. 18 Ln. 23-26), wherein obtaining the training data comprises generating the plurality of examples (Sen, see at least: Col. 5 Ln. 50-53 and Col. 8 Ln. 37-40) and wherein at least one example of the plurality of examples is generated by: receiving a prefix for a search query from a client device associated with a user (Sen, see at least: Col. 3 Ln. 47-67 & Col. 4 Ln. 1-7 and Col. 8 Ln. 29-40); generating a candidate term for each category of a set of categories (Sen, see at least: Col. 3 Ln. 47-67 & Col. 4 Ln. 1-7 and Col. 8 Ln. 29-40); transmitting the generated candidate terms to the client device for display to the user through a graphical user interface (Sen, see at least: Col. 3 Ln. 47-67 & Col. 4 Ln. 1-7 and Col. 8 Ln. 29-40); receiving, from the client device, a selection by the user of a candidate term of the generated candidate terms (Sen, see at least: Col. 3 Ln. 47-67 & Col. 4 Ln. 1-7 and Col. 8 Ln. 29-40); responsive to receiving the selection of the candidate term, selecting a set of candidate items based on the selected candidate term (Sen, see at least: Col. 3 Ln. 47-67 & Col. 4 Ln. 1-7 and Col. 8 Ln. 29-40); transmitting the selected set of candidate items to the client device for display to the user (Sen, see at least: Col. 3 Ln. 47-67 & Col. 4 Ln. 1-7 and Col. 8 Ln. 29-40); receiving an order from the client device, wherein the order comprises a set of items selected by the user (Sen, see at least: Col. 3 Ln. 47-67 & Col. 4 Ln. 1-7 and Col. 8 Ln. 29-40); initializing a neural network for the conversion model that comprises a plurality of layers, where the neural network is configured to receive, as an input a set of features of a received prefix from a search, a candidate term corresponding to a candidate category within the taxonomy of items, and is configured to generate a predicted probability of an item within the candidate category of the taxonomy of items corresponding to the candidate term being included in an order received by the online concierge system based on the prefix (Sen, see at least: Col. 13 Ln. 58-61, Col. 9 Ln. 3-8, Col. 9 Ln. 28-66 & Col. 10 Ln. 1-3, Col. 3 Ln. 47-64 and Col. 3 Ln. 23-32); and training the conversion model to, for a set of candidate terms input to the conversion model, generate a predicted probability for each candidate term that an item within a candidate category within the taxonomy corresponding to each candidate term would be included in an order if displayed as a candidate suggestion for a prefix to a search input by a user (Sen, see at least: Col. 9 Ln. 28-66, Col. 3 Ln. 25-38, Col. 3 Ln. 43-66 and Figs. 2-3) by, for each of the plurality of the examples of the training data: applying the neural network to the combination of the candidate term, the prefix received in the prior search, and the corresponding set of features of the combination to generate a predicted probability of an item within the candidate category corresponding to the candidate term being included in an order received by the online concierge system (Sen, see at least: Col. 9 Ln. 28-66, Col. 9 Ln. 3-8, Col. 3 Ln. 43-66 and Figs. 2-3); backpropagating one or more error terms obtained from one or more loss functions to update a set of parameters of the neural network, the backpropagating performed through the neural network and the one or more of the error terms based on a difference between the label applied to an example and the generated predicted probability (Sen, see at least: Col. 15 Ln. 12-21, Col. 5 Ln. 4-10, Col. 9 Ln. 41-66 and Col. 3 Ln. 25-36); and storing the set of parameters of the layers of the network on the computer-readable storage medium as parameters of the conversion model (Sen, see at least: Col. 15 Ln. 12-21 and Col. 10 Ln. 43-48).
Sen is deficient in a number of ways. As written, the claims require a set of input features comprising features for a warehouse, orders for items being from the warehouse, a category of items within the taxonomy of items represents a set of items that are related within the taxonomy of items; each category of items being within the taxonomy of items; determining that no item in the set of items of the received order is associated with the category associated with the selected candidate term; and generating the example for the selected candidate term, wherein the example comprises a label indicating that no item of the category corresponding to the selected candidate term was included in the order; the neural network being configured to receive, as an input a set of features of a warehouse, and an order being received for the warehouse; a combination being of the warehouse, the candidate term, the prefix; the candidate category comprising a plurality of items that are related in a taxonomy; and stopping the backpropagation after the one or more loss functions satisfy one or more criteria.
Regarding Hinegardner, Hinegardner teaches a set of input features comprising features for a warehouse and orders for items being from the warehouse (Hinegardner, see at least: Col. 4 Ln. 58-67 & Col. 5 Ln. 1 and Col. 7 Ln. 44-51); the neural network being configured to receive, as an input a set of features of a warehouse, and an order being received for the warehouse (Hinegardner, see at least: Col. 4 Ln. 58-67 & Col. 5 Ln. 1-11 and Col. 7 Ln. 44-51); and a combination being of the warehouse, the candidate term, the prefix (Hinegardner, see at least: Col. 4 Ln. 58-67 & Col. 5 Ln. 1).
Though disclosing these features, Hinegardner does not disclose or render obvious the remaining features noted above as missing from Sen.
Regarding Cheng, Cheng teaches a category of items within the taxonomy of items represents a set of items that are related within the taxonomy of items (Cheng, see at least: Col. 14 Ln. 42-49); each category of items being within the taxonomy of items (Cheng, see at least: Col. 15 Ln. 48-61); the candidate category comprising a plurality of items that are related in a taxonomy (Cheng, see at least: Col. 14 Ln. 42-49); and stopping the backpropagation after the one or more loss functions satisfy one or more criteria (Cheng, see at least: Col. 7 Ln. 47-57 and Col. 12 Ln. 9-19).
Though disclosing these features, Cheng does not disclose or render obvious the remaining features discussed above.
Regarding Achan, Achan teaches utilizing both positive and negative historical samples of conversions in training data (Achan, see at least: [0048]).
Though disclosing this feature, Achan does not disclose or render obvious that at least one example of the plurality of examples is generated by: determining that no item in the set of items of the received order is associated with the category associated with the selected candidate term; and generating the example for the selected candidate term, wherein the example comprises a label indicating that no item of the category corresponding to the selected candidate term was included in the order.
Ultimately, the particular combination of limitations as claimed is neither anticipated nor rendered obvious in view of Sen, Hinegardner, Cheng, and Achan, or the totality of the prior art. While certain references may disclose more general concepts and parts of the claims, the available prior art does not specifically disclose the particular combination of these limitations.
Sen, Hinegardner, Cheng, and Achan, however, do not teach or suggest, alone or in combination, the claimed invention. The Examiner emphasizes that the prior art/additional art could only be combined, and the claims deemed obvious, based on knowledge gleaned from Applicant's disclosure. Such a reconstruction is improper (i.e., hindsight reasoning). See In re McLaughlin, 443 F.2d 1392, 170 USPQ 209 (CCPA 1971).
Cited NPL reference U (cited 02/20/2026 on PTO-892) teaches convolutional neural networks adapted for a hierarchical text classification task, but does not teach or suggest the recited limitations.
The Examiner further emphasizes that, considering the claims as a whole, the totality of the evidence fails to set forth, either explicitly or implicitly, an appropriate rationale for further modifying the evidence at hand to arrive at the claimed invention. The combination of features as claimed would not have been obvious to one of ordinary skill in the art, as combining various references from the totality of the evidence to reach the claimed combination of features would require a substantial reconstruction of Applicant's claimed invention relying on improper hindsight bias.
It is thereby asserted by the Examiner that, in light of the above and further deliberation over all of the evidence at hand, the claims are allowable over the prior art, as the evidence at hand does not anticipate the claims, and no further modification of the references to arrive at the claimed invention would have been obvious to a person of ordinary skill in the art.
Response to Arguments
Rejections under 35 U.S.C. §101
Applicant argues that the claimed invention recites a technical improvement by providing specific steps for how to generate data for a machine-learning model ("It is clear to persons of ordinary skill in the art how important training data is to machine-learning models. The quality of the data supplied is crucial for these systems to function effectively. Essentially, every learning system is only as proficient as the data it is fed.") (emphasis added). Applicant asserts that, as is clear from these sources, ensuring training data of sufficient quality and quantity is at least as important, if not more important, than new developments in the structure of a machine-learning model. Applicant argues that the claimed invention provides a process for how to generate training data for a machine-learning model for improving an auto-completion process for search queries. For example, the claims recite detailed steps on how training examples are generated. Applicant contends that the recited limitations are not merely the traditional steps for generating training data; instead, they are specific steps for ensuring the training data has the relevant labels to achieve the outcome needed for integrating the machine-learning model into the broader system (Remarks, pages 11-13).
Examiner respectfully disagrees. Improving data is not a technical improvement, as it does not reflect an improvement in the functioning of a computer or an improvement to another technology or technical field. This is further supported by the CloudFactory article cited by Applicant, which describes how all machine-learning models rely on training data and the importance of the quality of that training data. Improving the quality of data does not improve the machine-learning technology itself. Accordingly, the amended claims remain ineligible.
Applicant further argues that claim 14 has been amended to specify how the claimed conversion model is used to arrange the candidate terms within a graphical user interface. Specifically, Applicant states that the candidate terms are displayed "below a user interface element in which the user input the prefix" and that "the candidate terms are displayed such that candidate terms with higher predicted probabilities are displayed above candidate terms with lower predicted probabilities," where these predicted probabilities are generated by applying the conversion model to the relevant input data. Applicant further cites paragraph [0018] of the specification (Remarks, pages 13-14).
Examiner respectfully disagrees. Merely displaying ranked data below a user interface element in a GUI does not reflect an improvement in the functioning of a computer or an improvement to another technology or technical field (i.e., the interface itself is not improved; it is merely utilized). The recitation of additional elements such as the GUI and the user interface element amounts to nothing more than mere instructions to implement or apply the abstract idea on generic computing hardware (or, merely using a computer as a tool to perform an abstract idea). Additionally, allowing a user to more easily identify, and to select, terms for inclusion in a search query that correspond to items more likely to be included in an order, as well as allowing users to more efficiently generate orders, are business improvements, not technological improvements. Accordingly, the amended claims remain ineligible.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
-Mun et al. (US 11,379,900 B1) teaches identifying a category of a search query based on a search term.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ARIELLE E WEINER whose telephone number is (571)272-9007. The examiner can normally be reached M-F 8:30-5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Maria-Teresa (Marissa) Thein can be reached at 571-272-6764. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ARIELLE E WEINER/ Primary Examiner, Art Unit 3689