DETAILED ACTION
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination
2. A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 24 December 2025 [hereinafter Response] has been entered, where:
Claims 2-10, and 18 have been amended.
Claim 1 has been cancelled.
Claims 2-21 are pending.
Claims 2-21 are rejected.
Claim Rejections - 35 U.S.C. § 101
3. 35 U.S.C. § 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
4. Claims 2-21 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to an abstract idea without significantly more.
Claim 2 recites a “computer-implemented method,” which is a process and thus one of the statutory categories of patentable subject matter. (35 U.S.C. § 101).
However, under Step 2A Prong One, the claim recites the limitation of “for each training item, implementing . . . label smoothing regularization by applying a smoothing label distribution to the initial target label distribution to obtain a modified target label distribution.” This limitation of “applying” recites a mental process, (MPEP § 2106.04(a)(2) sub III), which is one of the groupings of abstract ideas. (MPEP § 2106.04(a)(2)). Also, the claim recites “training . . . to minimize a loss function,” which is a mathematical concept, (MPEP § 2106.04(a)(2) sub I), and is one of the groupings of abstract ideas. The claim recites more details or specifics to the abstract idea of “applying,” in that “the smoothing label distribution is independent of the training item,” and accordingly, is merely more specific to the abstract idea. Thus, claim 2 recites an abstract idea.
Under Step 2A Prong Two, the claim as a whole is not integrated into a practical application, because the additional elements recited in the claim beyond the identified judicial exception include “one or more computing devices” and a “neural network,” each of which is recited at a high level of generality and thus is a generic computer component used to implement the abstract idea, (MPEP § 2106.05(f)), that does not integrate the abstract idea into a practical application. The claim also recites the limitation of “training . . . the neural network to adjust values of parameters of the neural network.” The limitation of “training” is the use of the generic computer component (neural network) in the expected manner, which does not serve to integrate the abstract idea into a practical application. (MPEP § 2106.05(f)). The claim recites more details or specifics of the additional element of “training,” in that “the training comprises minimizing a loss function based on (1) a respective network output generated by the neural network for each of the plurality of training items and (2) the respective modified target label distribution for each training item,” and accordingly, is merely more specific to the additional element.
The claim further recites the limitation of “obtaining a plurality of training items,” which is an insignificant extra-solution activity of mere data gathering, (MPEP § 2106.05(g)), that does not integrate the abstract idea into a practical application. The claim recites further details or specifics to the additional element of “obtaining,” in that “each training item is associated with an initial target label distribution that specifies a respective target score for each label in a predetermined set of multiple labels,” and accordingly, is merely more specific to the additional element. Therefore, claim 2 is directed to the abstract idea.
Finally, under Step 2B, the additional elements, taken alone or in combination, do not represent significantly more than the abstract idea itself. The additional elements recited in the claim beyond the identified judicial exception include “one or more computing devices” and a “neural network,” which are generic computer components used to implement the abstract idea, (MPEP § 2106.05(f)), and do not amount to significantly more than the abstract idea. The claim also recites the limitation of “training . . . the neural network to adjust values of parameters of the neural network.” The limitation of “training” is the use of the generic computer component (neural network) in the expected manner, which does not amount to significantly more than the abstract idea. (MPEP § 2106.05(f)). The claim recites more details or specifics of the additional element of “training,” in that “the training comprises minimizing a loss function based on (1) a respective network output generated by the neural network for each of the plurality of training items and (2) the respective modified target label distribution for each training item,” and accordingly, is merely more specific to the additional element. The claim further recites the limitation of “obtaining a plurality of training items,” which is a well-understood, routine, and conventional activity of storing and retrieving information in memory, (MPEP § 2106.05(d) sub II.iv), that does not amount to significantly more than the abstract idea. The claim recites further details or specifics to the additional element of “obtaining,” in that “each training item is associated with an initial target label distribution that specifies a respective target score for each label in a predetermined set of multiple labels,” and accordingly, is merely more specific to the additional element. Therefore, claim 2 is subject-matter ineligible.
Claim 10 recites a “system,” which is a product and thus one of the statutory categories of patentable subject matter. (35 U.S.C. § 101).
However, under Step 2A Prong One, the claim recites the limitation of “for each training item, implementing . . . label smoothing regularization by applying a smoothing label distribution to the initial target label distribution to obtain a modified target label distribution.” This limitation of “applying” recites a mental process, (MPEP § 2106.04(a)(2) sub III), which is one of the groupings of abstract ideas. (MPEP § 2106.04(a)(2)). Also, the claim recites “training . . . to minimize a loss function,” which is a mathematical concept, (MPEP § 2106.04(a)(2) sub I), and is one of the groupings of abstract ideas. The claim recites more details or specifics to the abstract idea of “applying,” in that “the smoothing label distribution is independent of the training item,” and accordingly, is merely more specific to the abstract idea. Thus, claim 10 recites an abstract idea.
Under Step 2A Prong Two, the claim as a whole is not integrated into a practical application, because the additional elements recited in the claim beyond the identified judicial exception include “one or more data processing apparatus” and “one or more memory devices storing instructions,” which are generic computer components used to implement the abstract idea, (MPEP § 2106.05(f)), and do not integrate the abstract idea into a practical application. The claim also recites a “neural network,” which is recited at such a level of generality that it is a generic computer component that does not integrate the abstract idea into a practical application. (MPEP § 2106.05(f)). The claim also recites the limitation of “training the neural network to adjust values of parameters of the neural network.” The limitation of “training” is the use of the generic computer components (neural network, one or more data processing apparatus, one or more memory devices) in the expected and usual manner, which does not serve to integrate the abstract idea into a practical application. (MPEP § 2106.05(f)). The claim recites more details or specifics of the additional element of “training,” in that “the training comprises minimizing a loss function based on (1) a respective network output generated by the neural network for each of the plurality of training items and (2) the respective modified target label distribution for each training item,” and accordingly, is merely more specific to the additional element.
The claim further recites the limitation of “obtaining a plurality of training items,” which is an insignificant extra-solution activity of mere data gathering, (MPEP § 2106.05(g)), that does not integrate the abstract idea into a practical application. The claim recites further details or specifics to the additional element of “obtaining,” in that “each training item is associated with an initial target label distribution that specifies a respective target score for each label in a predetermined set of multiple labels,” and accordingly, is merely more specific to the additional element. Therefore, claim 10 is directed to the abstract idea.
Finally, under Step 2B, the additional elements, taken alone or in combination, do not represent significantly more than the abstract idea itself. The additional elements recited in the claim beyond the identified judicial exception include “one or more data processing apparatus” and “one or more memory devices storing instructions,” which are generic computer components used to implement the abstract idea, (MPEP § 2106.05(f)), and do not amount to significantly more than the abstract idea. The claim also recites a “neural network,” which is recited at such a level of generality that it is a generic computer component that does not amount to significantly more than the abstract idea. (MPEP § 2106.05(f)). The claim also recites the limitation of “training the neural network to adjust values of parameters of the neural network.” The limitation of “training” is the use of the generic computer components (neural network, one or more data processing apparatus, one or more memory devices) in the expected and usual manner, which does not amount to significantly more than the abstract idea. (MPEP § 2106.05(f)). The claim recites more details or specifics of the additional element of “training,” in that “the training comprises minimizing a loss function based on (1) a respective network output generated by the neural network for each of the plurality of training items and (2) the respective modified target label distribution for each training item,” and accordingly, is merely more specific to the additional element. The claim further recites the limitation of “obtaining a plurality of training items,” which is a well-understood, routine, and conventional activity of storing and retrieving information in memory, (MPEP § 2106.05(d) sub II.iv), that does not amount to significantly more than the abstract idea.
The claim recites further details or specifics to the additional element of “obtaining,” in that “each training item is associated with an initial target label distribution that specifies a respective target score for each label in a predetermined set of multiple labels,” and accordingly, is merely more specific to the additional element. Therefore, claim 10 is subject-matter ineligible.
Claim 18 recites a “non-transitory computer-readable medium,” which is a product and thus one of the statutory categories of patentable subject matter. (35 U.S.C. § 101).
However, under Step 2A Prong One, the claim recites the limitation of “for each training item, implementing . . . label smoothing regularization by applying a smoothing label distribution to the initial target label distribution to obtain a modified target label distribution.” This limitation of “applying” recites a mental process, (MPEP § 2106.04(a)(2) sub III), which is one of the groupings of abstract ideas. (MPEP § 2106.04(a)(2)). Also, the claim recites “training . . . to minimize a loss function,” which is a mathematical concept, (MPEP § 2106.04(a)(2) sub I), and is one of the groupings of abstract ideas. The claim recites more details or specifics to the abstract idea of “applying,” in that “the smoothing label distribution is independent of the training item,” and accordingly, is merely more specific to the abstract idea. Thus, claim 18 recites an abstract idea.
Under Step 2A Prong Two, the claim as a whole is not integrated into a practical application, because the additional elements recited in the claim beyond the identified judicial exception include a “non-transitory computer-readable medium storing software comprising instructions executable by one or more computers,” which are generic computer components used to implement the abstract idea, (MPEP § 2106.05(f)), and do not integrate the abstract idea into a practical application. The claim also recites a “neural network,” which is recited at such a level of generality that it is a generic computer component that does not integrate the abstract idea into a practical application. (MPEP § 2106.05(f)). The claim also recites the limitation of “training the neural network to adjust values of parameters of the neural network.” The limitation of “training” is the use of the generic computer components (neural network, non-transitory computer-readable medium, one or more computers) in the expected and usual manner, which does not serve to integrate the abstract idea into a practical application. (MPEP § 2106.05(f)). The claim recites more details or specifics of the additional element of “training,” in that “the training comprises minimizing a loss function based on (1) a respective network output generated by the neural network for each of the plurality of training items and (2) the respective modified target label distribution for each training item,” and accordingly, is merely more specific to the additional element.
The claim further recites the limitation of “obtaining a plurality of training items,” which is an insignificant extra-solution activity of mere data gathering, (MPEP § 2106.05(g)), that does not integrate the abstract idea into a practical application. The claim recites further details or specifics to the additional element of “obtaining,” in that “each training item is associated with an initial target label distribution that specifies a respective target score for each label in a predetermined set of multiple labels,” and accordingly, is merely more specific to the additional element. Therefore, claim 18 is directed to the abstract idea.
Finally, under Step 2B, the additional elements, taken alone or in combination, do not represent significantly more than the abstract idea itself. The additional elements recited in the claim beyond the identified judicial exception include a “non-transitory computer-readable medium storing software comprising instructions executable by one or more computers,” which are generic computer components used to implement the abstract idea, (MPEP § 2106.05(f)), and do not amount to significantly more than the abstract idea. The claim also recites a “neural network,” which is recited at such a level of generality that it is a generic computer component that does not amount to significantly more than the abstract idea. (MPEP § 2106.05(f)). The claim also recites the limitation of “training the neural network to adjust values of parameters of the neural network.” The limitation of “training” is the use of the generic computer components (neural network, non-transitory computer-readable medium, one or more computers) in the expected and usual manner, which does not amount to significantly more than the abstract idea. (MPEP § 2106.05(f)). The claim recites more details or specifics of the additional element of “training,” in that “the training comprises minimizing a loss function based on (1) a respective network output generated by the neural network for each of the plurality of training items and (2) the respective modified target label distribution for each training item,” and accordingly, is merely more specific to the additional element. The claim further recites the limitation of “obtaining a plurality of training items,” which is a well-understood, routine, and conventional activity of storing and retrieving information in memory, (MPEP § 2106.05(d) sub II.iv), that does not amount to significantly more than the abstract idea.
The claim recites further details or specifics to the additional element of “obtaining,” in that “each training item is associated with an initial target label distribution that specifies a respective target score for each label in a predetermined set of multiple labels,” and accordingly, is merely more specific to the additional element. Therefore, claim 18 is subject-matter ineligible.
Claims 3-5 depend directly or indirectly from claim 2. Claims 11-13 depend directly or indirectly from claim 10. Claim 19 depends from claim 18. The claims provide more details or specifics to the abstract idea of “modifying,” (claims 3 and 11: “wherein the smoothing label distribution specifies a smoothing score for each label in the predetermined set of multiple labels”; claims 4 and 12: “wherein the smoothing label distribution is a uniform distribution that specifies a same smoothing score for each label in the predetermined set of multiple labels”; claims 5 and 13: “wherein the smoothing label distribution is a non-uniform distribution of smoothing scores for the predetermined set of multiple labels that specifies a smoothing score for at least one label in the predetermined set of multiple labels that is different from a smoothing score for at least one other label in the predetermined set of multiple labels”; claim 19: “wherein the smoothing label distribution specifies a smoothing score for each label in the predetermined set of multiple labels and wherein the smoothing label distribution is either a uniform distribution or a non-uniform distribution”). Accordingly, the claims are merely more specific to the abstract idea. The abstract idea of these claims is not integrated into a practical application, (see MPEP § 2106.05(g)), nor do the claims amount to significantly more than the abstract idea, (MPEP § 2106.05(d)), because they recite no more than the abstract idea. Therefore, claims 3-5, 11-13, and 19 are subject-matter ineligible.
Claims 6-8 depend directly or indirectly from claim 2. Claims 14-16 depend directly or indirectly from claim 10. Claims 20 and 21 depend directly or indirectly from claim 18. The claims provide more details or specifics to the abstract idea of “modifying,” (claims 6 and 14: “wherein modifying the initial target label distribution using the smoothing label distribution comprises combining the initial target label distribution with the smoothing label distribution”; claims 7 and 15: “wherein combining the initial target label distribution with the smoothing label distribution comprises: calculating a weighted sum of the initial target label distribution and the smoothing label distribution”; claims 8 and 16: “wherein calculating the weighted sum of the initial target label distribution and the smoothing label distribution, comprises: calculating a sum of a first term and a second term, wherein a first term is obtained by applying a weight w to the smoothing label distribution and wherein the second term is obtained by applying a weight 1−w to the initial target label distribution”; claim 20: “wherein modifying the initial target label distribution using the smoothing label distribution comprises combining the initial target label distribution with the smoothing label distribution, including calculating a weighted sum of the initial target label distribution and the smoothing label distribution”; claim 21: “wherein calculating the weighted sum of the initial target label distribution and the smoothing label distribution, comprises: calculating a sum of a first term and a second term, wherein a first term is obtained by applying a weight w to the smoothing label distribution and wherein the second term is obtained by applying a weight 1−w to the initial target label distribution”). Accordingly, the claims are merely more specific to the abstract idea.
The abstract idea of these claims is not integrated into a practical application, (see MPEP § 2106.05(g)), nor do the claims amount to significantly more than the abstract idea, (MPEP § 2106.05(d)), because they recite no more than the abstract idea. Therefore, claims 6-8, 14-16, 20, and 21 are subject-matter ineligible.
Claim 9 depends directly or indirectly from claim 2. Claim 17 depends directly or indirectly from claim 10. The claims provide more details or specifics to the additional element of “obtaining,” (claims 9 and 17: “wherein, for each training item: the target score for a known label for the training item is assigned a predetermined positive value in the initial target label distribution for the training item, and the target score for each label other than the known label is set to 0 in the initial target label distribution”). Accordingly, the claims are merely more specific to the additional element. Further, the abstract idea of these claims is not integrated into a practical application, (see MPEP § 2106.05(g)), nor do the claims amount to significantly more than the abstract idea, (MPEP § 2106.05(d)), because they recite no more than the abstract idea. Therefore, claims 9 and 17 are subject-matter ineligible.
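For context only, the label smoothing regularization recited in the claims can be sketched as follows. The sketch assumes a one-hot initial target label distribution (cf. claims 9 and 17), a uniform smoothing label distribution (cf. claims 4 and 12), and the weighted sum of claims 8, 16, and 21; the number of labels, the weight w = 0.1, the function names, and the stand-in network output are illustrative assumptions that do not appear in the claims.

```python
import math

def smooth_labels(initial, smoothing, w):
    # Weighted sum recited in claims 8, 16, and 21: a first term applies
    # weight w to the smoothing label distribution, and a second term
    # applies weight 1 - w to the initial target label distribution.
    return [w * s + (1.0 - w) * t for s, t in zip(smoothing, initial)]

# Initial target label distribution (cf. claims 9 and 17): the known label
# is assigned a predetermined positive value; every other label is 0.
num_labels = 4
initial = [0.0] * num_labels
initial[2] = 1.0

# Uniform smoothing label distribution (cf. claims 4 and 12), which is
# independent of any particular training item.
uniform = [1.0 / num_labels] * num_labels

modified = smooth_labels(initial, uniform, w=0.1)

# A cross-entropy loss of the kind the claims recite minimizing, computed
# against the modified target label distribution for one training item;
# `predicted` is a made-up stand-in for a network output.
predicted = [0.1, 0.1, 0.7, 0.1]
loss = -sum(t * math.log(p) for t, p in zip(modified, predicted))
```

As illustrated, the modified target label distribution remains a valid probability distribution while giving every label a nonzero target score.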
Response to Arguments
5. Examiner has fully considered Applicant’s arguments, and responds below accordingly.
Non-Statutory Obviousness-Type Double Patenting
6. Applicant advises of “filing a terminal disclaimer in compliance with 37 C.F.R. § 1.321(c) with this reply. . . . Withdrawal of the obviousness-type double-patenting rejection is respectfully requested.” (Response at p. 7).
Examiner’s Response:
Examiner agrees. In view of the terminal disclaimer submitted and approved 27 February 2025, the rejection based on non-statutory obviousness-type double patenting is WITHDRAWN.
Claim Rejections – 35 U.S.C. § 101
7. “Claims 2-21 were rejected under 35 U.S.C. § 101 as allegedly reciting non-patentable subject matter. Applicant respectfully submits that claims 2-21 as presently amended provide a technical solution to a specific technical improvement in the field of machine learning.
While Applicant respectfully disagrees with the rejections in the Advisory Action, Applicant has amended the claim to recite
"for each training item, implementing, by the one or more computing devices, label smoothing regularization by applying a smoothing label distribution to the initial target label distribution to obtain a modified target label distribution, wherein the smoothing label distribution is independent of the training item."
[(see claim 1, lines 6-10 (emphasis by Applicant))] The amended claim feature thus recites a specific label-smoothing regularization process performed by a computing device, which involves a programmatic transformation where a smoothing label distribution, which is independent of the training item, is applied to the initial target label distribution, to generate a ‘modified target label distribution.’ This statistical adjustment to high-dimensional data across a ‘plurality’ of training items is a computational process for the underlying machine learning training optimization. See Application, [0004], [0021], [0053].” (Response at p. 7).
Also, Applicant submits that “Moreover, MPEP § 2106.05(a), citing Ex Parte Desjardins, Appeal No. 2024-000567 (PTAB September 26, 2025, Appeals Review Panel Decision) (precedential), states that ‘Examiners and panels should not evaluate claims at such a high level of generality that potentially meaningful technical limitations are dismissed without adequate explanation’ and ‘[w]hen evaluating a claim as a whole, examiners should not dismiss additional elements as mere “generic computer components” without considering whether such elements confer a technological improvement to a technical problem, especially as to improvements to computer components or the computer system.’ Applicant respectfully submits that the Examiner's reasoning in the Advisory Action is exactly the type of reasoning that this portion of the MPEP says to avoid.” (Response at p. 8).
Examiner’s Response:
Examiner respectfully disagrees because the additional elements of the claim (“one or more computing devices,” a “neural network”) are recited at a high level of generality that does not serve to integrate the abstract idea into a practical application: the additional elements do not apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on it, such that the claim is more than a drafting effort designed to monopolize or preempt the judicial exception. (2024 SME Guidance, 89 Fed. Reg. 137 at p. 58136 (17 July 2024)).
Moreover, because of this high level of generality, the claim must be evaluated as a whole: the limitations reciting the abstract idea and the additional elements beyond the abstract idea are considered together to determine whether the claim integrates the abstract idea into a practical application. (Id.). As set out above in detail, the generic computer components (“one or more computing devices,” a “neural network”) are merely used to implement the abstract idea, (MPEP § 2106.05(f)), which does not serve to integrate the abstract idea into a practical application.
Accordingly, the pending claims recite an abstract idea that is not integrated into a practical application, and the rejection under 35 U.S.C. § 101 is maintained.
Conclusion
8. The prior art made of record and not relied upon is considered pertinent to Applicant's disclosure:
(US Published Application 20150019463 to Simard et al.) teaches that to work well, the Active Labeling Exploration (ALE) algorithm needs a few labels, a few features, and good generalization properties of the early classifiers. Both positive and negative examples are needed, as well as startup features. Also, the classifier needs to be heavily regularized to avoid over-training. Regularization needs to be adjusted automatically so that the complexity of the algorithm can be matched to the increasing number of labels.
(Simard et al., “ICE: Enabling Non-Experts to Build Models Interactively for Large-Scale Lopsided Problems,” arXiv (2014)) teaches interactive labeling, interactive featuring, and the user interface of ICE. We designed our system to allow a single teacher to train models interactively, and our description emphasizes this bias. As we describe below, however, we designed the system to encourage teachers to import others’ models and features into their own sessions to be used as features. This functionality means that the value from a community of teachers on ICE can be much more than the sum of the value from each individual. We also envision ICE supporting cooperative teaching scenarios, where multiple teachers contribute to the same learning task.
9. Any inquiry concerning this communication or earlier communications from the Examiner should be directed to KEVIN L. SMITH, whose telephone number is (571) 272-5964. Normally, the Examiner is available Monday-Thursday, 0730-1730.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the Examiner by telephone are unsuccessful, the Examiner’s supervisor, KAKALI CHAKI can be reached on 571-272-3719. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/K.L.S./
Examiner, Art Unit 2122
/KAKALI CHAKI/Supervisory Patent Examiner, Art Unit 2122