Prosecution Insights
Last updated: April 19, 2026
Application No. 18/129,167

METHOD AND SYSTEM FOR KNOWLEDGE TRANSFER BETWEEN DIFFERENT ML MODEL ARCHITECTURES

Status: Non-Final OA (§101)
Filed: Mar 31, 2023
Examiner: STARKS, WILBERT L
Art Unit: 2122
Tech Center: 2100 (Computer Architecture & Software)
Assignee: Infosys Limited
OA Round: 1 (Non-Final)

Grant Probability: 76% (Favorable)
Predicted OA Rounds: 1-2
Predicted Time to Grant: 3y 6m
Grant Probability With Interview: 80%

Examiner Intelligence

Career Allow Rate: 76% (493 granted / 653 resolved), +20.5% vs TC avg (above average)
Interview Lift: +4.4% across resolved cases with interview (minimal)
Typical Timeline: 3y 6m avg prosecution; 47 applications currently pending
Career History: 700 total applications across all art units

Statute-Specific Performance

§101: 40.3% (+0.3% vs TC avg)
§103: 13.1% (-26.9% vs TC avg)
§102: 35.7% (-4.3% vs TC avg)
§112: 6.0% (-34.0% vs TC avg)

TC comparisons use a Tech Center average estimate. Based on career data from 653 resolved cases.
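The headline examiner metrics above are simple ratios and differences over the reported counts. As a quick sanity check (an illustrative sketch; the variable names are ours, not from any analytics API):

```python
# Recompute the dashboard's examiner metrics from the reported raw counts.

def pct(numerator: float, denominator: float) -> float:
    """Percentage rounded to one decimal place."""
    return round(100.0 * numerator / denominator, 1)

granted, resolved = 493, 653                 # career totals from the report
career_allow_rate = pct(granted, resolved)   # 75.5, displayed as 76%

# The report states the rate is +20.5 points vs the Tech Center average,
# implying a TC average estimate of roughly:
tc_avg_estimate = round(career_allow_rate - 20.5, 1)   # 55.0

# Predicted grant probability for this application, without and with interview:
base_prob, interview_prob = 76.0, 80.0
predicted_lift = interview_prob - base_prob  # 4.0 points predicted here;
                                             # the examiner's historical
                                             # interview lift is +4.4
```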

Office Action

§101
DETAILED ACTION

Claims 1-17 have been examined.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 U.S.C. § 101

35 U.S.C. § 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

The invention, as taught in Claims 1-17, is directed to “mental steps” and “mathematical steps” without significantly more. The claims recite:
• generating,…, a set of class probabilities for an unlabelled dataset based on a labelling function
• unlabelled dataset
• associated with the primary ML model
• unlabelled dataset and the associated set of class probabilities
• secondary ML model

Claim 1

Step 1 inquiry: Does this claim fall within a statutory category? The preamble of the claim recites “1. A method for managing knowledge of a primary Machine Learning (ML) model, the method comprising:…” Therefore, it is a “method” (or “process”), which is a statutory category of invention. Therefore, the answer to the inquiry is: “YES.”

Step 2A (Prong One) inquiry: Are there limitations in Claim 1 that recite abstract ideas? YES. The following limitations in Claim 1 recite abstract ideas that fall within at least one of the groupings of abstract ideas enumerated in the 2019 PEG.
Specifically, they are “mental steps” and “mathematical steps”:
• generating,…, a set of class probabilities for an unlabelled dataset based on a labelling function
• unlabelled dataset
• associated with the primary ML model
• unlabelled dataset and the associated set of class probabilities
• secondary ML model

Step 2A (Prong Two) inquiry: Are there additional elements or a combination of elements in the claim that apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that it is more than a drafting effort designed to monopolize the exception?

Applicant’s claims contain the following “additional elements”:
(1) A “computing system”
(2) A “primary ML model employs a first ML model architecture” / “secondary ML model” / “first ML model”
(3) A “transferring, by the computing system” / “knowledge transfer technique”

A “computing system” is a broad term which is described at a high level and includes general purpose computers. M.P.E.P. § 2106.04(d)(I) recites:

The courts have also identified limitations that did not integrate a judicial exception into a practical application:
• Merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea, as discussed in MPEP § 2106.05(f);
• Adding insignificant extra-solution activity to the judicial exception, as discussed in MPEP § 2106.05(g); and
• Generally linking the use of a judicial exception to a particular technological environment or field of use, as discussed in MPEP § 2106.05(h).

This “computing system” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
A “primary ML model employs a first ML model architecture” / “secondary ML model” / “first ML model” is a broad term which is described at a high level. Applicant’s Claim 1 merely recites the claimed “ML model,” which is an “additional element”. The ML model is not used to calculate anything at all. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, 2106.05(a)(II).) This limitation therefore does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).

A “transferring, by the computing system” / “knowledge transfer technique” is a broad term for merely applying a computer, which is described at a high level. M.P.E.P. § 2106.05(d)(II) recites:

The courts have recognized the following computer functions as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity.
i. Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc., v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); …

M.P.E.P. § 2106.05(f)(2) recites in part:

(2) Whether the claim invokes computers or other machinery merely as a tool to perform an existing process.
Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015). In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field.

TLI Communications provides an example of a claim invoking computers and other machinery merely as a tool to perform an existing process. The court stated that the claims describe steps of recording, administration and archiving of digital images, and found them to be directed to the abstract idea of classifying and storing digital images in an organized manner. 823 F.3d at 612, 118 USPQ2d at 1747. The court then turned to the additional elements of performing these functions using a telephone unit and a server and noted that these elements were being used in their ordinary capacity (i.e., the telephone unit is used to make calls and operate as a digital camera including compressing images and transmitting those images, and the server simply receives data, extracts classification information from the received data, and stores the digital images based on the extracted information). 823 F.3d at 612-13, 118 USPQ2d at 1747-48. In other words, the claims invoked the telephone unit and server merely as tools to execute the abstract idea. Thus, the court found that the additional elements did not add significantly more to the abstract idea because they were simply applying the abstract idea on a telephone network without any recitation of details of how to carry out the abstract idea.

This “transferring, by the computing system” / “knowledge transfer technique” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)). The answer to the inquiry is “NO”: no additional elements integrate the claimed abstract idea into a practical application.

Step 2B inquiry: Does the claim provide an inventive concept, i.e., does the claim recite additional element(s) or a combination of elements that amount to significantly more than the judicial exception in the claim?

Applicant’s claims contain the following “additional elements”:
(1) A “computing system”
(2) A “primary ML model employs a first ML model architecture” / “secondary ML model” / “first ML model”
(3) A “transferring, by the computing system” / “knowledge transfer technique”

A “computing system” is a broad term which is described at a high level and includes general purpose computers. M.P.E.P. § 2106.05(I)(A)(i-ii) recites:

Limitations that the courts have found not to be enough to qualify as “significantly more” when recited in a claim with a judicial exception include:
i. Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, e.g., a limitation indicating that a particular function such as creating and maintaining electronic records is performed by a computer, as discussed in Alice Corp., 573 U.S. at 225-26, 110 USPQ2d at 1984 (see MPEP § 2106.05(f));
ii. Simply appending well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception, e.g., a claim to an abstract idea requiring no more than a generic computer to perform generic computer functions that are well-understood, routine and conventional activities previously known to the industry, as discussed in Alice Corp., 573 U.S. at 225, 110 USPQ2d at 1984 (see MPEP § 2106.05(d));

Further, M.P.E.P. § 2106.05(f) recites:

2106.05(f) Mere Instructions To Apply An Exception [R-10.2019]
Another consideration when determining whether a claim integrates a judicial exception into a practical application in Step 2A Prong Two or recites significantly more than a judicial exception in Step 2B is whether the additional elements amount to more than a recitation of the words “apply it” (or an equivalent) or are more than mere instructions to implement an abstract idea or other exception on a computer. As explained by the Supreme Court, in order to make a claim directed to a judicial exception patent-eligible, the additional element or combination of elements must do “‘more than simply stat[e] the [judicial exception] while adding the words ‘apply it’”. Alice Corp. v. CLS Bank, 573 U.S. 208, 221, 110 USPQ2d 1976, 1982-83 (2014) (quoting Mayo Collaborative Servs. v. Prometheus Labs., Inc., 566 U.S. 66, 72, 101 USPQ2d 1961, 1965).
Thus, for example, claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible. Alice Corp., 573 U.S. at 223, 110 USPQ2d at 1983. See also 573 U.S. at 224, 110 USPQ2d at 1984 (warning against a § 101 analysis that turns on “the draftsman’s art”). Further, M.P.E.P. § 2106.05(f)(2), reproduced above, asks whether the claim invokes computers or other machinery merely as a tool to perform an existing process.

Applicant's Specification, [073] recites:

[073] The disclosed methods and systems may be implemented on a conventional or a general-purpose computer system, such as a personal computer (PC) or server computer. FIG. 6 is a block diagram that illustrates a system architecture 600 of a computer system 601 for managing knowledge of a primary ML model, in accordance with an exemplary embodiment of the present disclosure. Variations of computer system 601 may be used for implementing server 101 for determination of personality traits of agents in a contact center. Computer system 601 may include a central processing unit ("CPU" or "processor") 602. Processor 602 may include at least one data processor for executing program components for executing user-generated or system-generated requests. A user may include a person, a person using a device such as those included in this disclosure, or such a device itself. The processor may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc. The processor may include a microprocessor, such as AMD® ATHLON®, DURON® OR OPTERON®, ARM's application, embedded or secure processors, IBM® POWERPC®, INTEL® CORE® processor, ITANIUM® processor, XEON® processor, CELERON® processor or other line of processors, etc. The processor 602 may be implemented using mainframe, distributed processor, multi-core, parallel, grid, or other architectures. Some embodiments may utilize embedded technologies like application-specific integrated circuits (ASICs), digital signal processors (DSPs), Field Programmable Gate Arrays (FPGAs), etc.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).

A “primary ML model employs a first ML model architecture” / “secondary ML model” / “first ML model” is a broad term which is described at a high level. Applicant's Specification, paragraph [002] recites:

[002] This disclosure relates generally to Machine Learning (ML) Operations, and more particularly to a method and system for knowledge transfer between different ML model architectures.

Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).

A “transferring, by the computing system” / “knowledge transfer technique” is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(II), reproduced above, recognizes receiving or transmitting data over a network as a well-understood, routine, and conventional computer function when claimed in a merely generic manner. Further, M.P.E.P. § 2106.05(d)(I)(2) recites in part:

2. A factual determination is required to support a conclusion that an additional element (or combination of additional elements) is well-understood, routine, conventional activity. Berkheimer v. HP, Inc., 881 F.3d 1360, 1368, 125 USPQ2d 1649, 1654 (Fed. Cir. 2018). However, this does not mean that a prior art search is necessary to resolve this inquiry. Instead, examiners should rely on what the courts have recognized, or those in the art would recognize, as elements that are well-understood, routine, conventional activity in the relevant field when making the required determination. For example, in many instances, the specification of the application may indicate that additional elements are well-known or conventional. See, e.g., Intellectual Ventures v. Symantec, 838 F.3d at 1317; 120 USPQ2d at 1359 ("The written description is particularly useful in determining what is well-known or conventional"); Internet Patents Corp. v. Active Network, Inc., 790 F.3d 1343, 1348, 115 USPQ2d 1414, 1418 (Fed. Cir. 2015) (relying on specification’s description of additional elements as "well-known", "common" and "conventional"); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 614, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (Specification described additional elements as "either performing basic computer functions such as sending and receiving data, or performing functions ‘known’ in the art.").

Further, Applicant's Specification, [021]-[022] recites:

[021] The environment 100 may include a computing system 101, a first computing device 102, and a secondary computing device 103. The first computing device 102 may include a primary ML model 102A. The second computing device 103 may include a secondary ML model 103A. The computing system 101, the first computing device 102, and the secondary computing device 103 are configured to communicate with each other via a communication network 104. Examples of the communication network 104 may include, but are not limited to, a wireless fidelity (Wi-Fi) network, a light fidelity (Li-Fi) network, a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a satellite network, the Internet, a fiber optic network, a coaxial cable network, an infrared (IR) network, a radio frequency (RF) network, and a combination thereof.

[022] The communication network 104 may facilitate the computing system 101 to perform one or more operations in order to manage knowledge of the primary ML model 102A. In particular, the computing system 101 may communicate with the first computing device 102, and the second computing device 103 to transfer knowledge of the primary ML model 102A having a first ML model architecture 102B to the secondary ML model 103A having a second ML model architecture 103B based on a knowledge transfer technique (for example, a knowledge distillation technique). It should be noted that the first ML model architecture is different from the second ML model architecture. As will be appreciated, in some embodiments the primary ML model 102A, and the secondary ML model 103A may be part of the computing system 101.

Merely using a conventional computer to receive data is well-known, well-understood, and conventional. Thus, it adds nothing significantly more to the judicial exception. Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).

Therefore, the answer to the inquiry is “NO”: no additional elements provide an inventive concept that is significantly more than the claimed abstract ideas.

Claim 1 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 2

Claim 2 recites:

2.
The method of claim 1, further comprising: exporting the trained secondary ML model from a first environment to a second environment based on a privacy protection technique.

Applicant’s Claim 2 merely teaches the routine use of a computer network. (See Specification paragraphs [021]-[022], reproduced in the Claim 1 analysis above.)
It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, 2106.05(a)(II).) Claim 2 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 3

Claim 3 recites:

3. The method of claim 2, wherein the first environment is a production environment.

Applicant’s Claim 3 merely teaches the routine use of a computer network. (See Specification paragraphs [021]-[022], reproduced in the Claim 1 analysis above.) It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, 2106.05(a)(II).) Claim 3 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 4

Claim 4 recites:

4. The method of claim 2, wherein the second environment is one of a pre-production environment, or a test environment.

Applicant’s Claim 4 merely teaches the routine use of a computer network. (See Specification paragraphs [021]-[022], reproduced in the Claim 1 analysis above.) It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, 2106.05(a)(II).) Claim 4 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 5

Claim 5 recites:

5. The method of claim 2, wherein the privacy protection technique is one of a Fully Homomorphic Encryption (FHE) technique, a Multi-Party Computation (MPC) technique, a Trusted Execution Environments (TEEs) technique, a secure enclave technique, a secure communication channel technique, and an obfuscation technique.

Applicant’s Claim 5 merely teaches a generic, well-understood, routine and conventional “privacy protection technique” (e.g., “a secure communication channel technique”). It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, 2106.05(a)(II).) Claim 5 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 6

Claim 6 recites:

6. The method of claim 1, further comprising removing a pre-assigned label from an associated labelled dataset present in the first environment to generate the unlabelled dataset.

Applicant’s Claim 6 merely teaches the mental step of removing a label. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, 2106.05(a)(II).) Claim 6 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 7

Claim 7 recites:

7. The method of claim 1, wherein the knowledge transfer technique corresponds to a knowledge distillation technique.

Applicant’s Claim 7 merely teaches a well-understood, routine and conventional process.
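As technical context for the distillation workflow these claims describe (claim 1's labelling-function step generating class probabilities for an unlabelled dataset, and the softmax of claim 8), a minimal sketch follows. It is illustrative only: the linear `teacher_logits` stand-in and all names are ours, not the applicant's actual models.

```python
import math

def softmax(logits):
    """Labelling function in the sense of claim 8: logits -> class probabilities."""
    m = max(logits)                       # subtract max for numerical stability
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def teacher_logits(x):
    """Stand-in 'primary ML model': maps a 2-feature vector to 2 class scores."""
    return [x[0] + 0.5 * x[1], 0.5 * x[0] - x[1]]

# 1. Start from an unlabelled dataset (no ground-truth labels attached).
unlabelled = [[1.0, 0.0], [0.0, 1.0], [2.0, 2.0]]

# 2. Generate a set of class probabilities via the labelling function.
soft_labels = [softmax(teacher_logits(x)) for x in unlabelled]

# 3. The (sample, probabilities) pairs form the training set for the
#    secondary model with a different architecture (training loop omitted).
transfer_set = list(zip(unlabelled, soft_labels))

for probs in soft_labels:
    assert abs(sum(probs) - 1.0) < 1e-9   # each row is a valid distribution
```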
Applicant's Specification, paragraph [024] recites:

[024] The knowledge distillation technique has been traditionally used to shrink large ML models into smaller efficient ML models so that they may be run on mobile or other edge devices. However, the knowledge distillation technique discussed herein in one or more embodiments of the present disclosure may be used in Machine Learning Operations (MLOps) to transfer the knowledge of the primary ML model 102A from an existing architecture (e.g., the first ML model architecture) to a new architecture (e.g., the second ML model architecture) of the secondary ML model 103A. The term MLOps refers to the practice of applying DevOps (Development Operations) principles to machine learning workflows, enabling the efficient development, deployment, and maintenance of machine learning models at scale. In the process of knowledge distillation, the ML model is made more resilient to data drift issues, and algorithmic changes are made in the knowledge distillation technique in such a way that it may cater to a more complex and/or bigger and new ML model architecture.

It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, 2106.05(a)(II).) Claim 7 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 8

Claim 8 recites:

8. The method of claim 1, wherein the labelling function corresponds to a soft max function.

Applicant’s Claim 8 merely teaches a mathematical operation used to transform a vector of raw numerical scores, called logits, into a probability distribution. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, 2106.05(a)(II).) Claim 8 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 9

Step 1 inquiry: Does this claim fall within a statutory category? The preamble of the claim recites “9.
A system for managing knowledge of a primary Machine Learning (ML) model, the system comprising:…” Therefore, it is a “system” (or “apparatus”), which is a statutory category of invention. Therefore, the answer to the inquiry is: “YES.” Step 2A (Prong One) inquiry: Are there limitations in Claim 9 that recite abstract ideas? YES. The following limitations in Claim 9 recite abstract ideas that fall within at least one of the groupings of abstract ideas enumerated in the 2019 PEG. Specifically, they are “mental steps” and “mathematical steps”: • generate,…, a set of class probabilities for an unlabelled dataset based on a labelling function • unlabelled dataset • associated with the primary ML model • unlabelled dataset and the associated set of class probabilities • secondary ML model Step 2A (Prong Two) inquiry: Are there additional elements or a combination of elements in the claim that apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that it is more than a drafting effort designed to monopolize the exception? Applicant’s claims contain the following “additional elements”: (1) A “processing circuitry” (2) A “primary ML model employs a first ML model architecture”/ “secondary ML model”/ “first ML model” (3) A “transferring, by the computing system”/ “knowledge transfer technique” (4) A “memory” A “processing circuitry” is a broad term which is described at a high level and includes general purpose computers. M.P.E.P. 
§ 2106.04(d)(I) recites:

The courts have also identified limitations that did not integrate a judicial exception into a practical application:
• Merely reciting the words “apply it” (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea, as discussed in MPEP § 2106.05(f);
• Adding insignificant extra-solution activity to the judicial exception, as discussed in MPEP § 2106.05(g); and
• Generally linking the use of a judicial exception to a particular technological environment or field of use, as discussed in MPEP § 2106.05(h).

This “processing circuitry” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).

A “primary ML model employs a first ML model architecture”/ “secondary ML model”/ “first ML model” is a broad term which is described at a high level. Applicant’s Claim 9 merely recites the claimed “ML model,” which is an “additional element”. The ML model is not used to calculate anything at all. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)

This “primary ML model employs a first ML model architecture”/ “secondary ML model”/ “first ML model” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).

A “transferring, by the computing system”/ “knowledge transfer technique” is a broad term for merely applying a computer, which is described at a high level. M.P.E.P.
§ 2106.05(d)(II) recites: The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity. i. Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc., v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); … M.P.E.P. § 2106.05 (f)(2) recites in part: (2) Whether the claim invokes computers or other machinery merely as a tool to perform an existing process. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. 
Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015). In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field. TLI Communications provides an example of a claim invoking computers and other machinery merely as a tool to perform an existing process. The court stated that the claims describe steps of recording, administration and archiving of digital images, and found them to be directed to the abstract idea of classifying and storing digital images in an organized manner. 823 F.3d at 612, 118 USPQ2d at 1747. The court then turned to the additional elements of performing these functions using a telephone unit and a server and noted that these elements were being used in their ordinary capacity (i.e., the telephone unit is used to make calls and operate as a digital camera including compressing images and transmitting those images, and the server simply receives data, extracts classification information from the received data, and stores the digital images based on the extracted information). 823 F.3d at 612-13, 118 USPQ2d at 1747-48. In other words, the claims invoked the telephone unit and server merely as tools to execute the abstract idea. Thus, the court found that the additional elements did not add significantly more to the abstract idea because they were simply applying the abstract idea on a telephone network without any recitation of details of how to carry out the abstract idea. 
This “transferring, by the computing system”/ “knowledge transfer technique” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).

A “memory” is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(II) recites:

The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity. ***

iv. Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93;

This “memory” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).

The answer to the inquiry is “NO”: no additional elements integrate the claimed abstract idea into a practical application.

Step 2B inquiry: Does the claim provide an inventive concept, i.e., does the claim recite additional element(s) or a combination of elements that amount to significantly more than the judicial exception in the claim?

Applicant’s claims contain the following “additional elements”:
(1) A “processing circuitry”
(2) A “primary ML model employs a first ML model architecture”/ “secondary ML model”/ “first ML model”
(3) A “transferring, by the computing system”/ “knowledge transfer technique”
(4) A “memory”

A “processing circuitry” is a broad term which is described at a high level and includes general purpose computers. M.P.E.P. § 2106.05(I)(A)(i-ii) recites:

Limitations that the courts have found not to be enough to qualify as “significantly more” when recited in a claim with a judicial exception include: i.
Adding the words “apply it” (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, e.g., a limitation indicating that a particular function such as creating and maintaining electronic records is performed by a computer, as discussed in Alice Corp., 573 U.S. at 225-26, 110 USPQ2d at 1984 (see MPEP § 2106.05(f));

ii. Simply appending well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception, e.g., a claim to an abstract idea requiring no more than a generic computer to perform generic computer functions that are well-understood, routine and conventional activities previously known to the industry, as discussed in Alice Corp., 573 U.S. at 225, 110 USPQ2d at 1984 (see MPEP § 2106.05(d));

Further, M.P.E.P. § 2106.05(f) recites:

2106.05(f) Mere Instructions To Apply An Exception [R-10.2019]

Another consideration when determining whether a claim integrates a judicial exception into a practical application in Step 2A Prong Two or recites significantly more than a judicial exception in Step 2B is whether the additional elements amount to more than a recitation of the words “apply it” (or an equivalent) or are more than mere instructions to implement an abstract idea or other exception on a computer. As explained by the Supreme Court, in order to make a claim directed to a judicial exception patent-eligible, the additional element or combination of elements must do “‘more than simply stat[e] the [judicial exception] while adding the words ‘apply it’”. Alice Corp. v. CLS Bank, 573 U.S. 208, 221, 110 USPQ2d 1976, 1982-83 (2014) (quoting Mayo Collaborative Servs. v. Prometheus Labs., Inc., 566 U.S. 66, 72, 101 USPQ2d 1961, 1965). Thus, for example, claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible. Alice Corp., 573 U.S.
at 223, 110 USPQ2d at 1983. See also 573 U.S. at 224, 110 USPQ2d at 1984 (warning against a § 101 analysis that turns on “the draftsman’s art”). Further, M.P.E.P. § 2106.05(f)(2) recites: (2) Whether the claim invokes computers or other machinery merely as a tool to perform an existing process. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015). In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field. 
Applicant's Specification, paragraph [073] recites:

[073] The disclosed methods and systems may be implemented on a conventional or a general-purpose computer system, such as a personal computer (PC) or server computer. FIG. 6 is a block diagram that illustrates a system architecture 600 of a computer system 601 for managing knowledge of a primary ML model, in accordance with an exemplary embodiment of the present disclosure. Variations of computer system 601 may be used for implementing server 101 for determination of personality traits of agents in a contact center. Computer system 601 may include a central processing unit (“CPU” or “processor”) 602. Processor 602 may include at least one data processor for executing program components for executing user-generated or system-generated requests. A user may include a person, a person using a device such as those included in this disclosure, or such a device itself. The processor may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc. The processor may include a microprocessor, such as AMD® ATHLON®, DURON® or OPTERON®, ARM’s application, embedded or secure processors, IBM® POWERPC®, INTEL® CORE® processor, ITANIUM® processor, XEON® processor, CELERON® processor or other line of processors, etc. The processor 602 may be implemented using mainframe, distributed processor, multi-core, parallel, grid, or other architectures. Some embodiments may utilize embedded technologies like application-specific integrated circuits (ASICs), digital signal processors (DSPs), Field Programmable Gate Arrays (FPGAs), etc.

Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
A “primary ML model employs a first ML model architecture”/ “secondary ML model”/ “first ML model” is a broad term which is described at a high level. Applicant's Specification, paragraph [002] recites: [002] This disclosure relates generally to Machine Learning (ML) Operations, and more particularly to a method and system for knowledge transfer between different ML model architectures. Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)). A “transferring, by the computing system”/ “knowledge transfer technique” is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(II) recites: The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity. i. Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc., v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); … Further, M.P.E.P. § 2106.05(d)(I)(2) recites in part: 2. A factual determination is required to support a conclusion that an additional element (or combination of additional elements) is well-understood, routine, conventional activity. Berkheimer v. HP, Inc., 881 F.3d 1360, 1368, 125 USPQ2d 1649, 1654 (Fed. Cir. 2018). 
However, this does not mean that a prior art search is necessary to resolve this inquiry. Instead, examiners should rely on what the courts have recognized, or those in the art would recognize, as elements that are well-understood, routine, conventional activity in the relevant field when making the required determination. For example, in many instances, the specification of the application may indicate that additional elements are well-known or conventional. See, e.g., Intellectual Ventures v. Symantec, 838 F.3d at 1317; 120 USPQ2d at 1359 ("The written description is particularly useful in determining what is well-known or conventional"); Internet Patents Corp. v. Active Network, Inc., 790 F.3d 1343, 1348, 115 USPQ2d 1414, 1418 (Fed. Cir. 2015) (relying on specification’s description of additional elements as "well-known", "common" and "conventional"); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 614, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (Specification described additional elements as "either performing basic computer functions such as sending and receiving data, or performing functions ‘known’ in the art."). Applicant's Specification, [021] recites: [021] The environment 100 may include a computing system 101, a first computing device 102, and a secondary computing device 103. The first computing device 102 may include a primary ML model 102A. The second computing device 103 may include a secondary ML model 103A. The computing system 101, the first computing device 102, and the secondary computing device 103 are configured to communicate with each other via a communication network 104. 
Examples of the communication network 104 may include, but are not limited to, a wireless fidelity (Wi-Fi) network, a light fidelity (Li-Fi) network, a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a satellite network, the Internet, a fiber optic network, a coaxial cable network, an infrared (IR) network, a radio frequency (RF) network, and a combination thereof. [022] The communication network 104 may facilitate the computing system 101 to perform one or more operations in order to manage knowledge of the primary ML model 102A. In particular, the computing system 101 may communicate with the first computing device 102, and the second computing device 103 to transfer knowledge of the primary ML model 102A having a first ML model architecture 102B to the secondary ML model 103A having a second ML model architecture 103B based on a knowledge transfer technique (for example, a knowledge distillation technique). It should be noted that the first ML model architecture is different from the second ML model architecture. As will be appreciated, in some embodiments the primary ML model 102A, and the secondary ML model 103A may be part of the computing system 101. Merely using the conventional computer to receive data is well known, understood, and conventional. Thus, it adds nothing significantly more to the judicial exception. Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)). A “memory” is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(II) recites: The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity. *** iv. Storing and retrieving information in memory, Versata Dev. Group, Inc. v. 
SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93;

Further, Applicant’s Specification, paragraph [032] recites:

[032] FIG. 2 is a diagram that illustrates a process for managing knowledge of a primary ML model, in accordance with an exemplary embodiment of the present disclosure. FIG. 2 is explained in conjunction with elements from FIG. 1. The computing system 101 may include a processing circuitry 201 and a memory 202 communicatively coupled to the processing circuitry 201 via a communication bus 203. The memory 202 may be a non-volatile memory or a volatile memory. Examples of non-volatile memory may include, but are not limited to, a flash memory, a Read Only Memory (ROM), a Programmable ROM (PROM), Erasable PROM (EPROM), and Electrically EPROM (EEPROM) memory. Examples of volatile memory may include, but are not limited to, Dynamic Random Access Memory (DRAM), and Static Random-Access Memory (SRAM).

Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).

Therefore, the answer to the inquiry is “NO”: no additional elements provide an inventive concept that is significantly more than the claimed abstract ideas.

Claim 9 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 10

Claim 10 recites: 10. The system of claim 9, wherein the processor instructions, on execution, further cause the processing circuitry to export the trained secondary ML model from a first environment to a second environment based on a privacy protection technique.

Applicant’s Claim 10 merely teaches the routine use of a computer network. (See Applicant’s Specification, paragraphs [021]-[022], reproduced above.) It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)

Claim 10 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 11

Claim 11 recites: 11. The system of claim 10, wherein the first environment is a production environment.

Applicant’s Claim 11 merely teaches the routine use of a computer network.
(See Applicant’s Specification, paragraphs [021]-[022], reproduced above.) It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)

Claim 11 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 12

Claim 12 recites: 12.
The system of claim 10, wherein the second environment is one of a pre-production environment, or a test environment.

Applicant’s Claim 12 merely teaches the routine use of a computer network. (See Applicant’s Specification, paragraphs [021]-[022], reproduced above.) It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea.
(See, M.P.E.P. § 2106.05(a)(II).)

Claim 12 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 13

Claim 13 recites: 13. The system of claim 10, wherein the privacy protection technique is one of a Fully Homomorphic Encryption (FHE) technique, a Multi-Party Computation (MPC) technique, a Trusted Execution Environments (TEEs) technique, a secure enclave technique, a secure communication channel technique, and an obfuscation technique.

Applicant’s Claim 13 merely teaches a generic, well-understood, routine and conventional “privacy protection technique” (e.g., “a secure communication channel technique”). It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)

Claim 13 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 14

Claim 14 recites: 14. The system of claim 1, wherein the processor instructions, on execution, further cause the processing circuitry to remove a pre-assigned label from an associated labelled dataset present in the first environment to generate the unlabelled dataset.

Applicant’s Claim 14 merely teaches the mental step of removing a label. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)

Claim 14 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 15

Claim 15 recites: 15. The system of claim 9, wherein the knowledge transfer technique corresponds to a knowledge distillation technique.

Applicant’s Claim 15 merely teaches a well-understood, routine and conventional process. Applicant’s Specification, paragraph [024] recites:

[024] The knowledge distillation technique has been traditionally used to shrink large ML models into smaller efficient ML models so that it may be run on mobile or other edge devices.
However, the knowledge distillation technique discussed herein in one or more embodiments of the present disclosure may be used in Machine Learning Operations (MLOps) to transfer the knowledge of the primary ML model 102A from an existing architecture (e.g., the first ML model architecture) to a new architecture (e.g., the second ML model architecture) of the secondary ML model 103A. The term MLOps is the practice of applying DevOps (Development Operations) principles to machine learning workflows, enabling the efficient development, deployment, and maintenance of machine learning models at scale. In the process of knowledge distillation, the ML model is made more resilient to data drift issues, and algorithmic changes are made in the knowledge distillation technique in such a way that it may cater to a more complex and/or bigger and new ML model architecture.

Claim 15 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 16

Claim 16 recites: 16. The system of claim 9, wherein the labelling function corresponds to a soft max function.

Applicant’s Claim 16 merely teaches a mathematical operation used to transform a vector of raw numerical scores, called logits, into a probability distribution. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)

Claim 16 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claim 17

Step 1 inquiry: Does this claim fall within a statutory category?

The preamble of the claim recites “17. A non-transitory computer-readable medium storing computer-executable instructions for managing knowledge of a primary Machine Learning (ML) model, the computer-executable instructions configured for…” Therefore, it is a “non-transitory computer-readable medium” (or “article of manufacture”), which is a statutory category of invention.
Therefore, the answer to the inquiry is: “YES.”

Step 2A (Prong One) inquiry: Are there limitations in Claim 17 that recite abstract ideas?

YES. The following limitations in Claim 17 recite abstract ideas that fall within at least one of the groupings of abstract ideas enumerated in the 2019 PEG. Specifically, they are “mental steps” and “mathematical steps”:

• generating,…, a set of class probabilities for an unlabelled dataset based on a labelling function
• unlabelled dataset
• associated with the primary ML model
• unlabelled dataset and the associated set of class probabilities
• secondary ML model

Step 2A (Prong Two) inquiry: Are there additional elements or a combination of elements in the claim that apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that it is more than a drafting effort designed to monopolize the exception?

Applicant’s claims contain the following “additional elements”:
(1) A “primary ML model employs a first ML model architecture”/ “secondary ML model”/ “first ML model”
(2) A “transferring, by the computing system”/ “knowledge transfer technique”

A “primary ML model employs a first ML model architecture”/ “secondary ML model”/ “first ML model” is a broad term which is described at a high level. Applicant’s Claim 17 merely recites the claimed “ML model,” which is an “additional element”. The ML model is not used to calculate anything at all. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)

This “primary ML model employs a first ML model architecture”/ “secondary ML model”/ “first ML model” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
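[Editor's note: for technical context on the claim elements addressed above — the “labelling function” of Claims 8 and 16 (a softmax over logits) and the distillation-style “knowledge transfer technique” of Claim 15 — the following is a minimal sketch of softmax soft-labelling and a distillation loss. It is illustrative only, not the Applicant's implementation; all function names and values are hypothetical.]

```python
import math

def softmax(logits, temperature=1.0):
    # Convert raw scores ("logits") into a probability distribution.
    # A temperature > 1 softens the distribution, as is common in
    # knowledge distillation.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_probs, teacher_probs, eps=1e-12):
    # Cross-entropy of the student's output against the teacher's soft
    # labels -- the quantity a distillation training step would minimize.
    return -sum(t * math.log(s + eps)
                for t, s in zip(teacher_probs, student_probs))

# Hypothetical "teacher" (primary model) logits for one unlabelled sample.
teacher_logits = [2.0, 1.0, 0.1]
soft_labels = softmax(teacher_logits, temperature=2.0)  # the claimed "set of class probabilities"
assert abs(sum(soft_labels) - 1.0) < 1e-9  # a valid probability distribution
```

A student (secondary model) whose predicted distribution matches the teacher's soft labels incurs a lower `distillation_loss` than one that diverges, which is the signal used to transfer knowledge between the two architectures.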
A “transferring, by the computing system” / “knowledge transfer technique” is a broad term for merely applying a computer, which is described at a high level. M.P.E.P. § 2106.05(d)(II) recites:

The courts have recognized the following computer functions as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity.

i. Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); …

M.P.E.P. § 2106.05(f)(2) recites in part:

(2) Whether the claim invokes computers or other machinery merely as a tool to perform an existing process. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit).
Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015).

In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field.

TLI Communications provides an example of a claim invoking computers and other machinery merely as a tool to perform an existing process. The court stated that the claims describe steps of recording, administration and archiving of digital images, and found them to be directed to the abstract idea of classifying and storing digital images in an organized manner. 823 F.3d at 612, 118 USPQ2d at 1747. The court then turned to the additional elements of performing these functions using a telephone unit and a server and noted that these elements were being used in their ordinary capacity (i.e., the telephone unit is used to make calls and operate as a digital camera, including compressing images and transmitting those images, and the server simply receives data, extracts classification information from the received data, and stores the digital images based on the extracted information). 823 F.3d at 612-13, 118 USPQ2d at 1747-48. In other words, the claims invoked the telephone unit and server merely as tools to execute the abstract idea.
Thus, the court found that the additional elements did not add significantly more to the abstract idea because they were simply applying the abstract idea on a telephone network without any recitation of details of how to carry out the abstract idea.

This “transferring, by the computing system” / “knowledge transfer technique” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See M.P.E.P. § 2106.05(I)(A).)

The answer to the inquiry is “NO”; no additional elements integrate the claimed abstract idea into a practical application.

Step 2B inquiry: Does the claim provide an inventive concept, i.e., does the claim recite additional element(s) or a combination of elements that amount to significantly more than the judicial exception in the claim?

Applicant’s claims contain the following “additional elements”:

(1) A “primary ML model employs a first ML model architecture” / “secondary ML model” / “first ML model”
(2) A “transferring, by the computing system” / “knowledge transfer technique”

A “primary ML model employs a first ML model architecture” / “secondary ML model” / “first ML model” is a broad term which is described at a high level. Applicant’s Specification, paragraph [002], recites:

[002] This disclosure relates generally to Machine Learning (ML) Operations, and more particularly to a method and system for knowledge transfer between different ML model architectures.

Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See M.P.E.P. § 2106.05(II).)

A “transferring, by the computing system” / “knowledge transfer technique” is a broad term which is described at a high level. M.P.E.P.
§ 2106.05(d)(II) recites:

The courts have recognized the following computer functions as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity.

i. Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); …

Further, M.P.E.P. § 2106.05(d)(I)(2) recites in part:

2. A factual determination is required to support a conclusion that an additional element (or combination of additional elements) is well-understood, routine, conventional activity. Berkheimer v. HP, Inc., 881 F.3d 1360, 1368, 125 USPQ2d 1649, 1654 (Fed. Cir. 2018). However, this does not mean that a prior art search is necessary to resolve this inquiry. Instead, examiners should rely on what the courts have recognized, or those in the art would recognize, as elements that are well-understood, routine, conventional activity in the relevant field when making the required determination. For example, in many instances, the specification of the application may indicate that additional elements are well-known or conventional. See, e.g., Intellectual Ventures v. Symantec, 838 F.3d at 1317, 120 USPQ2d at 1359 (“The written description is particularly useful in determining what is well-known or conventional”); Internet Patents Corp. v. Active Network, Inc., 790 F.3d 1343, 1348, 115 USPQ2d 1414, 1418 (Fed. Cir.
2015) (relying on specification’s description of additional elements as “well-known”, “common” and “conventional”); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 614, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (specification described additional elements as “either performing basic computer functions such as sending and receiving data, or performing functions ‘known’ in the art.”).

Applicant’s Specification, paragraph [021], recites:

[021] The environment 100 may include a computing system 101, a first computing device 102, and a second computing device 103. The first computing device 102 may include a primary ML model 102A. The second computing device 103 may include a secondary ML model 103A. The computing system 101, the first computing device 102, and the second computing device 103 are configured to communicate with each other via a communication network 104. Examples of the communication network 104 may include, but are not limited to, a wireless fidelity (Wi-Fi) network, a light fidelity (Li-Fi) network, a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a satellite network, the Internet, a fiber optic network, a coaxial cable network, an infrared (IR) network, a radio frequency (RF) network, and a combination thereof.

[022] The communication network 104 may facilitate the computing system 101 to perform one or more operations in order to manage knowledge of the primary ML model 102A. In particular, the computing system 101 may communicate with the first computing device 102 and the second computing device 103 to transfer knowledge of the primary ML model 102A having a first ML model architecture 102B to the secondary ML model 103A having a second ML model architecture 103B based on a knowledge transfer technique (for example, a knowledge distillation technique). It should be noted that the first ML model architecture is different from the second ML model architecture.
As will be appreciated, in some embodiments the primary ML model 102A and the secondary ML model 103A may be part of the computing system 101.

Merely using a conventional computer to receive data is well-understood, routine, and conventional. Thus, it adds nothing significantly more to the judicial exception. Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See M.P.E.P. § 2106.05(II).)

Therefore, the answer to the inquiry is “NO”; no additional elements provide an inventive concept that is significantly more than the claimed abstract ideas.

Claim 17 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.

Claims 1-17 are not rejected over the prior art because, when reading the claims in light of the specification per MPEP § 2111.01, none of the references of record, whether taken alone or in combination, discloses or suggests the combination of limitations specified in independent Claim 1. Specifically: Claim 1’s “...transferring, by the computing system, the unlabelled dataset and the associated set of class probabilities for training a secondary ML model based on a knowledge transfer technique...”

Further, none of the references of record, whether taken alone or in combination, discloses or suggests the combination of limitations specified in independent Claim 9. Specifically: Claim 9’s “...transfer, by the computing system, the unlabelled dataset and the associated set of class probabilities for training a secondary ML model based on a knowledge transfer technique...”

Further, none of the references of record, whether taken alone or in combination, discloses or suggests the combination of limitations specified in independent Claim 17.
Specifically: Claim 17’s “...transferring, by the computing system, the unlabelled dataset and the associated set of class probabilities for training a secondary ML model based on a knowledge transfer technique...”

Conclusion

Any inquiries concerning this communication or earlier communications from the examiner should be directed to Wilbert L. Starks, Jr., who may be reached Monday through Friday between 8:00 a.m. and 5:00 p.m. EST by telephone at (571) 272-3691 or by email at Wilbert.Starks@uspto.gov. If you need to send an official facsimile transmission, please send it to (571) 273-8300. If attempts to reach the examiner are unsuccessful, the Examiner’s Supervisor (SPE), Kakali Chaki, may be reached at (571) 272-3719.

Hand-delivered responses should be delivered to the Receptionist at the Customer Service Window, Randolph Building, 401 Dulany Street, Alexandria, VA 22313, located on the first floor of the south side of the Randolph Building.

Finally, information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR; status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have any questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) toll-free at 1-866-217-9197.

/WILBERT L STARKS/
Primary Examiner, Art Unit 2122
WLS
06 JAN 2026
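For context on the technique the claims recite: Claim 16’s “labelling function” is a softmax, which converts the primary (teacher) model’s raw logits into the set of class probabilities that, per the independent claims, is transferred with the unlabelled dataset to train the secondary (student) model. The following is a minimal illustrative sketch of that labelling step only; the function name, the logit values, and the temperature parameter are assumptions for illustration (temperature scaling is conventional in knowledge distillation but is not recited in the claims).

```python
# Illustrative sketch (not from the application): a softmax "labelling
# function" maps a teacher model's raw logits to class probabilities.
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits into a probability distribution.

    A temperature above 1 softens the distribution, a common choice in
    knowledge distillation; the value used below is an assumption.
    """
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Teacher logits for one unlabelled sample (made-up values).
teacher_logits = [2.0, 1.0, 0.1]
soft_labels = softmax(teacher_logits, temperature=2.0)
# soft_labels is a valid distribution (non-negative, sums to 1) that
# preserves the teacher's class ranking; paired with the unlabelled
# input, it would be transferred to train the secondary (student) model.
```

The soft labels carry more information than a hard argmax label (relative class similarities), which is the usual rationale for transferring probabilities rather than labels when distilling into a different architecture.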

Prosecution Timeline

Mar 31, 2023
Application Filed
Jan 07, 2026
Non-Final Rejection — §101 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12561587
DATA PROCESSING METHOD, ELECTRONIC DEVICE, AND STORAGE MEDIUM
2y 5m to grant Granted Feb 24, 2026
Patent 12555007
METHOD AND SYSTEM FOR INFERRING DEVICE FINGERPRINT
2y 5m to grant Granted Feb 17, 2026
Patent 12541694
GENERATING A DOMAIN-SPECIFIC KNOWLEDGE GRAPH FROM UNSTRUCTURED COMPUTER TEXT
2y 5m to grant Granted Feb 03, 2026
Patent 12525251
METHOD, SYSTEM AND PROGRAM PRODUCT FOR PERCEIVING AND COMPUTING EMOTIONS
2y 5m to grant Granted Jan 13, 2026
Patent 12518149
IMPLICIT VECTOR CONCATENATION WITHIN 2D MESH ROUTING
2y 5m to grant Granted Jan 06, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
76%
Grant Probability
80%
With Interview (+4.4%)
3y 6m
Median Time to Grant
Low
PTA Risk
Based on 653 resolved cases by this examiner. Grant probability derived from career allow rate.
