Prosecution Insights
Last updated: April 19, 2026
Application No. 18/216,982

FEDERATED LEARNING WITH MODEL DIVERSITY

Office Action: Non-Final, rejecting under §101, §112, and Double Patenting
Filed: Jun 30, 2023
Examiner: SIPPEL, MOLLY CLARKE
Art Unit: 2122
Tech Center: 2100 — Computer Architecture & Software
Assignee: Robert Bosch GmbH
OA Round: 1 (Non-Final)

Grant Probability: 50% (Moderate)
Expected OA Rounds: 1-2
Expected Time to Grant: 3y 7m
Grant Probability with Interview: 99%
Examiner Intelligence

Career Allow Rate: 50% (7 granted / 14 resolved; -5.0% vs TC avg)
Interview Lift: strong, +58.3% (allowance rate across resolved cases with vs. without interview)
Typical Timeline: 3y 7m avg prosecution; 25 applications currently pending
Career History: 39 total applications across all art units

Statute-Specific Performance

§101: 33.8% (-6.2% vs TC avg)
§103: 32.0% (-8.0% vs TC avg)
§102: 9.8% (-30.2% vs TC avg)
§112: 23.6% (-16.4% vs TC avg)

Tech Center averages are estimates. Based on career data from 14 resolved cases.
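As a quick consistency check, the headline figures above follow from the stated counts; a minimal sketch (the "vs TC avg" delta is taken at face value as a percentage-point difference):

```python
granted, resolved = 7, 14
career_allow_rate = granted / resolved          # 0.50, matching the 50% shown
# "-5.0% vs TC avg" implies an estimated Tech Center average of about 55%
implied_tc_avg = career_allow_rate + 0.05
```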

Office Action

§101 §112 §DP
DETAILED ACTION

This action is responsive to the application filed on 06/30/2023. Claims 1-20 are pending in the case. Claims 1, 11, and 16 are independent claims.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 06/30/2023 is being considered by the examiner.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 5-7 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the enablement requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to enable one skilled in the art to which it pertains, or with which it is most nearly connected, to make and/or use the invention.
Regarding claim 5, the claim recites "wherein the loss is determined by:

$L_i = \mathrm{loss}\Big(\sum_{j=1}^{l_i} f_{W_j}(D_i),\ y_i\Big) + \lambda_i\, R\big(g_{W_1^i}(D_i),\ g_{W_2^i}(D_i),\ \ldots,\ g_{W_{l_i}^i}(D_i)\big)$

wherein $(D_i, y_i)$ represents the locally-stored data at a particular one of the clients $i$, wherein $\lambda$ is an adjustable hyperparameter corresponding to the regularization term $R$, and wherein $g$ is the one or more latent features of the local machine models $W_1^i, W_2^i, \ldots, W_{l_i}^i$". This includes any reasonable interpretation of the loss equation, including interpretations in which the second term of the equation is included in the summation and interpretations in which the second term of the equation is excluded from the summation.

Regarding the breadth of the claims, the claims are specific to a method that trains neural networks with federated learning by sending at least portions of the server-maintained machine learning models from the server to the clients, and training the models at the clients using a loss function, which is specified in claim 4 to include a regularization term and one or more latent features of the local machine learning models, and further specified in claim 5 to be determined by the equation above. Regarding the nature of the invention, the invention is directed to the method described above. Regarding the state of the art, the prior art clearly shows that using a loss function for training local models on client devices in a federated learning system is well known.
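The two competing readings of the claimed loss can be made concrete numerically; a minimal sketch with stand-in "loss" and "R" functions (the mean-squared-error and pairwise-product choices, names, and shapes are hypothetical, since the specification does not define them):

```python
import numpy as np

def loss_term(pred, y):
    # Stand-in for the claim's unspecified loss(., .): mean squared error.
    return float(np.mean((pred - y) ** 2))

def reg_term(latents, lam):
    # Stand-in for lambda_i * R(g_1, ..., g_l): penalize pairwise alignment
    # of the latent features of the local models.
    total = sum(float(np.sum(a * b))
                for i, a in enumerate(latents)
                for j, b in enumerate(latents) if i != j)
    return lam * total

preds = [np.array([1.0]), np.array([2.0])]    # f_{W_j}(D_i) for each model j
latents = [np.array([1.0]), np.array([0.5])]  # g_{W_j^i}(D_i) for each model j
y, lam = np.array([2.0]), 0.1

# Reading 1: the regularization term sits OUTSIDE the summation.
L_excluded = loss_term(sum(preds), y) + reg_term(latents, lam)

# Reading 2: the summation's scope extends over BOTH terms.
L_included = sum(loss_term(p, y) + reg_term(latents, lam) for p in preds)
```

On the same toy inputs the two readings give different values, which is the practical substance of the ambiguity the rejection identifies.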
However, the prior art fails to provide any evidence that the person of ordinary skill in the art would have been able to develop a method that performs training of neural networks with federated learning with the provided loss function, under either interpretation (i.e., with the second term of the equation included in or excluded from the summation).

Regarding the level of one of ordinary skill in the art, it is fairly high, and the level of predictability in the art is fairly low. Regarding the amount of direction provided by the inventor, applicant's specification provides little or no guidance as to how the loss is to be calculated using the loss function under either interpretation. The current claims cover both a loss function in which the second term of the equation is included in the summation and one in which it is excluded, which is not enabled by the applicant's specification. No working examples are provided in the specification for the invention of claim 5. With no guidance from the specification or disclosures in the prior art, it would require an undue amount of experimentation to make and use the invention of claim 5.

Claims 6-7 are rejected as being dependent upon a rejected base claim without curing any of the deficiencies.

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C.
112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 5-7 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Regarding claim 5, the claim recites "wherein $(D_i, y_i)$ represents…" on line 3. This limitation is unclear because it appears as though applicant is attempting to refer to a previously recited claim element; however, no such prior claim element exists. The claim also recites "$(D_i), y_i)$" on line 2. For examination purposes, this limitation has been interpreted as "wherein $(D_i, y_i)$ represents…".

Claim 6 is rejected as being dependent upon a rejected base claim without curing any of the deficiencies.

Regarding claim 7, the claim recites "the concatenated weights" in line 5. There is insufficient antecedent basis for this limitation in the claim. It is unclear if applicant is attempting to refer to a previously recited claim element or is attempting to recite a new claim element. For examination purposes, the limitation is considered to be "concatenated weights", reciting a new claim element.

Double Patenting

A rejection based on double patenting of the "same invention" type finds its support in the language of 35 U.S.C. 101, which states that "whoever invents or discovers any new and useful process... may obtain a patent therefor..." (emphasis added). Thus, the term "same invention," in this context, means an invention drawn to identical subject matter. See Miller v. Eagle Mfg. Co., 151 U.S. 186 (1894); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Ockert, 245 F.2d 467, 114 USPQ 330 (CCPA 1957).
A statutory type (35 U.S.C. 101) double patenting rejection can be overcome by canceling or amending the claims that are directed to the same invention so they are no longer coextensive in scope. The filing of a terminal disclaimer cannot overcome a double patenting rejection based upon 35 U.S.C. 101.

Claim 8 is provisionally rejected under 35 U.S.C. 101 as claiming the same invention as that of claim 1 of copending Application No. 18/217,006 (reference application).

Instant Application, claim 8 (incorporating claim 1, from which it depends):

A method of training neural networks with federated learning, the method comprising: sending at least portions of a plurality of server-maintained machine learning models from a server to a plurality of clients, yielding a plurality of local machine learning models; at each client, training the plurality of local machine learning models with locally-stored data that is stored locally at that respective client, wherein the training at each client includes determining a respective loss for each of the plurality of local machine learning models and updating respective weights for each of the plurality of local machine learning models; transferring the respective updated weights from each client to the server without transferring the locally-stored data of the clients; at the server, training the plurality of server-maintained machine learning models with the updated weights sent from each of the clients; determining that a first client of the plurality of clients is disconnected or otherwise unable to receive the at least portions of the plurality of server-maintained machine learning models from the server; connecting the first client to a neighboring client that is able to communicate with the server; and sending the portions of the plurality of server-maintained machine learning models from the neighboring client to the first client.

Copending Application No. 18/217,006, claim 1: recites the same limitations verbatim in a single claim.

This is a provisional statutory double patenting rejection since the claims directed to the same invention have not in fact been patented.

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees.
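The method recited in the claims above follows the usual federated learning round; a minimal sketch, assuming linear models, a single local gradient step, and plain averaging at the server (none of these choices is fixed by the claims):

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(w, X, y, lr=0.1):
    # One gradient step of squared loss on this client's locally-stored data.
    grad = 2.0 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

# Server maintains a plurality of models; each client holds private (X, y).
server_models = [rng.normal(size=3) for _ in range(2)]
client_data = [(rng.normal(size=(8, 3)), rng.normal(size=8)) for _ in range(4)]

# 1) Send the models (here: the full weights) from the server to every client,
#    yielding local models; 2) each client trains every local model.
client_weights = [[local_update(w.copy(), X, y) for w in server_models]
                  for (X, y) in client_data]

# 3) Transfer only the updated weights back -- raw data never leaves a client.
# 4) The server updates its models from the returned weights (plain averaging).
server_models = [np.mean([cw[m] for cw in client_weights], axis=0)
                 for m in range(len(server_models))]
```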
A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c).
A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 1 and 9-11 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 3-4, and 8 of copending Application No. 18/217,006 (reference application). Although the claims at issue are not identical, they are not patentably distinct from each other because the instant claims are anticipated by the corresponding copending claims as follows:

Instant Application, claim 1:

A method of training neural networks with federated learning, the method comprising: sending at least portions of a plurality of server-maintained machine learning models from a server to a plurality of clients, yielding a plurality of local machine learning models; at each client, training the plurality of local machine learning models with locally-stored data that is stored locally at that respective client, wherein the training at each client includes determining a respective loss for each of the plurality of local machine learning models and updating respective weights for each of the plurality of local machine learning models; transferring the respective updated weights from each client to the server without transferring the locally-stored data of the clients; and at the server, training the plurality of server-maintained machine learning models with the updated weights sent from each of the clients.

Copending Application No. 18/217,006, claim 1: recites each of the above limitations verbatim, and additionally recites: determining that a first client of the plurality of clients is disconnected or otherwise unable to receive the at least portions of the plurality of server-maintained machine learning models from the server; connecting the first client to a neighboring client that is able to communicate with the server; and sending the portions of the plurality of server-maintained machine learning models from the neighboring client to the first client.

All limitations of claim 1 in the instant application are anticipated by claim 1 of the copending application 18/217,006.

Instant Application, claim 9: wherein the connecting includes connecting the first client to a plurality of neighboring clients, the method further comprising: performing an interpolation of the portions of the plurality of server-maintained machine learning models received from the plurality of neighboring clients.

Copending Application No. 18/217,006, claim 2: wherein the connecting includes connecting the first client to a plurality of neighboring clients that are able to communicate with the server, and wherein the sending of the portions includes sending the portions of the plurality of server-maintained machine learning models from the plurality of neighboring clients to the first client. Claim 3: further comprising: at the first client, performing an interpolation of the portions of the plurality of server-maintained machine learning models received from the plurality of neighboring clients.

All limitations of claim 9 in the instant application are anticipated by claim 3 of the copending application 18/217,006.

Instant Application, claim 10, and Copending Application No. 18/217,006, claim 4, both recite: wherein the interpolation is

$W^+ = W + \sum_{i \in C_b} A_i \cdot (W_i - W)$,

wherein $W^+$ is an interpolated model for models $W$ received by the plurality of neighboring clients $C_b$, and wherein $A_i$ is a linear combination weight for model $W_i$.

All limitations of claim 10 in the instant application are anticipated by claim 4 of the copending application 18/217,006.

Instant Application, claim 11:

A system of training neural networks with federated learning, the system comprising: memory storing instructions; and a plurality of processors that, when executing the instructions stored in the memory, collectively perform: sending at least portions of a plurality of server-maintained machine learning models from a server to a plurality of clients, yielding a plurality of local machine learning models; at each client, training the plurality of local machine learning models with locally-stored data that is stored locally at that respective client, wherein the training at each client includes determining a respective loss for each of the plurality of local machine learning models and updating respective weights for each of the plurality of local machine learning models; transferring the respective updated weights from each client to the server without transferring the locally-stored data of the clients; and at the server, training the plurality of server-maintained machine learning models with the updated weights sent from each of the clients.

Copending Application No. 18/217,006, claim 8: recites each of the above limitations verbatim, and additionally recites: determining that a first client of the plurality of clients is disconnected or otherwise unable to receive the at least portions of the plurality of server-maintained machine learning models from the server; connecting the first client to a neighboring client that is able to communicate with the server; and sending the portions of the plurality of server-maintained machine learning models from the neighboring client to the first client.

All limitations of claim 11 in the instant application are anticipated by claim 8 of the copending application 18/217,006.

This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
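The interpolation recited in instant claims 9-10 (and copending claims 2-4 in the charts above) is a straightforward linear-combination update of the received models; a minimal sketch, with hypothetical uniform weights A_i:

```python
import numpy as np

def interpolate(W, neighbor_models, A):
    # W+ = W + sum_{i in C_b} A_i * (W_i - W), per the claimed formula.
    W_plus = W.copy()
    for A_i, W_i in zip(A, neighbor_models):
        W_plus += A_i * (W_i - W)
    return W_plus

W = np.zeros(3)                                   # model at the first client
neighbors = [np.array([1.0, 1.0, 1.0]),           # models from neighboring
             np.array([3.0, 3.0, 3.0])]           # clients in C_b
A = [0.5, 0.5]                                    # hypothetical uniform weights
W_plus = interpolate(W, neighbors, A)             # -> [2.0, 2.0, 2.0]
```

With uniform weights summing to one, the update lands on the mean of the neighbors' models, which matches the role the claims give it: reconstructing the server's models at a disconnected client.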
Regarding claim 1:

Step 1, Statutory Category: Claim 1 is directed to a method, which falls under one of the four statutory categories.

Step 2A Prong 1, Judicial Exception: Claim 1 recites, in part, "determining a respective loss for each of the plurality of local machine learning models". This limitation, under the broadest reasonable interpretation, covers the recitation of mathematical calculations: "a claim that recites a mathematical calculation, when the claim is given its broadest reasonable interpretation in light of the specification, will be considered as falling within the 'mathematical concepts' grouping. A mathematical calculation is a mathematical operation (such as multiplication) or an act of calculating using mathematical methods to determine a variable or number". See MPEP § 2106.04(a)(2)(I)(C).

Step 2A Prong 2, Integration into a Practical Application: This judicial exception is not integrated into a practical application. In particular, the claim recites: "training neural networks with federated learning". This limitation is an additional element that generally links the use of the judicial exception to a particular technological environment or field of use. See MPEP § 2106.05(h).

Further, the claim recites: "sending at least portions of a plurality of server-maintained machine learning models from a server to a plurality of clients, yielding a plurality of local machine learning models". This limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g).

Further, the claim recites: "at each client, training the plurality of local machine learning models with locally-stored data that is stored locally at that respective client, wherein the training at each client includes … updating respective weights for each of the plurality of local machine learning models". This limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g).

Further, the claim recites: "transferring the respective updated weights from each client to the server without transferring the locally-stored data of the clients". This limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g).

Finally, the claim recites: "at the server, training the plurality of server-maintained machine learning models with the updated weights sent from each of the clients". This limitation is an additional element that amounts to adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer in its ordinary capacity as a tool to perform an existing process. See MPEP § 2106.05(f). Alternatively, this limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g).

Step 2B, Significantly More: The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element "training neural networks with federated learning" generally links the use of the judicial exception to a particular technological environment or field of use. Elements that merely generally link the use of the judicial exception to a particular technological environment or field of use cannot provide an inventive concept.
Further, the additional elements "sending at least portions of a plurality of server-maintained machine learning models from a server to a plurality of clients, yielding a plurality of local machine learning models" and "transferring the respective updated weights from each client to the server without transferring the locally-stored data of the clients" amount to adding insignificant extra-solution activity to the judicial exception, and further, are directed to receiving or transmitting data over a network, which courts have recognized as well-understood, routine, and conventional when claimed in a generic manner. See MPEP § 2106.05(d)(II).

Further, the additional element "at each client, training the plurality of local machine learning models with locally-stored data that is stored locally at that respective client, wherein the training at each client includes … updating respective weights for each of the plurality of local machine learning models" amounts to adding insignificant extra-solution activity to the judicial exception, and further, is well-understood, routine, and conventional activity as supported under Berkheimer Option 2 by Ryden, U.S. Patent Application Publication No. 20240155714, Paragraphs 0036-0037: "The FL steps performed in a wireless communication system involving access nodes and wireless devices typically comprise [0037] 1) Local Model Training: After receiving the global/initial model, all UEs perform local training based on the global model to update their local weights".

Further, the additional element "at the server, training the plurality of server-maintained machine learning models with the updated weights sent from each of the clients" amounts to adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer in its ordinary capacity as a tool to perform an existing process. Elements that merely amount to adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely use a computer in its ordinary capacity as a tool to perform an existing process, cannot provide an inventive concept.

Alternatively, the additional element "at the server, training the plurality of server-maintained machine learning models with the updated weights sent from each of the clients" amounts to adding insignificant extra-solution activity to the judicial exception, and further, is well-understood, routine, and conventional activity as supported under Berkheimer Option 2 by Ryden, U.S. Patent Application Publication No. 20240155714, Paragraphs 0036-0038: "The FL steps performed in a wireless communication system involving access nodes and wireless devices typically comprise … 2) Global Model Aggregation: The gNB receives the local weights from each UE and updates the global model weights and then send back the updated global model weights to all the participants in the FL process". The claim is not patent eligible.

Regarding claim 2, the rejection of claim 1 is incorporated, and further, the claim recites: "selecting the plurality of server-maintained machine learning models from a pool of machine learning models". This limitation is the abstract idea of a mental process that can practically be performed in the human mind, with or without the use of a physical aid such as pen and paper (including an observation, evaluation, judgment, or opinion), in this case a judgment. See MPEP § 2106.04(a)(2)(III). Thus, the claim recites a judicial exception. The claim does not include any additional elements that amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.
Regarding claim 3 , the rejection of claim 2 is incorporated, and further, the claim recites: “ wherein the plurality of server-maintained machine learning models are selected from the pool based on resource limits associated with the plurality of clients ”. This limitation is a continuation of the “ selecting the plurality of server-maintained machine learning models from a pool of machine learning models ” limitation identified in the rejection of the parent claim. Thus, the claim recites a judicial exception. The claim does not include any additional elements that amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible. Regarding claim 4 , the rejection of claim 1 is incorporated, and further, the claim recites: “ wherein the training of the plurality of local machine learning models at each client includes determining a loss based on a regularization term and one or more latent features of the local machine learning models ”. This limitation recites mathematical concepts in addition to those identified in the rejection of the parent claim. Thus, the claim recites a judicial exception. The claim does not include any additional elements that amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible. Regarding claim 5 , the rejection of claim 4 is incorporated, and further, the claim recites: “wherein the loss is determined by: L i =loss( j=1 l i f W j D i , y i )+ λ i R( g W 1 i D i , g W 2 i D i ,…, g W l i i D i ) wherein ( D i , y i ) represents the locally-stored data at a particular one of the clients i, wherein λ is an adjustable hyperparameter corresponding to the regularization term R, and wherein g is the one or more latent features of the local machine models W 1 i , W 2 i , …, W l i i ”. 
This limitation recites mathematical concepts in addition to those identified in the parent claim, in this case a mathematical formula or equation: “a claim that recites a numerical formula or equation will be considered as falling within the "mathematical concepts" grouping. In addition, there are instances where a formula or equation is written in text format that should also be considered as falling within this grouping”. See MPEP § 2106.04(a)(2)(I)(B). The claim does not include any additional elements that amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.

Regarding claim 6, the rejection of claim 5 is incorporated, and further, the claim recites: “at the server, aggregating information received from the clients to perform the training of the plurality of server-maintained machine learning models with the updated weights”. This limitation recites mathematical concepts in addition to those identified in the rejection of the parent claim and thus recites a judicial exception. The claim does not include any additional elements that amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.

Regarding claim 7, the rejection of claim 6 is incorporated, and further, the claim recites: “wherein the aggregating includes performing:

$\min_{W_1, W_2, \ldots, W_L}\ \sum_{l=1}^{L} \frac{1}{|Z_l|} \sum_{i \in Z_l} \left\| W_l^{(i)} - W_l \right\|_F^2 + \lambda \left\| W_{\mathrm{con}}^{T} W_{\mathrm{con}} - I \right\|_F^2$

wherein $Z$ represents an index of the plurality of server-maintained machine learning models, and wherein $W_{\mathrm{con}}$ denotes the concatenated weights of the plurality of server-maintained machine learning models $W_1, W_2, \ldots, W_L$”.
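The aggregation quoted in claim 7 can be read as fitting each server model $W_l$ to the client copies assigned to it, plus an orthogonality penalty on the concatenated weights that pushes the server models toward diversity. A sketch of evaluating that objective follows; the exact index structure and all names are assumptions inferred from the claim text, not a definitive implementation.

```python
import numpy as np

def aggregation_objective(server_models, client_copies, lam=0.1):
    """sum_l (1/|Z_l|) * sum_{i in Z_l} ||W_l^(i) - W_l||_F^2
       + lam * ||W_con^T @ W_con - I||_F^2

    server_models: list of L weight matrices W_l.
    client_copies: list of L lists; client_copies[l] holds the updated
    copies W_l^(i) returned by the clients in Z_l."""
    fit = 0.0
    for W_l, copies in zip(server_models, client_copies):
        fit += sum(np.linalg.norm(W_i - W_l, "fro") ** 2 for W_i in copies) / len(copies)
    # Orthogonality regularizer on the concatenation W_con of all server models
    W_con = np.concatenate(server_models, axis=1)
    I = np.eye(W_con.shape[1])
    reg = np.linalg.norm(W_con.T @ W_con - I, "fro") ** 2
    return fit + lam * reg

# Three server models, each with two slightly perturbed client copies
rng = np.random.default_rng(0)
server = [rng.normal(size=(4, 2)) for _ in range(3)]
copies = [[W + 0.01 * rng.normal(size=W.shape) for _ in range(2)] for W in server]
val = aggregation_objective(server, copies)
```

Minimizing the first term alone recovers plain weight averaging per model; the second term is what distinguishes this objective from ordinary FedAvg aggregation.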
This limitation recites mathematical concepts in addition to those identified in the parent claim, in this case a mathematical formula or equation: “a claim that recites a numerical formula or equation will be considered as falling within the "mathematical concepts" grouping. In addition, there are instances where a formula or equation is written in text format that should also be considered as falling within this grouping”. See MPEP § 2106.04(a)(2)(I)(B). The claim does not include any additional elements that amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.

Regarding claim 8, the rejection of claim 1 is incorporated, and further, the claim recites: “determining that a first client of the plurality of clients is disconnected or otherwise unable to receive the at least portions of the plurality of server-maintained machine learning models from the server”. This limitation is the abstract idea of a mental process that can practically be performed in the human mind, with or without the use of a physical aid such as pen and paper (including an observation, evaluation, judgment, or opinion), in this case an observation. See MPEP § 2106.04(a)(2)(III). Further, the claim recites: “connecting the first client to a neighboring client that is able to communicate with the server”. This limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g). Further, the limitation is well-understood, routine, and conventional, as evidenced under Berkheimer Option 2 by Iyer et al., U.S. Patent Application Publication No.
20250036959, Paragraph 0024, Lines 31-34: “Typical electronic devices also include a set of one or more physical network interface(s) (NI(s)) to establish network connections (to transmit and/or receive code and/or data using propagating signals) with other electronic devices”. Further, the claim recites: “sending the portions of the plurality of server-maintained machine learning models from the neighboring client to the first client”. This limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g). Further, this limitation is directed to receiving or transmitting data over a network, which courts have recognized as well-understood, routine, and conventional when claimed in a generic manner. See MPEP § 2106.05(d)(II). The claim is not patent eligible.

Regarding claim 9, the rejection of claim 8 is incorporated, and further, the claim recites: “performing an interpolation of the portions of the plurality of server-maintained machine learning models received from the plurality of neighboring clients”. This limitation recites mathematical concepts in addition to those identified in the rejection of the parent claim. Thus, the claim recites a judicial exception. Further, the claim recites: “wherein the connecting includes connecting the first client to a plurality of neighboring clients”. This limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g). Further, the limitation is well-understood, routine, and conventional, as evidenced under Berkheimer Option 2 by Iyer et al., U.S. Patent Application Publication No.
20250036959, Paragraph 0024, Lines 31-34: “Typical electronic devices also include a set of one or more physical network interface(s) (NI(s)) to establish network connections (to transmit and/or receive code and/or data using propagating signals) with other electronic devices”. The claim is not patent eligible.

Regarding claim 10, the rejection of claim 9 is incorporated, and further, the claim recites: “wherein the interpolation is

$W^{+} = W + \sum_{i \in C_b} A_i \cdot (W_i - W),$

wherein $W^{+}$ is an interpolated model for models $W$ received by the plurality of neighboring clients $C_b$, and wherein $A_i$ is a linear combination weight for model $W_i$”. This limitation recites mathematical concepts in addition to those identified in the parent claim, in this case a mathematical formula or equation: “a claim that recites a numerical formula or equation will be considered as falling within the "mathematical concepts" grouping. In addition, there are instances where a formula or equation is written in text format that should also be considered as falling within this grouping”. See MPEP § 2106.04(a)(2)(I)(B). The claim does not include any additional elements that amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.

Regarding claim 11:

Step 1 (Statutory Category): Claim 11 is directed to a system, which falls under one of the four statutory categories.

Step 2A, Prong 1 (Judicial Exception): Claim 11 recites, in part, “determining a respective loss for each of the plurality of local machine learning models”. This limitation, under the broadest reasonable interpretation, covers the recitation of mathematical calculations: “a claim that recites a mathematical calculation, when the claim is given its broadest reasonable interpretation in light of the specification, will be considered as falling within the "mathematical concepts" grouping.
A mathematical calculation is a mathematical operation (such as multiplication) or an act of calculating using mathematical methods to determine a variable or number”. See MPEP § 2106.04(a)(2)(I)(C).

Step 2A, Prong 2 (Integration into a Practical Application): This judicial exception is not integrated into a practical application. In particular, the claim recites: “a system”, “memory storing instructions”, and “a plurality of processors”. These limitations are additional elements that amount to adding the words “apply it” (or an equivalent) to the judicial exception, to mere instructions to implement an abstract idea on a computer, or to mere use of a computer in its ordinary capacity as a tool to perform an existing process. See MPEP § 2106.05(f). Further, the claim recites: “training neural networks with federated learning”. This limitation is an additional element that generally links the use of the judicial exception to a particular technological environment or field of use. See MPEP § 2106.05(h). Further, the claim recites: “sending at least portions of a plurality of server-maintained machine learning models from a server to a plurality of clients, yielding a plurality of local machine learning models”. This limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g). Further, the claim recites: “at each client, training the plurality of local machine learning models with locally-stored data that is stored locally at that respective client, wherein the training at each client includes … updating respective weights for each of the plurality of local machine learning models”. This limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g).
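The interpolation recited in claim 10 above, $W^{+} = W + \sum_{i \in C_b} A_i \cdot (W_i - W)$, can be sketched as a weighted blend of the neighbor models' offsets from the current model. The array shapes and the choice of combination weights below are illustrative assumptions.

```python
import numpy as np

def interpolate(W, neighbor_models, A):
    """W+ = W + sum_{i in C_b} A_i * (W_i - W): blend the models received
    from neighboring clients C_b as weighted offsets from the current model W."""
    W_plus = W.copy()
    for A_i, W_i in zip(A, neighbor_models):
        W_plus += A_i * (W_i - W)
    return W_plus

# With uniform weights summing to 1, this reduces to a plain average
# of the neighbor models.
W = np.zeros((2, 2))
neighbors = [np.ones((2, 2)), 3 * np.ones((2, 2))]
W_plus = interpolate(W, neighbors, A=[0.5, 0.5])
```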
Further, the claim recites: “transferring the respective updated weights from each client to the server without transferring the locally-stored data of the clients”. This limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g). Finally, the claim recites: “at the server, training the plurality of server-maintained machine learning models with the updated weights sent from each of the clients”. This limitation is an additional element that amounts to adding the words “apply it” (or an equivalent) to the judicial exception, to mere instructions to implement an abstract idea on a computer, or to mere use of a computer in its ordinary capacity as a tool to perform an existing process. See MPEP § 2106.05(f). Alternatively, “at the server, training the plurality of server-maintained machine learning models with the updated weights sent from each of the clients” is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g).

Step 2B (Significantly More): The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements “a system”, “memory storing instructions”, and “a plurality of processors” amount to adding the words “apply it” (or an equivalent) to the judicial exception, to mere instructions to implement an abstract idea on a computer, or to mere use of a computer in its ordinary capacity as a tool to perform an existing process. Elements that merely amount to adding the words “apply it” (or an equivalent) to the judicial exception, to mere instructions to implement an abstract idea on a computer, or to mere use of a computer in its ordinary capacity as a tool to perform an existing process cannot provide an inventive concept.
Further, the additional element “training neural networks with federated learning” generally links the use of the judicial exception to a particular technological environment or field of use. Elements that merely generally link the use of the judicial exception to a particular technological environment or field of use cannot provide an inventive concept. Further, the additional elements “sending at least portions of a plurality of server-maintained machine learning models from a server to a plurality of clients, yielding a plurality of local machine learning models” and “transferring the respective updated weights from each client to the server without transferring the locally-stored data of the clients” amount to adding insignificant extra-solution activity to the judicial exception and, further, are directed to receiving or transmitting data over a network, which courts have recognized as well-understood, routine, and conventional when claimed in a generic manner. See MPEP § 2106.05(d)(II). Further, the additional element “at each client, training the plurality of local machine learning models with locally-stored data that is stored locally at that respective client, wherein the training at each client includes … updating respective weights for each of the plurality of local machine learning models” amounts to adding insignificant extra-solution activity to the judicial exception and, further, is well-understood, routine, and conventional, as evidenced under Berkheimer Option 2 by Ryden, U.S. Patent Application Publication No. 20240155714, Paragraphs 0036-0037: “The FL steps performed in a wireless communication system involving access nodes and wireless devices typically comprise [0037] 1) Local Model Training: After receiving the global/initial model, all UEs perform local training based on the global model to update their local weights”.
Further, the additional element “at the server, training the plurality of server-maintained machine learning models with the updated weights sent from each of the clients” amounts to adding the words “apply it” (or an equivalent) to the judicial exception, to mere instructions to implement an abstract idea on a computer, or to mere use of a computer in its ordinary capacity as a tool to perform an existing process. Elements that merely do so cannot provide an inventive concept. Alternatively, the additional element “at the server, training the plurality of server-maintained machine learning models with the updated weights sent from each of the clients” amounts to adding insignificant extra-solution activity to the judicial exception and, further, is well-understood, routine, and conventional, as evidenced under Berkheimer Option 2 by Ryden, U.S. Patent Application Publication No. 20240155714, Paragraphs 0036-0038: “The FL steps performed in a wireless communication system involving access nodes and wireless devices typically comprise … 2) Global Model Aggregation: The gNB receives the local weights from each UE and updates the global model weights and then send back the updated global model weights to all the participants in the FL process”. The claim is not patent eligible.

Regarding claim 12, the rejection of claim 11 is incorporated, and further, claim 12 is substantially similar to claim 2 and is rejected in the same manner, with the same reasoning applying.
Regarding claim 13, the rejection of claim 12 is incorporated, and further, the claim recites: “wherein the plurality of server-maintained machine learning models are selected from the pool and sent to the plurality of clients based on resource limits of the plurality of clients”. This limitation is a continuation of the mental process limitation of claim 12, and thus the claim recites a judicial exception. The claim does not include any additional elements that amount to an integration of the judicial exception into a practical application, nor to significantly more than the judicial exception. The claim is not patent eligible.

Regarding claim 14, the rejection of claim 11 is incorporated, and further, claim 14 is substantially similar to claim 4 and is rejected in the same manner, with the same reasoning applying.

Regarding claim 15, the rejection of claim 14 is incorporated, and further, claim 15 is substantially similar to claim 6 and is rejected in the same manner, with the same reasoning applying.

Regarding claim 16:

Step 1 (Statutory Category): Claim 16 is directed to a machine, which falls under one of the four statutory categories.

Step 2A, Prong 1 (Judicial Exception): Claim 16 recites, in part, “determining a respective loss for each of the plurality of local machine learning models”. This limitation, under the broadest reasonable interpretation, covers the recitation of mathematical calculations: “a claim that recites a mathematical calculation, when the claim is given its broadest reasonable interpretation in light of the specification, will be considered as falling within the "mathematical concepts" grouping. A mathematical calculation is a mathematical operation (such as multiplication) or an act of calculating using mathematical methods to determine a variable or number”. See MPEP § 2106.04(a)(2)(I)(C).
Step 2A, Prong 2 (Integration into a Practical Application): This judicial exception is not integrated into a practical application. In particular, the claim recites: “A non-transitory computer readable storage medium”, “a computer readable program code”, “computer readable instructions”, and “a computing system”. These limitations are additional elements that amount to adding the words “apply it” (or an equivalent) to the judicial exception, to mere instructions to implement an abstract idea on a computer, or to mere use of a computer in its ordinary capacity as a tool to perform an existing process. See MPEP § 2106.05(f). Further, the claim recites: “train neural networks with federated learning”. This limitation is an additional element that generally links the use of the judicial exception to a particular technological environment or field of use. See MPEP § 2106.05(h). Further, the claim recites: “sending at least portions of a plurality of server-maintained machine learning models from a server to a plurality of clients, yielding a plurality of local machine learning models”. This limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g). Further, the claim recites: “at each client, training the plurality of local machine learning models with locally-stored data that is stored locally at that respective client, wherein the training at each client includes … updating respective weights for each of the plurality of local machine learning models”. This limitation is an additional element that amounts to adding insignificant extra-solution activity to the judicial exception. See MPEP § 2106.05(g).

Prosecution Timeline

Jun 30, 2023: Application Filed
Mar 17, 2026: Non-Final Rejection under §101, §112, and §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602592: NOISE COMMUNICATION FOR FEDERATED LEARNING (granted Apr 14, 2026; 2y 5m to grant)
Patent 12596916: CONSTRAINED MASKING FOR SPARSIFICATION IN MACHINE LEARNING (granted Apr 07, 2026; 2y 5m to grant)


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 50%
With Interview: 99% (+58.3% lift)
Median Time to Grant: 3y 7m
PTA Risk: Low

Based on 14 resolved cases by this examiner. Grant probability derived from the career allow rate.
