DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This action is non-final.
This action is responsive to the claims filed on January 5, 2026. Claims 1-30 are pending in the application and have been examined. Claims 1-30 are rejected.
Response to Amendment
The amendment filed January 5, 2026 has been entered. Claims 1-30 remain pending in the application.
Response to Arguments
Regarding the 112(f) Arguments
Applicant Argues:
The Office noted that claims 27-29 are interpreted under 35 U.S.C. § 112(f) because they recite means-plus-function language. Office Action, pp. 4-5. Applicant reserves the right to provide further comment on the corresponding structures and does not concede the accuracy or completeness of the structures, if any, identified in the Office Action. Applicant also notes that the interpretation of such claims should not be limited only to cover the corresponding structure, material, or acts described in Applicant's specification, but should also include "equivalents thereof." See 35 U.S.C. § 112(f) (or 35 U.S.C. § 112, sixth paragraph); In re Donaldson Co., 16 F.3d 1189, 1193 (Fed. Cir. 1994).
Examiner's Response
Applicant's remarks are noted. The claims still recite elements in “means for…” format, thereby invoking 35 U.S.C. 112(f). Although corresponding structure is described in the specification, its presence does not remove the 112(f) invocation; rather, the limitations are construed to cover that structure and equivalents thereof. The claims therefore remain interpreted under 112(f).
Regarding the 101 Arguments
Applicant Argues:
Applicant argues that the claimed invention improves federated learning by reducing communication overhead and addressing non-IID data heterogeneity, and therefore integrates the alleged abstract idea into a practical application.
Applicant submits that federated learning is an established "technology or technical field" and that the subject matter of independent claims 1, 12, 23, and 27 provides a particular solution/improvement to federated learning. For example, at least the claimed features of "obtain[ing], from a first client device and a second client device, respective updated model parameters for a machine learning model," "determin[ing] a first data heterogeneity level associated with input data for training the machine learning model at the first client device and the second client device, wherein the first data heterogeneity level is based on the respective updated model parameters received from the first client device and the second client device," and "configur[ing], using the first data heterogeneity level, the first client device and the second client device with a first data aggregation period associated with a number of training cycles, within a communication round between the apparatus and the first client device and a communication round between the apparatus and the second client device, for training the machine learning model at the first client device and the second client device" provide a solution to the problem of "federated training ... incur[ring] higher communication costs based on the number of client devices transmitting the updated machine learning model or more specifically, neural network parameter updates to the centralized parameter server" and/or "local models trained by client devices ... deviat[ing] from centralized training and/or each other because of data heterogeneity (e.g., the training data may not be independent and identically distributed (IID) among client devices)." See, e.g., US 2023/0297875 A1 (Applicant's Published Application), paras. [0028], [0030], [0117]-[0120], among others. For example, the claimed features involve an apparatus/network entity solving such a problem by configuring first and second client devices with a particular data aggregation period determined using a data heterogeneity level of input data for training a machine learning model.
Applicant submits that the above clearly "provide[s] an indication that the claimed invention provides an improvement" and "show[s] where in the specification a technical problem and explanation of an unconventional solution" is provided, contrary to the assertion in the Office Action. See Office Action, pp. 5-6.
Based on at least the above remarks, the subject matter of independent claims 1, 12, 23, and 27 is directed to statutory subject matter. Accordingly, Applicant requests withdrawal of the rejection under 35 U.S.C. § 101.
Examiner's Response
Although the claims are directed to a federated learning context, the recited steps of obtaining model parameters, determining a heterogeneity level based on the received parameters, configuring a data aggregation period associated with a number of training cycles, and combining updated model parameters recite mathematical analysis and data manipulation operations performed using generic processor and memory components.
The claim does not recite a specific technological improvement to computer functionality, network architecture, or communication protocol. Rather, it recites result-oriented functional language for adjusting training parameters based on analyzed data. Improving the efficiency of a machine learning model or training process constitutes an improvement to an abstract mathematical process, not the functioning of the computer itself.
Furthermore, the additional elements of memory and processor performing the recited functions are generic computing components executing conventional data processing operations. The claim does not recite any unconventional hardware or technological implementation beyond the abstract data analysis itself. Therefore, the claim does not include an inventive concept sufficient to transform the abstract idea into patent-eligible subject matter.
Accordingly, the claim does not integrate the abstract idea into a practical application.
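For illustration only, the recited operations can be reduced to a few lines of routine numerical computation on generic hardware. The following minimal sketch is hypothetical (the names, the particular dispersion measure, and the threshold are the examiner's assumptions, not Applicant's disclosed implementation):

    import statistics

    def federated_round(updates_a, updates_b, threshold=0.1):
        # "determine a first data heterogeneity level": a dispersion
        # measure over the two clients' updated parameters (mathematics)
        heterogeneity = statistics.mean(
            abs(a - b) for a, b in zip(updates_a, updates_b))
        # "configure ... a first data aggregation period": select a number
        # of training cycles based on the computed level (an evaluation)
        period = 5 if heterogeneity > threshold else 10
        # "combine the ... updated model parameters": element-wise averaging
        combined = [(a + b) / 2 for a, b in zip(updates_a, updates_b)]
        return heterogeneity, period, combined

Each step is ordinary arithmetic or a comparison; none requires anything other than a generic processor and memory.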
Regarding the 102 Arguments:
Applicant’s arguments, see pages 13-18, filed January 5, 2026, with respect to the rejection of claims 1, 5-12, 16-23, 26, 27, and 30 under 35 U.S.C. 102 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made in view of an updated search necessitated by the amended claim language and Applicant’s arguments, which yielded new art that collectively teaches or suggests the limitations of the pending claims.
Regarding the 103 Arguments:
Applicant’s arguments, see page 18, filed January 5, 2026, with respect to the rejection of claims 2-4, 13-15, 24-25, and 28-29 under 35 U.S.C. 103 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made in view of an updated search necessitated by the amended claim language and Applicant’s arguments, which yielded new art that collectively teaches or suggests the limitations of the pending claims.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that use the word “means” and are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses the word “means” coupled with functional language without reciting sufficient structure to perform the recited function. Such claim limitation(s) is/are:
means for determining a first heterogeneity level, as recited in claim 27, invokes 112(f) and is interpreted as a variance or dispersion algorithm, or equivalents thereof, performing the claimed function, as supported in paragraph [0116] of Applicant's Specification.
means for determining... a first aggregation period, as recited in claim 27, invokes 112(f) and is interpreted as a period-selection algorithm, or equivalents thereof, performing the claimed function, as supported in paragraphs [0117]-[0119] of Applicant's Specification.
means for obtaining the first and second set of updated model parameters, as recited in claim 27, invokes 112(f) and is interpreted as an antenna, a demodulator, or equivalents thereof, performing the claimed function, as supported in paragraph [0136] of Applicant's Specification.
means for combining the first and second set of updated model parameters, as recited in claim 27, invokes 112(f) and is interpreted as an averaging algorithm, or equivalents thereof, performing the claimed function, as supported in paragraphs [0128]-[0138] of Applicant's Specification.
means for determining a second heterogeneity level, as recited in claim 28, invokes 112(f) and is interpreted as a variance or dispersion algorithm, or equivalents thereof, performing the claimed function, as supported in paragraph [0139] of Applicant's Specification.
means for determining... a second data aggregation period, as recited in claim 28, invokes 112(f) and is interpreted as a period-selection algorithm, or equivalents thereof, performing the claimed function, as supported in paragraphs [0139]-[0140] of Applicant's Specification.
means for sending the first and second set of updated model parameters, as recited in claim 29, invokes 112(f) and is interpreted as a transmitter, an antenna, or equivalents thereof, performing the claimed function, as supported in paragraphs [0140]-[0142] of Applicant's Specification.
means for receiving a first and second decoded set of updated model parameters, as recited in claim 29, invokes 112(f) and is interpreted as a demodulator, or equivalents thereof, performing the claimed function, as supported in paragraphs [0140]-[0142] of Applicant's Specification.
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
To determine if a claim is directed to patent ineligible subject matter, the Court has guided the Office to apply the Alice/Mayo test, which requires:
Step 1: Determining if the claim falls within a statutory category.
Step 2A: Determining if the claim is directed to a patent-ineligible judicial exception, i.e., a law of nature, a natural phenomenon, or an abstract idea. Step 2A is a two-prong inquiry. MPEP 2106.04(II)(A). Under the first prong, examiners evaluate whether a law of nature, natural phenomenon, or abstract idea is set forth or described in the claim. Abstract ideas include mathematical concepts, certain methods of organizing human activity, and mental processes. MPEP 2106.04(a)(2). The second prong is an inquiry into whether the claim integrates a judicial exception into a practical application. MPEP 2106.04(d).
Step 2B: If the claim is directed to a judicial exception, determining if the claim recites limitations or elements that amount to significantly more than the judicial exception. (See MPEP 2106).
Claims 1-30 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1: Claims 1-11 are directed to an apparatus (a machine), Claims 12-22 are directed to a method (a process), Claims 23-26 are directed to a computer-readable medium (a manufacture), and Claims 27-30 are directed to an apparatus (a machine). Therefore, Claims 1-30 are each directed to one of the statutory categories of a process, machine, manufacture, or composition of matter.
Regarding claim 1
Step 2A Prong 1
Claim 1 recites the following mental processes, each of which, under the broadest reasonable interpretation, covers performance of the limitation in the mind (including an observation, evaluation, judgment, or opinion) or with the aid of pencil and paper but for the recitation of generic computer components (e.g., “memory”, “processor”, “machine learning model”, and “federated learning”) [see MPEP 2106.04(a)(2)(III)].
“determine a first data heterogeneity level associated with input data…” (e.g., a human can evaluate data values and determine a metric from received parameters)
“configure, using the first data heterogeneity level, the first client device and the second client device with a first data aggregation period …” (e.g., a human can select a parameter value (a number of training cycles) based on an analysis)
Claim 1 recites the following mathematical concepts, each of which, under the broadest reasonable interpretation, covers performance of mathematical relationships, mathematical formulas or equations, and mathematical calculations but for the recitation of generic computer components (e.g., “memory”, “processor”, “machine learning model”, and “federated learning”) [see MPEP 2106.04(a)(2)(I)].
“combine the first set of updated model parameters and the second set of updated model parameters to yield a first combined set of updated model parameters” (e.g., averaging, weighted aggregation)
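By way of a hypothetical worked example (not drawn from the record): straight averaging of two parameter sets (0.2, 0.8) and (0.6, 0.4) yields ((0.2 + 0.6)/2, (0.8 + 0.4)/2) = (0.4, 0.6), a calculation readily performed in the mind or with pencil and paper.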
Accordingly, at Step 2A, prong one, the claim is directed to an abstract idea.
Step 2A Prong 2
The judicial exception is not integrated into a practical application. In particular, the claim recites the additional elements of a “memory”, “processor”, “machine learning model”, and “federated learning”, which are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components (see MPEP 2106.05(f)). In particular, the recited “processor” is merely a generic computer component, because it is recited as performing the function of “training the machine learning model,” and the claims do not recite any particular structure for how such a “processor” is implemented.
Regarding the “obtain, from a first client device and a second client device, respective updated model parameters for a machine learning model” limitation, this additional element is recited at a high level of generality and amounts to extra-solution activity of obtaining data to input to a model, i.e., pre-solution activity of data gathering (e.g., obtaining information for processing in a computer system) (see MPEP 2106.05(g)).
Regarding the “…for training the machine learning model at the first client device and the second client device, wherein the first data heterogeneity level is based on the respective updated model parameters received from the first client device and the second client device” and “… associated with a number of training cycles, within a communication round between the apparatus and the first client device and a communication round between the apparatus and the second client device, for training the machine learning model at the first client device and the second client device” limitations, these merely recite that the input data comes from two devices and that the measure is based on the received parameters. This merely applies the abstract idea using generic computer components (see MPEP 2106.05(f)).
Regarding the “obtain a first set of updated model parameters from the first client device and a second set of updated model parameters from the second client device, wherein the first set of updated model parameters and the second set of updated model parameters are based on the first data aggregation period” limitation, this additional element is recited at a high level of generality and amounts to extra-solution activity of receiving data, i.e., pre-solution activity of gathering data for use in the claimed process (see MPEP 2106.05(g)).
Accordingly, at Step 2A, prong two, the additional elements individually or in combination do not integrate the judicial exception into a practical application.
Step 2B
In accordance with Step 2B, the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above, the additional elements of a “memory”, “processor”, and “machine learning model” are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components (see MPEP 2106.05(f)).
Regarding the “obtain, from a first client device and a second client device, respective updated model parameters for a machine learning model” limitation, this additional element is recited at a high level of generality and amounts to extra-solution activity of obtaining data to input to a model, i.e., pre-solution activity of data gathering. The courts have found limitations directed to obtaining information electronically, recited at a high level of generality, to be well-understood, routine, and conventional (see MPEP 2106.05(d)(II), “receiving or transmitting data over a network,” “electronic record keeping,” and “storing and retrieving information in memory”).
Regarding the “…for training the machine learning model at the first client device and the second client device, wherein the first data heterogeneity level is based on the respective updated model parameters received from the first client device and the second client device” and “… associated with a number of training cycles, within a communication round between the apparatus and the first client device and a communication round between the apparatus and the second client device, for training the machine learning model at the first client device and the second client device” limitations, these merely apply the abstract idea using generic computer components (see MPEP 2106.05(f)).
Regarding the “obtain a first set of updated model parameters from a first client device and a second set of updated model parameters from a second client device, wherein the first set of updated model parameters and the second set of updated model parameters are based on the first data aggregation period” limitation, this additional element is recited at a high level of generality and amounts to extra-solution activity of receiving data, i.e., pre-solution activity of gathering data for use in the claimed process (see MPEP 2106.05(g)).
Accordingly, at Step 2B, the additional elements, individually or in combination, do not amount to significantly more than the judicial exception.
Regarding Claim 2
Step 2A Prong 1
Claim 2 inherits the same abstract ideas as claim 1 and further recites the following mental processes, each of which, under the broadest reasonable interpretation, covers performance of the limitation in the mind (including an observation, evaluation, judgment, or opinion) or with the aid of pencil and paper but for the recitation of generic computer components (e.g., “processor” and “machine learning model”) [see MPEP 2106.04(a)(2)(III)].
“determine that a second data heterogeneity level associated with input data for training the machine learning model is less than the first data heterogeneity level” (e.g., a human can compare two values to judge which is lower)
“determine, based on the second data heterogeneity level, a second data aggregation period” (e.g., a human can create a schedule based on differing values)
Accordingly, at Step 2A, prong one, the claim is directed to an abstract idea.
Step 2A Prong 2
The judicial exception is not integrated into a practical application. In particular, the claim recites the additional element of a “processor”, which is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)).
Accordingly, at Step 2A, prong two, the additional elements individually or in combination do not integrate the judicial exception into a practical application.
Step 2B
In accordance with Step 2B, the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above, the additional element of a “processor” is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)).
Accordingly, at Step 2B, the additional elements, individually or in combination, do not amount to significantly more than the judicial exception.
Regarding Claim 3
Step 2A Prong 1
Claim 3 does not introduce a new judicial exception, but it inherits the same abstract ideas as claim 1.
Accordingly, at Step 2A, prong one, the claim is directed to an abstract idea.
Step 2A Prong 2
The judicial exception is not integrated into a practical application. In particular, the claim recites the additional element of a “processor”, which is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)).
Regarding the “send the first set of updated model parameters and the second set of updated model parameters to a network entity” limitation, this additional element is recited at a high level of generality and amounts to extra-solution activity of outputting refined data, i.e., post-solution activity of outputting data for use in the claimed process (see MPEP 2106.05(g)).
Regarding the “receive a first decoded set of updated model parameters and a second decoded set of updated model parameters from the network entity” limitation, this additional element is recited at a high level of generality and amounts to extra-solution activity of receiving data, i.e., pre-solution activity of selecting a particular data source or type of data to be manipulated for use in the claimed process (see MPEP 2106.05(g)).
Accordingly, at Step 2A, prong two, the additional elements individually or in combination do not integrate the judicial exception into a practical application.
Step 2B
In accordance with Step 2B, the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above, the additional element of a “processor” is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)).
Regarding the “send the first set of updated model parameters and the second set of updated model parameters to a network entity” limitation, this additional element is recited at a high level of generality and amounts to extra-solution activity of outputting refined data, i.e., post-solution activity of outputting data for use in the claimed process (see MPEP 2106.05(g)).
Regarding the “receive a first decoded set of updated model parameters and a second decoded set of updated model parameters from the network entity” limitation, as discussed above, this additional element is recited at a high level of generality and amounts to extra-solution activity of receiving data, i.e., pre-solution activity of selecting a particular data source or type of data to be manipulated for use in the claimed process. The courts have found limitations directed to obtaining information electronically, recited at a high level of generality, to be well-understood, routine, and conventional (see MPEP 2106.05(d)(II), “receiving or transmitting data over a network,” “electronic record keeping,” and “storing and retrieving information in memory”).
Accordingly, at Step 2B, the additional elements, individually or in combination, do not amount to significantly more than the judicial exception.
Regarding Claim 4
Step 2A Prong 1
Claim 4 does not introduce a new judicial exception, but it inherits the same abstract ideas as claim 1.
Accordingly, at Step 2A, prong one, the claim is directed to an abstract idea.
Step 2A Prong 2
The judicial exception is not integrated into a practical application. In particular, the claim recites the additional limitation that the “apparatus corresponds to a radio unit (RU) and the network entity corresponds to a distributed unit (DU),” which amounts to no more than generally linking the use of a judicial exception to a particular technological environment or field of use. As explained by the Supreme Court, a claim directed to a judicial exception cannot be made eligible "simply by having the applicant acquiesce to limiting the reach of the patent for the formula to a particular technological use." Diamond v. Diehr, 450 U.S. 175, 192 n.14, 209 USPQ 1, 10 n.14 (1981). Thus, limitations that amount to merely indicating a field of use or technological environment in which to apply a judicial exception do not amount to significantly more than the exception itself and cannot integrate a judicial exception into a practical application.
Accordingly, at Step 2A, prong two, the additional elements individually or in combination do not integrate the judicial exception into a practical application.
Step 2B
In accordance with Step 2B, the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above, the additional limitation that the “apparatus corresponds to a radio unit (RU) and the network entity corresponds to a distributed unit (DU)” merely generally links the use of a judicial exception to a particular technological environment or field of use (see MPEP 2106.05(h)).
Accordingly, at Step 2B, the additional elements, individually or in combination, do not amount to significantly more than the judicial exception.
Regarding Claim 5
Step 2A Prong 1
Claim 5 does not introduce a new judicial exception, but it inherits the same abstract ideas as claim 1.
Accordingly, at Step 2A, prong one, the claim is directed to an abstract idea.
Step 2A Prong 2
The judicial exception is not integrated into a practical application. In particular, the claim recites the additional element of a “processor”, which is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)).
Regarding the “send the first combined set of updated model parameters to a network entity for aggregation with a second combined set of updated model parameters, wherein the network entity is upstream from the apparatus” limitation, this additional element is recited at a high level of generality and amounts to extra-solution activity of outputting refined data, i.e., post-solution activity of outputting data for use in the claimed process (see MPEP 2106.05(g)).
Accordingly, at Step 2A, prong two, the additional elements individually or in combination do not integrate the judicial exception into a practical application.
Step 2B
In accordance with Step 2B, the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above, the additional element of a “processor” is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)).
Regarding the “send the first combined set of updated model parameters to a network entity for aggregation with a second combined set of updated model parameters, wherein the network entity is upstream from the apparatus” limitation, this additional element is recited at a high level of generality and amounts to extra-solution activity of outputting refined data, i.e., post-solution activity of outputting data for use in the claimed process (see MPEP 2106.05(g)).
Accordingly, at Step 2B, the additional elements, individually or in combination, do not amount to significantly more than the judicial exception.
Regarding Claim 6
Step 2A Prong 1
Claim 6 inherits the same abstract ideas as claim 1 and further recites the following mathematical concepts, each of which, under the broadest reasonable interpretation, involves mathematical relationships, formulas, calculations, or algorithms implemented using generic computer components (e.g., “processor” and “machine learning model”) [see MPEP 2106.04(a)(2)(I)].
“average the first set of updated model parameters and the second set of updated model parameters” (e.g., modifying a set of parameters via a mathematical operation)
Accordingly, at Step 2A, prong one, the claim is directed to an abstract idea.
Step 2A Prong 2
The judicial exception is not integrated into a practical application. In particular, the claim recites the additional element of a “processor”, which is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)).
Accordingly, at Step 2A, prong two, the additional elements individually or in combination do not integrate the judicial exception into a practical application.
Step 2B
In accordance with Step 2B, the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above, the additional element of a “processor” is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)).
Accordingly, at Step 2B, the additional elements, individually or in combination, do not amount to significantly more than the judicial exception.
Regarding Claim 7
Step 2A Prong 1
Claim 7 does not introduce a new judicial exception, but it inherits the same abstract ideas as claim 1.
Accordingly, at Step 2A, prong one, the claim is directed to an abstract idea.
Step 2A Prong 2
The judicial exception is not integrated into a practical application. In particular, the claim recites the additional limitation that “the first set of updated model parameters and the second set of updated model parameters are combined using a same shared channel,” which amounts to no more than generally linking the use of a judicial exception to a particular technological environment or field of use. As explained by the Supreme Court, a claim directed to a judicial exception cannot be made eligible "simply by having the applicant acquiesce to limiting the reach of the patent for the formula to a particular technological use." Diamond v. Diehr, 450 U.S. 175, 192 n.14, 209 USPQ 1, 10 n.14 (1981). Thus, limitations that amount to merely indicating a field of use or technological environment in which to apply a judicial exception do not amount to significantly more than the exception itself and cannot integrate a judicial exception into a practical application.
Accordingly, at Step 2A, prong two, the additional elements individually or in combination do not integrate the judicial exception into a practical application.
Step 2B
In accordance with Step 2B, the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above, the additional limitation that “the first set of updated model parameters and the second set of updated model parameters are combined using a same shared channel” merely generally links the use of a judicial exception to a particular technological environment or field of use (see MPEP 2106.05(h)).
Accordingly, at Step 2B, the additional elements, individually or in combination, do not amount to significantly more than the judicial exception.
Regarding Claim 8
Step 2A Prong 1
Claim 8 does not introduce a new judicial exception, but it inherits the same abstract ideas as claim 1.
Accordingly, at Step 2A, prong one, the claim is directed to an abstract idea.
Step 2A Prong 2
The judicial exception is not integrated into a practical application. In particular, the claim recites the additional element of “the input data corresponds to data that is not independently and identically distributed,” which is recited at a high level of generality and amounts to extra-solution activity of data preparation, i.e., pre-solution activity of selecting a particular data source or type of data to be manipulated for use in the claimed process (see MPEP 2106.05(g)).
Accordingly, at Step 2A, prong two, the additional elements individually or in combination do not integrate the judicial exception into a practical application.
Step 2B
In accordance with Step 2B, the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above, the additional element of “the input data corresponds to data that is not independently and identically distributed” is recited at a high level of generality and amounts to extra-solution activity of receiving data, i.e., pre-solution activity of selecting a particular data source or type of data to be manipulated for use in the claimed process. The courts have found limitations directed to obtaining information electronically, recited at a high level of generality, to be well-understood, routine, and conventional (see MPEP 2106.05(d)(II), “receiving or transmitting data over a network,” “electronic record keeping,” and “storing and retrieving information in memory”).
Accordingly, at Step 2B, the additional elements, individually or in combination, do not amount to significantly more than the judicial exception.
Regarding Claim 9
Step 2A Prong 1
Claim 9 does not introduce a new judicial exception, but it inherits the same abstract ideas as claim 1.
Accordingly, at Step 2A, prong one, the claim is directed to an abstract idea.
Step 2A Prong 2
The judicial exception is not integrated into a practical application. In particular, the claim recites the additional element of “the first set of updated model parameters and the second set of updated model parameters correspond to a single layer of a plurality of layers of the machine learning model,” which is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)). In particular, the recited “machine learning model” is merely a generic computer component, because it is recited as implementing the “first” and “second set of updated model parameters,” and the claims do not recite any particular structure for how such a “machine learning model” is implemented.
Accordingly, at Step 2A, prong two, the additional elements individually or in combination do not integrate the judicial exception into a practical application.
Step 2B
In accordance with Step 2B, the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above, the additional element of “the first set of updated model parameters and the second set of updated model parameters correspond to a single layer of a plurality of layers of the machine learning model” is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)).
Accordingly, at Step 2B, the additional elements, individually or in combination, do not amount to significantly more than the judicial exception.
Regarding Claim 10
Step 2A Prong 1
Claim 10 does not introduce a new judicial exception, but it inherits the same abstract ideas as claim 1.
Accordingly, at Step 2A, prong one, the claim is directed to an abstract idea.
Step 2A Prong 2
The judicial exception is not integrated into a practical application. In particular, the claim recites the additional element of a “processor”, which is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)).
Regarding the “update the machine learning model based on the first combined set of updated model parameters to yield a modified machine learning model” limitation, this additional element is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)).
Accordingly, at Step 2A, prong two, the additional elements individually or in combination do not integrate the judicial exception into a practical application.
Step 2B
In accordance with Step 2B, the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above, the additional element of a “processor” is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)).
Regarding the “update the machine learning model based on the first combined set of updated model parameters to yield a modified machine learning model” limitation, this additional element is recited at a high level of generality and amounts to no more than adding the words “apply it” (or an equivalent) to the judicial exception, mere instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea. Accordingly, this additional element does not add significantly more than the judicial exception (see MPEP 2106.05(f)).
Accordingly, at Step 2B, the additional elements, individually or in combination, do not amount to significantly more than the judicial exception.
Regarding Claim 11
Step 2A Prong 1
Claim 11 inherits the same abstract ideas as claim 1 and further recites the following mathematical concepts, each of which, under the broadest reasonable interpretation, involves mathematical relationships, formulas, calculations, or algorithms implemented using generic computer components (e.g., “memory”, “processor”, “machine learning model”) [see MPEP 2106.04(a)(2)(I)].
“the first data heterogeneity level is based on at least one of a variance and a dispersion among the respective updated model parameters” (e.g., mathematical operations performed on input data, including variance and dispersion calculations)
Accordingly, at Step 2A, prong one, the claim is directed to an abstract idea.
Step 2A Prong 2
The judicial exception is not integrated into a practical application. In particular, the claim recites the additional element that the “first data heterogeneity level is based on at least one of a variance and a dispersion among the respective updated model parameters received from the first client device and the second client device,” which is recited at a high level of generality such that it amounts to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)).
Accordingly, at Step 2A, prong two, the additional elements individually or in combination do not integrate the judicial exception into a practical application.
Step 2B
In accordance with Step 2B, the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above, the additional elements of a “processor” and the “first data heterogeneity level is based on at least one of a variance and a dispersion among the respective updated model parameters received from the first client device and the second client device” limitation are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)).
Accordingly, at Step 2B, the additional elements, individually or in combination, do not amount to significantly more than the judicial exception.
Regarding claims 12-22
Claims 12-22 recite a method. Each of these claims corresponds directly to the apparatus limitations of claims 1-11, respectively, with the addition of generic hardware components such as a memory and a processor, which are insufficient to render the claims subject-matter eligible for the same reasons described above.
Regarding Claims 23-25
Claims 23-25 recite a computer-readable storage medium. Each of these claims corresponds directly to the limitations of claims 1-3, respectively, with the addition of generic hardware components such as a computer-readable storage medium and a processor, which are insufficient to render the claims subject-matter eligible for the same reasons described above.
Regarding claim 26
Claim 26 recites a computer-readable storage medium and corresponds directly to the limitations of claim 5, with the addition of generic hardware components such as a computer-readable storage medium and a processor, which are insufficient to render the claim subject-matter eligible for the same reasons described above.
Regarding Claims 27-29
Claims 27-29 recite an apparatus. Each of these claims corresponds directly to the limitations of claims 1-3, respectively, with the addition of generic hardware components such as a memory and a processor, which are insufficient to render the claims subject-matter eligible for the same reasons described above.
Regarding claim 30
Claim 30 recites an apparatus and corresponds directly to the limitations of claim 7, with the addition of generic hardware components such as a memory and a processor, which are insufficient to render the claim subject-matter eligible for the same reasons described above.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 5-12, 16-23, 26, 27, and 30 are rejected under 35 U.S.C. 103 as being unpatentable over Akdeniz et al. (US 20230068386 A1, referred to as Akdeniz) in view of Li et al. ("Federated Optimization in Heterogeneous Networks," referred to as Li), and further in view of Zeng et al. ("Local Epochs Inefficiency Caused by Device Heterogeneity in Federated Learning," referred to as Zeng).
Regarding claim 1, Akdeniz teaches an apparatus for performing federated learning, comprising:
at least one memory comprising instructions ([0020-0030] and [0040-0045]: Describes a master node (central server/MEC system 200) coordinating distributed or federated model training among heterogeneous edge devices (UEs 01, MEC servers 201); the process is called distributed gradient descent, corresponding to federated learning. FIG. 6 and [0077-0078]: Describes a system that comprises “edge provisioning node 644 includes one or more servers and one or more storage devices” and “the processor platform(s) that execute the computer readable instructions”.); and
at least one processor configured to execute the instructions and cause the apparatus to (FIG. 8 and [0083-0088]: Describes an edge computing node 850 and a wireless transceiver 866 to implement radio functions; “edge computing device 850 may include processing circuitry in the form of a processor 852,” and the device may store instructions to be executed in the system/apparatus: “The processor 852 may communicate with a system memory 854 over an interconnect 856 (e.g., a bus) through an interconnect interface 853 of the processor.”):
obtain, from a first client device and a second client device, respective updated model parameters for a machine learning model ([0127-0128] and [0186]: Describes a federated learning system in which multiple clients (edge devices) perform local training and provide “updates”/“weight values”/“locally updated models” to a server (MEC/central server), which corresponds to obtaining those client-provided updates in order to aggregate them into a global model. Each client computing node obtains a global model, updates aspects of the global model (model parameters or neural network weights) using its local data, and communicates the updates to the central server, which aggregates the received updates and updates model weight values based on an average of the weight values received from the clients. The model updates include updated values for neural network nodes, and the aggregated node weight values are used for subsequent implementations of the model. The clients perform multiple local epochs of training on their respective local raw data, and the MEC receives the locally updated models. The data across edge devices is non-IID, and this is used to account for such differences across clients during training.):
Although Akdeniz teaches obtaining, from a first client device and a second client device, respective updated model parameters for a machine learning model, and receiving updated model parameters from client devices in a federated learning environment, it does not teach determining a data heterogeneity level, or that such a heterogeneity level is based on the respective updated model parameters received from the client devices.
Li teaches determine a first data heterogeneity level associated with input data for training the machine learning model at the first client device and the second client device (Pages 5-6, Section 4.1; Page 10, Section 5.3.3; Figure 2: Describes that statistical heterogeneity (non-IID data) can be characterized by a scalar dissimilarity metric derived from local gradient information, and tracks gradient variance to measure divergence among client updates. Local gradients are computed from client training data, and divergence among local gradients reflects differences in client data distributions (statistical heterogeneity).)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined Akdeniz's federated learning MEC system with Li's gradient-based data heterogeneity metrics. Doing so would have enabled the system to quantify statistical heterogeneity among client devices, improving convergence, stability, and adaptive scheduling in the federated learning process.
Akdeniz in view of Li further teaches wherein the first data heterogeneity level is based on the respective updated model parameters received from the first client device and the second client device (the scalar dissimilarity metric B in Li represents a data heterogeneity level derived from client model update information; in the federated learning system of Akdeniz, client devices transmit updated model parameters (gradients or weight updates) to the server).
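For illustration of the type of scalar dissimilarity metric relied upon from Li, the following is a minimal sketch; the function name, the use of plain Python lists, and the weighting scheme are the examiner's assumptions, while the ratio itself follows Li's B-local-dissimilarity definition (B = 1 when client data are IID):

    import math

    def local_dissimilarity(client_grads, weights):
        # `client_grads`: per-client gradient vectors (plain lists);
        # `weights`: per-client fractions assumed to sum to 1
        dim = len(client_grads[0])
        # Aggregate gradient: weighted average of the client gradients
        agg = [sum(w * g[i] for w, g in zip(weights, client_grads))
               for i in range(dim)]
        # Expected squared norm of the local (per-client) gradients
        exp_sq = sum(w * sum(x * x for x in g)
                     for w, g in zip(weights, client_grads))
        agg_sq = sum(x * x for x in agg)
        # B = sqrt(E_k ||grad F_k||^2 / ||grad f||^2); B grows as client
        # updates diverge, reflecting greater statistical heterogeneity
        return math.sqrt(exp_sq / agg_sq) if agg_sq > 0 else float("inf")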
Akdeniz in view of Li teaches configure, using the first data heterogeneity level, the first client device and the second client device with a first data aggregation period associated with a number of training cycles, within a communication round between the apparatus and the first client device and a communication round between the apparatus and the second client device (Akdeniz [0186]: Describes a controller that communicates the global model, and clients that carry out multiple local epochs; the MEC receives the locally updated models. This shows that the controller/server orchestrates how clients train per global epoch/round and configures client behavior at least at the level of training/round structure (global epoch, to local epochs, to returned updates).)
Although Akdeniz in view of Li teaches configuring the first client device and the second client device, it does not teach doing so using the first data heterogeneity level, with a first data aggregation period associated with a number of training cycles, within a communication round between the apparatus and the first client device and a communication round between the apparatus and the second client device.
Zeng teaches using the first data heterogeneity level, … with a first data aggregation period associated with a number of training cycles, within a communication round between the apparatus and the first client device and a communication round between the apparatus and the second client device (Page 1, Abstract: Describes that client heterogeneity causes inconsistent training speeds and that the time cost of clients' local training reflects such heterogeneity and can be used to guide dynamic setting of local epochs (training cycles). Page 6, Algorithm 2: Describes that a client predicts the training time of one local epoch Ttrain and computes the number of local epochs as Ei = T/Ttrain, and then performs local training for epochs 1 through Ei before returning updates to the server. Pages 10-11, Section 3.4: Describes dynamically setting the number of local epochs (training cycles) for a client within a given communication round/time window between the client and the server. Specifically, it shows that the number of local epochs may be calculated according to a preset communication time window T between the client and the server. This corresponds to configuring a client with a data aggregation period (communication window T / per-round sync window) that is associated with a number of training cycles (local epochs Ei) within a communication round between server and client.)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined Akdeniz's federated learning MEC system, as modified by Li, with Zeng's technique for setting a client's number of local epochs within a communication time window. Doing so would have enabled the system to reduce inefficiency caused by heterogeneous client training speeds and to improve round completion efficiency by configuring clients with an appropriate number of local training cycles per round.
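A minimal sketch of Zeng's dynamic local-epoch setting (Algorithm 2), as mapped above; the function and variable names are the examiner's assumptions, while the computation Ei = T/Ttrain follows the cited disclosure:

    def local_epochs(window_T, predicted_epoch_time):
        # E_i = T / T_train: the number of local epochs (training cycles)
        # a client can complete within the per-round communication window T
        return max(1, int(window_T // predicted_epoch_time))

    # e.g., a 60 s window and a predicted 7.5 s per epoch give 8 local epochs
    assert local_epochs(60.0, 7.5) == 8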
Akdeniz in view of Li, in view of Zeng, further teaches for training the machine learning model at the first client device and the second client device (Akdeniz [0186]: Describes that during training for a given global epoch, the controller communicates the global model to participating client devices, and the clients perform multiple local epochs of training on their local raw data, producing locally updated models that are received by the MEC, corresponding to training the machine learning model at the client devices.);
obtain a first set of updated model parameters from a first client device and a second set of updated model parameters from a second client device, wherein the first set of updated model parameters and the second set of updated model parameters are based on the first data aggregation period (Akdeniz FIG. 15 and [0194-0195]: Describes “At operation 1530, the client computing nodes 1591 transmit the locally updated models to the controller node 1593,” wherein each client sends its own updated model parameters, and further details that the operations are repeated until the model converges. These updates are obtained after the clients conduct their distance calculations. [0328], [0345], and [0347]: Describes receiving partial/updated gradients or model parameters from each client after the locally determined training cycles (aggregation period) and before the next communication round.); and
combine the first set of updated model parameters and the second set of updated model parameters to yield a first combined set of updated model parameters (Akdeniz FIG. 13 and [0150]: Describes that multiple clients compute and send model updates to the server, and the server aggregates the different models from the clients to update the global machine learning model. This aggregation may include averaging, such as straight or weighted averaging of the model updates. This corresponds to combining sets of updated parameters from multiple clients to yield a combined set of updated parameters.).
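For illustration of the aggregation mapped from Akdeniz [0150] (straight or weighted averaging of the model updates), a minimal sketch; weighting by per-client sample counts is an assumption for the weighted case, and with equal counts the computation reduces to straight averaging:

    def aggregate(updates, counts):
        # Coordinate-wise weighted average of the clients' parameter sets;
        # `updates` is a list of per-client parameter lists and `counts`
        # holds the (assumed) per-client weights, e.g., local sample counts
        total = sum(counts)
        return [sum(c * u[i] for c, u in zip(counts, updates)) / total
                for i in range(len(updates[0]))]

    # e.g., aggregate([[0.25, 0.75], [0.75, 0.25]], [1, 1]) == [0.5, 0.5]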
Regarding claim 5, Akdeniz in view of Li, in view of Zeng further teaches the apparatus of claim 1, wherein the at least one processor is further configured to: send the first combined set of updated model parameters to a network entity for aggregation with a second combined set of updated model parameters, wherein the network entity is upstream from the apparatus (Akdeniz FIG. 13 and [0150]: Describes that multiple clients compute and send model updates to the server. The server receives these updates as the head device in the system and executes the steps of combining the parameters. This corresponds to combining sets of updated parameters at a network entity that is upstream.).
Regarding claim 6, Akdeniz in view of Li, in view of Zeng further teaches the apparatus of claim 1, wherein to combine the first set of updated model parameters and the second set of updated model parameters the at least one processor is further configured to:
average the first set of updated model parameters and the second set of updated model parameters (Akdeniz FIG. 13, and [0150]: Describes that “The central server 1360 uses the coded training data and label data to compute a model update (e.g., partial gradients) at 1312, and then aggregates the different model updates at 1314 to update the global ML model. The aggregation may include an averaging, e.g., a straight or weighted averaging, of the model updates,” which corresponds to averaging a first and second set of client-provided parameters.).
Regarding claim 7, Akdeniz in view of Li, in view of Zeng further teaches the apparatus of claim 1, wherein the first set of updated model parameters and the second set of updated model parameters are combined using a same shared channel (Akdeniz [0036]: Describes “exploit[ing] wireless for computation including over the air combining, and to promote multi-stage learning,” i.e., transmitting model updates over the air on a common channel so the server can receive the parameter sets before further processing.).
Regarding claim 8, Akdeniz in view of Li, in view of Zeng further teaches the apparatus of claim 1, wherein the input data corresponds to data that is not independently and identically distributed (Akdeniz [0388-0392]: Describes that “clients may have statistically differing datasets, i.e., not independent and identically distributed” and scheduling methods that cope with those non-IID client data distributions.).
Regarding claim 9, Akdeniz in view of Li, in view of Zeng further teaches the apparatus of claim 1, wherein the first set of updated model parameters and the second set of updated model parameters correspond to a single layer of a plurality of layers of the machine learning model (Akdeniz [0392-0397]: Describes that, “based on the norm of the local gradients computed with respect to the last layer,” the server selects uploads of layer-specific gradients, such that each client transmits only the parameter updates for one layer, that layer’s updates forming the first and second sets obtained by the server.).
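For illustration only, the norm-based, last-layer selection Akdeniz describes can be sketched as follows; the names are hypothetical editorial aids, under the assumption of an insertion-ordered dict of per-layer gradients with the output layer last:

    import numpy as np

    def last_layer_update(gradients: dict):
        """Return only the final layer's update plus its norm for server-side selection.

        gradients: insertion-ordered dict mapping layer names to gradient arrays,
        with the output layer last.
        """
        last_name = list(gradients)[-1]
        # The norm can drive which clients' uploads the server schedules.
        norm = float(np.linalg.norm(gradients[last_name]))
        return last_name, gradients[last_name], norm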
Regarding claim 10, Akdeniz in view of Li, in view of Zeng further teaches the apparatus of claim 1, wherein the at least one processor is further configured to:
update the machine learning model based on the first combined set of updated model parameters to yield a modified machine learning model (Akdeniz [0167]: Describes that after averaging client updates, the central server “aggregates the received weights to obtain a new global weight” and then shares the aggregated weights with the client nodes, so both server and clients replace previous model weights with the newly combined sets, corresponding to a modified machine learning model.).
Regarding claim 11, Akdeniz in view of Li, in view of Zeng further teaches the apparatus of claim 1, wherein the first data heterogeneity level is based on at least one of a variance and a dispersion among the respective updated model parameters received from the first client device and the second client device (Akdeniz [0343-0347]: Describes the heterogeneity metric as a variance/dispersion of the updated vectors received from different clients, corresponding to the heterogeneity level being based on the variance determined by the system.).
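For illustration only, a variance/dispersion-style heterogeneity measure over client updates can be sketched as follows; this is an editorial aid under the assumption of flattened, equally sized parameter vectors, and is not asserted to be the metric of record:

    import numpy as np

    def heterogeneity_level(client_updates: list) -> float:
        """Dispersion of updated parameter vectors across clients.

        client_updates: list of equally sized 1-D numpy arrays, one per client.
        """
        stacked = np.stack(client_updates)            # shape: (num_clients, num_params)
        # Per-parameter variance across clients, averaged over all parameters.
        return float(np.var(stacked, axis=0).mean())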
Regarding claims 12 and 16-22: these claims recite substantially the same limitations as claims 1 and 5-11. Claims 12 and 16-22 further recite a method (Akdeniz, [0082]: Describes a method to execute instructions.) to perform the system steps of claims 1 and 5-11, respectively, and are therefore rejected on the same premise.
Regarding claims 23 and 26: these claims recite substantially the same limitations as claims 1 and 5. Claims 23 and 26 further recite a computer-readable medium (Akdeniz, [0100]: Describes a computer-readable medium storing instructions to be executed.) to perform the system steps of claims 1 and 5, respectively, and are therefore rejected on the same premise.
Regarding claims 27 and 30: these claims recite substantially the same limitations as claims 1 and 7. Claims 27 and 30 further recite an apparatus for wireless communication (Akdeniz, [0036]: Describes wireless devices to transmit data.) to perform the system steps of claims 1 and 7, respectively, and are therefore rejected on the same premise.
Claim(s) 2-4, 13-15, 24-25 and 28-29 is/are rejected under 35 U.S.C. 103 as being unpatentable over Akdeniz et al. (US 20230068386 A1, referred to as Akdeniz) in view of Li et al. ("Federated optimization in heterogeneous networks.", referred to as Li), in view of Zeng et al. (“Local Epochs Inefficiency Caused by Device Heterogeneity in Federated Learning”, referred to as Zeng), and further in view of Prakash et al. (US 20190220703 A1, referred to as Prakash).
Regarding claim 2, Akdeniz in view of Li, in view of Zeng teaches the apparatus of claim 1, wherein the at least one processor is further configured to cause the apparatus to: determine that a second data heterogeneity level associated with input data for training the machine learning model is less than the first data heterogeneity level (Akdeniz [0217]: Describes measuring heterogeneous compute/communication characteristics and modeling them per client; [0320-0328]: Describes that the server classifies clients into sets and compares divergence (heterogeneity) values, selecting sets where divergence is lower than a previous (higher) level to decide scheduling.);
Although Akdeniz teaches determine that a second data heterogeneity level associated with input data for training the machine learning model is less than the first data heterogeneity level, it does not teach determine, based on the second data heterogeneity level, a second data aggregation period for training the machine learning model, wherein the second data aggregation period is greater than the first data aggregation period.
Prakash teaches determine, based on the second data heterogeneity level, a second data aggregation period for training the machine learning model, wherein the second data aggregation period is greater than the first data aggregation period ([0084]: Describes a predetermined epoch/aggregation time and transmitting a probability value tied to that epoch; [0085]: Describes that a load-allocation criterion uses the heterogeneity of compute/communication parameters to adjust the epoch time, lengthening the epoch when heterogeneity is lower so that more data can be aggregated.).
It would have been obvious to one of ordinary skill in the art at the time of the claimed invention to modify the system of Akdeniz with Prakash’s time selection logic. Doing so would allow the system to automatically lengthen the aggregation epoch whenever client heterogeneity is low, reducing communication rounds and idle wait times and speeding convergence without loss in accuracy.
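For illustration only, the period-lengthening logic described above can be sketched as follows; the names and the doubling factor are hypothetical editorial choices and are not asserted to be Prakash’s actual criterion:

    def next_aggregation_period(h_level: float, prev_h_level: float,
                                prev_period: int) -> int:
        """Choose the next data aggregation period (in training cycles).

        When the new heterogeneity level is lower than the previous one, the
        period is lengthened (here, doubled) so clients train longer between
        communication rounds; otherwise the previous period is kept.
        """
        return prev_period * 2 if h_level < prev_h_level else prev_period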
Regarding claim 3, Akdeniz in view of Li, in view of Zeng teaches the apparatus of claim 1, wherein the at least one processor is further configured to cause the apparatus to: send the first set of updated model parameters and the second set of updated model parameters to a network entity (Akdeniz [0216]: Describes that clients send partial/updated gradients (model parameters) to the controller for aggregation.);
Although Akdeniz teaches send the first set of updated model parameters and the second set of updated model parameters to a network entity, it does not teach receive a first decoded set of updated model parameters and a second decoded set of updated model parameters from the network entity.
Prakash teaches receive a first decoded set of updated model parameters and a second decoded set of updated model parameters from the network entity ([0221-0222]: Describes that a central server decodes coded updates and then distributes the resulting decoded model parameters back to the clients.).
It would have been obvious to one of ordinary skill in the art at the time of the claimed invention to modify the system of Akdeniz with Prakash’s decoding and distribution scheme. Doing so would apply Prakash’s probability-weighting framework to the sending and receiving of data, boosting reliability over variable links and aligning the parameter flow with standard 5G RAN splits with predictable performance gains.
Regarding claim 4, Akdeniz in view of Li, in view of Zeng, in view of Prakash teaches the apparatus of claim 3, Prakash further teaches wherein the apparatus corresponds to a radio unit (RU) and the network entity corresponds to a distributed unit (DU) ([0087-0088]: Describes edge compute nodes acting as radio units and identifies distributed units as upstream aggregation points in a RAN split.).
Regarding claims 13-15: these claims recite substantially the same limitations as claims 2-4. Claims 13-15 further recite a method (Akdeniz, [0082], and Prakash, [0020]: Both describe a method to execute instructions.) to perform the system steps of claims 2-4, respectively, and are therefore rejected on the same premise.
Regarding claims 24 and 25: these claims recite substantially the same limitations as claims 2 and 3. Claims 24 and 25 further recite a computer-readable medium (Akdeniz, [0100], and Prakash, [0076]: Both describe a computer-readable medium storing instructions to be executed.) to perform the system steps of claims 2 and 3, respectively, and are therefore rejected on the same premise.
Regarding claims 28 and 29: these claims recite substantially the same limitations as claims 2 and 3. Claims 28 and 29 further recite an apparatus for wireless communication (Akdeniz, [0036], and Prakash, [0019]: Both describe wireless devices to transmit data.) to perform the system steps of claims 2 and 3, respectively, and are therefore rejected on the same premise.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See the attached PTO-892 for additional references, including:
US 20220129706 A1: heterogeneous federated learning
US 20230016827 A1: federated learning, aggregation updated weights from devices
US 11715044 B2: Heterogeneity metrics
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DONALD T RODEN whose telephone number is (571)272-6441. The examiner can normally be reached Mon-Thur 8:00-5:00 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Omar Fernandez Rivas can be reached at (571) 272-2589. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/D.T.R./Examiner, Art Unit 2128
/OMAR F FERNANDEZ RIVAS/Supervisory Patent Examiner, Art Unit 2128