Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-4, 6-11, and 13-20 are rejected under 35 U.S.C. 103 as being unpatentable over Lo et al. (U.S. Pub. No. 2023/0006913 A1) in view of Mu et al. (U.S. Pub. No. 2024/0276247 A1).
Regarding claim 1, Lo teaches an apparatus comprising at least one processor and at least one memory including computer code for one or more programs, the at least one memory and the computer code configured, with the at least one processor [par 0102, The memory 290 is coupled to the controller/processor 288. Part of the memory 290 could include a RAM, and another part of the memory 290 could include a Flash memory or other ROM],
to cause the apparatus at least to: receive, from user equipment, user equipment capability information [par 0006, 0120, UE capability for support of machine-learning (ML) based channel environment classification may be reported by a user equipment to a base station. FIG. 6 shows an example flowchart for BS operation to support AI/ML techniques for UL channel coding according to embodiments of the present disclosure. FIG. 6 is an example of a method 600 for operations at BS side to support AI/ML techniques for UL channel coding. At operation 601, a BS receives the UE capability information from a UE, including the support of AI/ML approach for channel coding. At operation 602, the BS sends configuration information to UE, which can include configuration information related to AI/ML encoder such as AI/ML model be used];
compute statistics for a set of user equipment capability classes based on the user equipment capability information [para 0153, 0159, shows the UE reports capability information to the BS, including the support of an ML approach for channel environment classification. At operation 1402, the UE receives configuration information from the BS, which can include ML-related configuration information such as enabling/disabling of an ML approach for channel environment classification, an ML model to be used, trained model parameters, and/or whether model parameter updates reported by the UE will be used or not; this information will be described. At operation 1404, the UE receives configuration information from the BS. In one example, a configuration message can include an index to a pre-defined lookup table for a transmission mode. In another example, a configuration message can include an index to a pre-defined lookup table for an RS pattern. In yet another example, information can consist of a BS handover command, a recommendation for transmission mode, a recommendation for scheduled time/frequency resource, a recommendation for MIMO beamforming adjustment from the current served beams, etc., along with the feedback on the inferred channel environment classification]
to train a model generic for at least a subset of user equipment capability classes selected amongst the set of user equipment capability classes based on the statistics for a set of user equipment capability classes [par 0152, claim 15, the BS performs model training, or receives model parameters from a network entity. In one embodiment, model training can be performed at the BS. Alternatively, model training can be performed at another network entity (e.g., the O-RAN defined RAN Intelligent Controller), and trained model parameters can be sent to the BS. In yet another embodiment, model training can be performed offline (e.g., model training is performed outside of the network), and the BS may receive the trained model parameters from a network entity or may have them stored in memory inside the BS. The processor is configured to perform model training based on the configuration and a received information on channel environment determined by the UE].
Lo fails to show causing a network management function to train models specific for at least a subset of user equipment capability classes selected amongst the set of user equipment capability classes based on the statistics for the set of user equipment capability classes.
In an analogous art, Mu shows causing a network management function to train models specific for at least a subset of user equipment capability classes selected amongst the set of user equipment capability classes based on the statistics for the set of user equipment capability classes [par 0059, The terminal initiates an analysis subscription request, and the gNB-CU generates model subscription request information based on its own AI processing capability and the analysis subscription request information and sending it to the OAM. According to the model subscription request, the OAM network function initiates a training supplementary data subscription request to the gNB-CU, and relevant network element(s) collects and processes data and uploads the data to the OAM].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo and Mu because Mu provides a model data management method, so that the wireless network architecture supporting AI has higher stability and efficiency in the mobile terminal scenario [Mu, par 0054].
Regarding claim 2, Lo and Mu provide the apparatus of claim 1, wherein the user equipment capability information comprises at least one of: a memory space available at the user equipment to store a model; a number of bits used by the user equipment to represent a model parameter; or a number of trainable model parameters used by the user equipment [Lo, par 0106, 0138, Table 2, The UE 116 also includes a speaker 306, a controller or processor 307, an input/output (I/O) interface (IF) 308, a touchscreen display 310, and a memory 311. The memory 311 includes an OS 312 and one or more applications 313. The configuration information can include the model parameters of ML algorithms. In another embodiment, the parameters of the ML model can be either directly sent or indicated through the index in a predefined table. For example, there can be K predefined operation modes, where each mode corresponding to certain operation/use case (e.g., the physical channel and block sizes combination) with certain ML model].
Regarding claim 3, Lo and Mu disclose the apparatus of claim 1, wherein each user equipment capability class of the set of user equipment capability classes comprises a different combination of user equipment capability information [Lo, par 0007, 0145, The transceiver is configured to transmit a report of UE capability for support of machine-learning (ML) based channel environment classification, where the channel environment classification classifies a channel environment of a channel between the UE and a base station based on one or more of UE speed or Doppler spread, UE trajectory, frequency selectivity or delay spread, coherence bandwidth, coherence time, radio resource management (RRM) metrics, block error rate, throughput, or UE acceleration. In one embodiment, a channel environment can be classified in terms of UE speed (or, similarly, Doppler spread) and/or frequency selectivity (or, similarly, delay spread). In another embodiment, a channel environment can be classified in terms of coherence bandwidth and/or coherence time. As an example, one possible categorization can be done for four classes comprising of low Doppler spread-low delay spread class, high Doppler spread-low delay spread case, low Doppler spread-high delay spread case, and high Doppler spread-high delay spread case].
Regarding claim 4, Lo and Mu disclose the apparatus of claim 1, wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: identify the set of user equipment capability classes based on the user equipment capability information [Lo, par 0148, 0153, the BS receives UE capability information from the UE, including the support of an ML approach for channel environment classification. At operation 10102, the BS sends configuration information to the UE, which can include ML-related configuration information such as enabling/disabling of an ML approach for channel environment classification, an ML model to be used, trained model parameters. At operation 1401, the UE reports capability information to the BS, including the support of an ML approach for channel environment classification. At operation 1402, the UE receives configuration information from the BS, which can include ML-related configuration information such as enabling/disabling of an ML approach for channel environment classification, an ML model to be used, trained model parameters];
identify the set of user equipment capability classes based on user equipment capability classes predefined at the apparatus [par 0136, In one embodiment, the configuration information can include which AI/ML model to be used for certain operation/use case. For example, there can be M predefined ML models, with index 1, 2, . . . , M corresponding to one ML model defined by the configuration of the encoder and/or decoder];
Lo fails to show receiving, from the network management function, the set of user equipment capability classes identified by the network management function based on previous statistics for a previous set of user equipment classes sent by the apparatus to the network management function; or receiving, from the network management function, the set of user equipment capability classes identified by the network management function based on user equipment capability classes predefined at the network management function.
In an analogous art, Mu shows receiving, from the network management function, the set of user equipment capability classes identified by the network management function based on previous statistics for a previous set of user equipment classes sent by the apparatus to the network management function; or receiving, from the network management function, the set of user equipment capability classes identified by the network management function based on user equipment capability classes predefined at the network management function [par 0106, The control radio access network device collects and processes the local training data of the control radio access network device, combines the local training data and the terminal training data, determines the model training supplementary data, and uploads the model training supplementary data to the OAM. The OAM collects and processes local training data of the OAM, and uses the local training data of the OAM and the model training supplementary data as the model training data. The OAM continues the model training by using the model training data, obtains a model that meets the model subscription request, and sends it to the control radio access network device].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo and Mu because Mu provides a model data management method, so that the wireless network architecture supporting AI has higher stability and efficiency in the mobile terminal scenario [Mu, par 0054].
Regarding claim 6, Lo and Mu illustrate the apparatus of claim 1. Lo fails to show wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: receive, from the network management function, a request to compute the statistics for the set of user equipment capability classes based on the user equipment capability information.
In an analogous art, Mu shows wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: receive, from the network management function, a request to compute the statistics for the set of user equipment capability classes based on the user equipment capability information [par 0064, 0078, The OAM initiates a training supplementary data subscription request to the gNB-CU newly accessed by the terminal, and relevant network element(s) collects and processes data and uploads it to the OAM. The OAM continues the model training by using the local training data and the training supplementary data to obtain a model that meets the model subscription request, and sends it to the gNB-CU newly accessed by the terminal. The OAM initiating a training supplementary data subscription request to the gNB-CU, the gNB-CU initiating a training supplementary data subscription request to the gNB-DU, the gNB-DU collecting the training data and sending the training supplementary data to the gNB-CU, the gNB-CU collecting and processing the local training data].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo and Mu because Mu provides a model data management method, so that the wireless network architecture supporting AI has higher stability and efficiency in the mobile terminal scenario [Mu, par 0054].
Regarding claim 7, Lo and Mu illustrate the apparatus of claim 1. Lo fails to show wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: send, to a network management function, the statistics for the set of user equipment capability classes to cause the network management function to select at least the subset of user equipment capability classes amongst the set of user equipment capability classes based on the statistics for the set of user equipment capability classes and to train the models specific for at least the subset of user equipment capability classes or the model generic for at least the subset of user equipment capability classes.
In an analogous art, Mu shows wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: send, to a network management function, the statistics for the set of user equipment capability classes to cause the network management function to select at least the subset of user equipment capability classes amongst the set of user equipment capability classes based on the statistics for the set of user equipment capability classes [par 0059, 0088, The terminal initiates an analysis subscription request, and the gNB-CU generates model subscription request information based on its own AI processing capability and the analysis subscription request information and sending it to the OAM. The gNB-CU generates model subscription request information according to its own AI processing capability and the analysis subscription request information. 3. The gNB-CU sends the model subscription request signaling to the OAM, where the content indicated by the signaling is to initiate a model subscription request to the OAM. 4. The OAM performs an initial model selection according to the model subscription request information to select a model to be trained that meets the analysis subscription request],
and to train the models specific for at least the subset of user equipment capability classes or the model generic for at least the subset of user equipment capability classes [par 0059, According to the model subscription request, the OAM initiates a training supplementary data subscription request to the gNB-CU, and relevant network element(s) collects and processes data and uploads the data to the OAM. The OAM performs model training by using local training data and the training supplementary data to obtain a model that meets the model subscription request].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo and Mu because Mu provides a model data management method, so that the wireless network architecture supporting AI has higher stability and efficiency in the mobile terminal scenario [Mu, par 0054].
Regarding claim 8, Lo and Mu disclose the apparatus of claim 1, wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: select at least the subset of user equipment capability classes amongst the set of user equipment capability classes based on the statistics for the set of user equipment capability classes [par 0148, FIG. 11 is an example of a method 1100 for operations at a BS to support AI/ML techniques for channel environment classification where a UE sends information on a particular channel environment. At operation 1101, the BS receives UE capability information from the UE, including the support of an ML approach for channel environment classification. At operation 10102, the BS sends configuration information to the UE, which can include ML-related configuration information such as enabling/disabling of an ML approach for channel environment classification, an ML model to be used, trained model parameters, and/or whether model parameter updates reported by the UE will be used or not; this information will be described below];
Lo fails to show sending, to the network management function, at least the subset of user equipment capability classes to cause the network management function to train the models specific for at least the subset of user equipment capability classes or the model generic for at least part of the subset of user equipment capability classes.
In an analogous art, Mu shows sending, to the network management function, at least the subset of user equipment capability classes to cause the network management function to train the models specific for at least the subset of user equipment capability classes or the model generic for at least part of the subset of user equipment capability classes [par 0059, 0088, The terminal initiates an analysis subscription request, and the gNB-CU generates model subscription request information based on its own AI processing capability and the analysis subscription request information and sending it to the OAM. The gNB-CU generates model subscription request information according to its own AI processing capability and the analysis subscription request information. 3. The gNB-CU sends the model subscription request signaling to the OAM, where the content indicated by the signaling is to initiate a model subscription request to the OAM. 4. The OAM performs an initial model selection according to the model subscription request information to select a model to be trained that meets the analysis subscription request].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo and Mu because Mu provides a model data management method, so that the wireless network architecture supporting AI has higher stability and efficiency in the mobile terminal scenario [Mu, par 0054].
Regarding claim 9, Lo and Mu provide the apparatus of claim 1. Lo fails to show wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: subscribe, with the network management function, to a notification that the models specific for at least the subset of user equipment capability classes are available or a notification that the model generic for at least the subset of user equipment capability classes is available; and receive, from the network management function, the notification that the models specific for at least the subset of user equipment capability classes are available or the notification that the model generic for at least the subset of user equipment capability classes is available.
In an analogous art, Mu shows wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: subscribe, with the network management function [par 0059, The terminal initiates an analysis subscription request, and the gNB-CU generates model subscription request information based on its own AI processing capability and the analysis subscription request information and sending it to the OAM],
to a notification that the models specific for at least the subset of user equipment capability classes are available or a notification that the model generic for at least the subset of user equipment capability classes is available [par 0059, 0088, The terminal initiates an analysis subscription request, and the gNB-CU generates model subscription request information based on its own AI processing capability and the analysis subscription request information and sending it to the OAM. The gNB-CU generates model subscription request information according to its own AI processing capability and the analysis subscription request information. 3. The gNB-CU sends the model subscription request signaling to the OAM, where the content indicated by the signaling is to initiate a model subscription request to the OAM. 4. The OAM performs an initial model selection according to the model subscription request information to select a model to be trained that meets the analysis subscription request];
and receive, from the network management function, the notification that the models specific for at least the subset of user equipment capability classes are available or the notification that the model generic for at least the subset of user equipment capability classes is available [par 0088, The OAM collects and processes the local training data and the uploaded training supplementary data as model training data. 8. The OAM performs model training by using the model training data, and obtains a model that satisfies the model subscription request information. 9. The OAM sends the model to the gNB-CU. 10. The gNB-CU sends the model inference data subscription request signaling to the gNB-DU, where the content indicated by the signaling is to initiate the model inference data subscription request to the gNB-DU].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo and Mu because Mu provides a model data management method, so that the wireless network architecture supporting AI has higher stability and efficiency in the mobile terminal scenario [Mu, par 0054].
Regarding claim 10, Lo and Mu provide the apparatus of claim 1. Lo fails to show wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: send, to the network management function, a request to receive the models specific for at least the subset of user equipment capability classes or the model generic for at least the subset of user equipment capability classes.
In an analogous art, Mu shows wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: send, to the network management function, a request to receive the models specific for at least the subset of user equipment capability classes or the model generic for at least the subset of user equipment capability classes [par 0062, the gNB-CU resends the model subscription request to the OAM, and the OAM updates the analysis subscription request based on the information reported by the gNB-CU. The OAM re-initiates the training supplementary data subscription request, and the relevant network element(s) collects and processes the data and uploads it to the OAM. The OAM continues the model training by using the local training data and the training supplementary data to obtain a model that meets the model subscription request].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo and Mu because Mu provides a model data management method, so that the wireless network architecture supporting AI has higher stability and efficiency in the mobile terminal scenario [Mu, par 0054].
Regarding claim 11, Lo and Mu describe the apparatus of claim 1. Lo fails to show wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: receive, from the network management function, the models specific for at least the subset of user equipment capability classes or the model generic for at least the subset of user equipment capability classes.
In an analogous art, Mu shows wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: receive, from the network management function, the models specific for at least the subset of user equipment capability classes or the model generic for at least the subset of user equipment capability classes [par 0088, The gNB-CU sends the model performance data and the terminal performance feedback data to the OAM. 20. The OAM performs training optimization on the model by using the model performance data and the performance feedback data. 21. The OAM sends the updated model parameter to the gNB-CU].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo and Mu because Mu provides a model data management method, so that the wireless network architecture supporting AI has higher stability and efficiency in the mobile terminal scenario [Mu, par 0054].
Regarding claim 13, Lo discloses the apparatus of claim 12. Lo fails to show wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: receive, from a base station, previous statistics for a previous set of user equipment classes; identify the set of user equipment capability classes based on the previous statistics for the previous set of user equipment classes; and send, to the base station, the set of user equipment capability classes.
In an analogous art, Mu shows wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: receive, from a base station [par 0059, The terminal initiates an analysis subscription request, and the gNB-CU generates model subscription request information based on its own AI processing capability and the analysis subscription request information and sending it to the OAM].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo and Mu because Mu provides a model data management method, so that the wireless network architecture supporting AI has higher stability and efficiency in the mobile terminal scenario [Mu, par 0054].
Mu fails to show sending previous statistics for a previous set of user equipment classes, or identifying the set of user equipment capability classes based on the previous statistics for the previous set of user equipment classes.
In an analogous art, Lo shows sending previous statistics for a previous set of user equipment classes and identifying the set of user equipment capability classes based on the previous statistics for the previous set of user equipment classes [par 0172, CSI-Reportconfig IE may include an additional field chan-env-classifier as illustrated in TABLE 8 that configures a particular classifier from pre-determined set of K classifiers].
Regarding claim 14, Lo illustrates the apparatus of claim 12, wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: identify the set of user equipment capability classes based on user equipment capability classes predefined at the apparatus [par 0006, 0145-0147, When ML based channel environment classification is enabled, UE assistance information for ML based channel environment classification, and/or an indication of the channel environment (which may be a pre-defined channel environment associated with a lookup table), A channel environment can be classified in terms of UE speed (or, similarly, Doppler spread) and/or frequency selectivity (or, similarly, delay spread). In another embodiment, a channel environment can be classified in terms of coherence bandwidth and/or coherence time. Multiple channel environment classes can be defined with the above-mentioned attributes or in conjunction with other parameters such as RRM metrics, such as RSRP, RSRQ, and SINR. the framework for supporting AI/ML techniques for channel environment classification can include model training at a UE or a network entity or outside of the network (e.g., via offline training)];
Lo fails to show sending, to a base station, the set of user equipment capability classes.
In an analogous art, Mu shows sending, to a base station, the set of user equipment capability classes [par 0059, The OAM performs model training by using local training data and the training supplementary data to obtain a model that meets the model subscription request, and sends the training model to the gNB-CU. The gNB-CU initiates a model inference data subscription request, and relevant network element(s) collects and processes data and uploads it to the gNB-CU].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo and Mu because Mu provides a model data management method, so that the wireless network architecture supporting AI has higher stability and efficiency in the mobile terminal scenario [Mu, par 0054].
Regarding claim 15, Lo provides the apparatus of claim 12. Lo fails to show wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: send, to a base station, a request to compute statistics for the set of user equipment capability classes based on user equipment capability information received by the base station from user equipment.
In an analogous art, Mu shows wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: send, to a base station, a request to compute statistics for the set of user equipment capability classes based on user equipment capability information received by the base station from user equipment [par 0064, 0078, The OAM initiates a training supplementary data subscription request to the gNB-CU newly accessed by the terminal, and relevant network element(s) collects and processes data and uploads it to the OAM. The OAM continues the model training by using the local training data and the training supplementary data to obtain a model that meets the model subscription request, and sends it to the gNB-CU newly accessed by the terminal. The OAM initiating a training supplementary data subscription request to the gNB-CU, the gNB-CU initiating a training supplementary data subscription request to the gNB-DU, the gNB-DU collecting the training data and sending the training supplementary data to the gNB-CU, the gNB-CU collecting and processing the local training data].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo and Mu because Mu provides a model data management method, so that the wireless network architecture supporting AI has higher stability and efficiency in the mobile terminal scenario [Mu, par 0054].
Regarding claim 16, Lo demonstrates the apparatus of claim 12. Lo fails to show wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: receive, from a base station, and train the models specific for at least the subset of user equipment capability classes or the model generic for at least the subset of user equipment capability classes.
In an analogous art, Mu shows wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: receive, from a base station [par 0059, The gNB-CU collects and processes the model performance data and the terminal performance feedback data and reports them to the OAM. The OAM performs training optimization on the model, and sends the updated model to the gNB-CU],
and train the models specific for at least the subset of user equipment capability classes or the model generic for at least the subset of user equipment capability classes [par 0059, According to the model subscription request, the OAM initiates a training supplementary data subscription request to the gNB-CU, and relevant network element(s) collects and processes data and uploads the data to the OAM. The OAM performs model training by using local training data and the training supplementary data to obtain a model that meets the model subscription request].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo and Mu because this provides a model data management method, so that the wireless network architecture supporting AI has higher stability and efficiency in the mobile terminal scenario [Mu par 0054].
Lo discloses the statistics for the set of user equipment capability classes, and the selection of at least the subset of user equipment capability classes amongst the set of user equipment capability classes based on the statistics for the set of user equipment capability classes [par 0148, FIG. 11 is an example of a method 1100 for operations at a BS to support AI/ML techniques for channel environment classification where a UE sends information on a particular channel environment. At operation 1101, the BS receives UE capability information from the UE, including the support of an ML approach for channel environment classification. At operation 1102, the BS sends configuration information to the UE, which can include ML-related configuration information such as enabling/disabling of an ML approach for channel environment classification, an ML model to be used, trained model parameters, and/or whether model parameter updates reported by the UE will be used or not; this information will be described below].
17. Lo reveals the apparatus of claim 12. Lo fails to show wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: receive, from a base station, and train the models specific for at least the subset of user equipment capability classes or the model generic for at least the subset of user equipment capability classes.
In an analogous art, Mu shows wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: receive, from a base station [par 0059, The gNB-CU collects and processes the model performance data and the terminal performance feedback data and reports them to the OAM. The OAM performs training optimization on the model, and sends the updated model to the gNB-CU],
and train the models specific for at least the subset of user equipment capability classes or the model generic for at least the subset of user equipment capability classes [par 0059, According to the model subscription request, the OAM initiates a training supplementary data subscription request to the gNB-CU, and relevant network element(s) collects and processes data and uploads the data to the OAM. The OAM performs model training by using local training data and the training supplementary data to obtain a model that meets the model subscription request].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo and Mu because this provides a model data management method, so that the wireless network architecture supporting AI has higher stability and efficiency in the mobile terminal scenario [Mu par 0054].
Lo shows at least the subset of user equipment capability classes selected amongst the set of user equipment capability classes by the base station based on the statistics for the set of user equipment capability classes [abstract, UE capability for support of machine-learning (ML) based channel environment classification may be reported by a user equipment to a base station, where the channel environment classification classifies a channel environment of a channel between the UE and a base station based on one or more of UE speed or Doppler spread, UE trajectory, frequency selectivity or delay spread, coherence bandwidth, coherence time, radio resource management (RRM) metrics, block error rate, throughput, or UE acceleration].
18. Lo discloses the apparatus of claim 12. Lo fails to show wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: send, to a base station, a notification that the models specific for at least the subset of user equipment capability classes are available or the notification that the model generic for at least the subset of user equipment capability classes is available.
In an analogous art, Mu shows wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: send, to a base station, a notification that the models specific for at least the subset of user equipment capability classes are available or the notification that the model generic for at least the subset of user equipment capability classes is available [par 0065, The OAM initiates a training supplementary data subscription request to the gNB-CU newly accessed by the terminal, and relevant network element(s) collects and processes data and uploads it to the OAM. The OAM continues the model training by using the local training data and the training supplementary data to obtain a model that meets the model subscription request, and sends it to the gNB-CU newly accessed by the terminal].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo and Mu because this provides a model data management method, so that the wireless network architecture supporting AI has higher stability and efficiency in the mobile terminal scenario [Mu par 0054].
19. Lo provides the apparatus of claim 12. Lo fails to show wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: receive, from a base station, a request to send the models specific for at least the subset of user equipment capability classes or the model generic for at least the subset of user equipment capability classes.
In an analogous art, Mu shows wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: receive, from a base station, a request to send the models specific for at least the subset of user equipment capability classes or the model generic for at least the subset of user equipment capability classes [par 0062, the gNB-CU resends the model subscription request to the OAM, and the OAM updates the analysis subscription request based on the information reported by the gNB-CU. The OAM re-initiates the training supplementary data subscription request, and the relevant network element(s) collects and processes the data and uploads it to the OAM. The OAM continues the model training by using the local training data and the training supplementary data to obtain a model that meets the model subscription request].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo and Mu because this provides a model data management method, so that the wireless network architecture supporting AI has higher stability and efficiency in the mobile terminal scenario [Mu par 0054].
20. Lo discloses the apparatus of claim 12. Lo fails to show wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: send, to a base station, the models specific for at least the subset of user equipment capability classes or the model generic for at least the subset of user equipment capability classes.
In an analogous art, Mu shows wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: send, to a base station, the models specific for at least the subset of user equipment capability classes or the model generic for at least the subset of user equipment capability classes [par 0062, The OAM continues the model training by using the local training data and the training supplementary data to obtain a model that meets the model subscription request, and sends it to the gNB-CU].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo and Mu because this provides a model data management method, so that the wireless network architecture supporting AI has higher stability and efficiency in the mobile terminal scenario [Mu par 0054].
Claim(s) 5 is/are rejected under 35 U.S.C. 103 as being unpatentable over Lo et al. (U.S. Pub No. 2023/0006913 A1) in view of MU et al. (U.S. Pub No. 2024/0276247 A1), and further in view of Polehn et al. (U.S. Pub No. 2022/0272567 A1).
5. Lo and Mu provide the apparatus of claim 1. Lo and Mu fail to show wherein the statistics for the set of user equipment capability classes comprises a number of user equipment connected to the apparatus for each user equipment capability class of the set of user equipment capability classes.
In an analogous art, Polehn shows wherein the statistics for the set of user equipment capability classes comprises a number of user equipment connected to the apparatus for each user equipment capability class of the set of user equipment capability classes [par 0024, In some embodiments, the RAN metrics may include congestion metrics, such as a quantity of UEs connected to base station 103, a measure of the amount or proportion of RF resources (e.g., PRBs) associated with base station 103 that are utilized and/or available].
Before the effective filing date, it would have been obvious to one of ordinary skill in the art to combine the teachings of Lo, Mu, and Polehn because the latency sensitivity scores for certain traffic may be determined such that a yield associated with delivery of the traffic is optimized, where such yield may be based on performance metrics [Polehn par 0023].
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 12 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Lo et al. (U.S. Pub No. 2023/0006913 A1).
12. Lo teaches an apparatus comprising at least one processor and at least one memory including computer code for one or more programs, the at least one memory and the computer code configured [par 0102, The memory 290 is coupled to the controller/processor 288. Part of the memory 290 could include a RAM, and another part of the memory 290 could include a Flash memory or other ROM],
with the at least one processor, to cause the apparatus at least to: train models specific for at least a subset of user equipment capability classes selected amongst a set of user equipment capability classes based on statistics for the set of user equipment classes or train a model generic for at least a subset of user equipment capability classes selected amongst a set of user equipment capability classes based on statistics for the set of user equipment classes [par 0152, claim 15, the BS performs model training, or receives model parameters from a network entity. In one embodiment, model training can be performed at the BS. Alternatively, model training can be performed at another network entity (e.g., the O-RAN defined RAN Intelligent Controller), and trained model parameters can be sent to the BS. In yet another embodiment, model training can be performed offline (e.g., model training is performed outside of the network), and the BS may receive the trained model parameters from a network entity or may have them stored in memory inside the BS. The processor is configured to perform model training based on the configuration and a received information on channel environment determined by the UE].
Response to Arguments
As discussed below, Applicant respectfully submits that the cited references fail to disclose or suggest all the elements of the present claims, and therefore do not provide the novel and unobvious advantages noted above, and noted throughout the present application.
However, Applicant respectfully submits that Lo and Mu, individually or in combination, fail to disclose or suggest, at least, "compute statistics for a set of user equipment capability classes based on the user equipment capability information," and "cause a network management function to train models specific for at least a subset of user equipment capability classes selected amongst the set of user equipment capability classes based on the statistics for a set of user equipment capability or to train a model generic for at least a subset of user equipment capability classes selected amongst the set of user equipment capability classes based on the statistics for a set of user equipment capability classes," as recited in claim 1.
Claim 12 has its own unique scope, and recites certain elements that are similar to those of claim 1 highlighted above. Thus, the deficiencies of Lo and Mu as to claim 1 are also applicable to claim 12, and all the claims dependent upon claims 1 and 12.
Lo is silent as to determining the statistics based on the UE capability information. Mu focuses on a model data management method and a model data management apparatus, and is also silent as to the elements of the claimed computation. Thus, Mu fails to cure this deficiency of Lo.
Mu fails to disclose or suggest performing the model training based on statistics for a set of UE capability classes. Thus, contrary to the Office Action's position, Mu is silent as to claimed model trainings, and does not cure this deficiency in Lo.
In other words, Lo does not disclose or suggest that the model training is performed based on statistics for a set of UE classes. Thus, contrary to the Office Action's position, Lo fails to disclose or suggest an apparatus that performs the claimed model trainings. For at least the reasons presented above, Mu also fails to disclose or suggest the claimed model trainings, and therefore does not cure this deficiency in Lo. For at least the above reasons, Applicant respectfully submits that a combination of Lo and Mu would not have rendered claims 1-4, 6-11, and 13-20 obvious to one of ordinary skill in the art, and Lo fails to anticipate all the elements of claim 12. Accordingly, reconsideration and withdrawal of the rejections is respectfully requested.
Polehn discloses systems and methods for rate control of ingress traffic in a radio access network, and focuses on rate-controlled forwarding of traffic. However, Polehn does not cure the above-discussed deficiencies of Lo and Mu as to claim 1, upon which claim 5 is dependent. Thus, Lo, Mu, and Polehn, individually or in any combination, would not have rendered claim 5 obvious to one of ordinary skill in the art. Accordingly, reconsideration and withdrawal of the rejection is respectfully requested.
The examiner respectfully disagrees. Lo, in paragraphs 0153 and 0159, shows that the UE reports capability information to the BS, including the support of an ML approach for channel environment classification. At operation 1402, the UE receives configuration information from the BS, which can include ML-related configuration information such as enabling/disabling of an ML approach for channel environment classification, an ML model to be used, trained model parameters, and/or whether model parameter updates reported by the UE will be used or not. At operation 1404, the UE receives configuration information from the BS. In one example, a configuration message can include an index to a pre-defined lookup table for a transmission mode. In another example, a configuration message can include an index to a pre-defined lookup table for an RS pattern. In yet another example, the information can consist of a BS handover command, a recommendation for transmission mode, a recommendation for scheduled time/frequency resource, a recommendation for MIMO beamforming adjustment from the current served beams, etc., along with the feedback on the inferred channel environment classification.
These paragraphs show that, based upon receipt of the capability information from the UE, the base station sends the UE configuration information such as enabling/disabling of an ML approach for channel environment classification, an ML model to be used, trained model parameters, and/or whether model parameter updates reported by the UE will be used or not. Under the examiner's interpretation, this configuration information, sent by the BS in response to the UE capability information, constitutes determining the statistics based on the UE capability information: the configuration is shown to include an index to a pre-defined lookup table for a transmission mode, an index to a pre-defined lookup table for an RS pattern, a recommendation for transmission mode, a recommendation for scheduled time/frequency resource, and a recommendation for MIMO beamforming adjustment.
In claim 15, Lo shows that the base station comprises a processor and a transceiver operatively coupled to the processor, where the processor is configured to perform model training based on the configuration and received information on the channel environment determined by the UE.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JASON A HARLEY whose telephone number is (571)270-5435. The examiner can normally be reached 7:30-3:00 and 6:30-8:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Marcus Smith, can be reached at (571) 270-1096. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JASON A HARLEY/Examiner, Art Unit 2468
/MARCUS SMITH/Supervisory Patent Examiner, Art Unit 2468