DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 22-41 is/are rejected under 35 U.S.C. 103 as being unpatentable over Larish et al., US PGPUB No. 20190149425 A1, hereinafter Larish, and further in view of O’Shea et al., NPL: An Introduction to Deep Learning for the Physical Layer, hereinafter O’Shea.
Regarding claim 22, Larish discloses a system (Larish; a system [¶ 0038, ¶ 0059, and ¶ 0068], as illustrated within Fig. 1 and Fig. 2) comprising:
one or more processors (Larish; the system, as addressed above, comprises one or more processors [¶ 0060 and ¶ 0066], as illustrated within Fig. 2); and
one or more storage devices on which are stored instructions that, when executed, are configured to cause the one or more processors to perform operations (Larish; one or more storage devices on which are stored instructions that are configured to cause the one or more processors to perform operations when executed [¶ 0060-0061 and ¶ 0066], as illustrated within Fig. 2) comprising:
generating, using a generator machine learning network, data representing a signal for transmission over a wireless communications channel (Larish; generating data representing a signal for transmission over a wireless communications channel using a generator machine learning network [¶ 0019-0020, ¶ 0085-0086, and ¶ 0091], as illustrated within Fig. 5B and Fig. 6A; moreover, one or more processed (input and output) signals correspond to 1st and 2nd information [¶ 0091-0094]), wherein the generator machine learning network is trained using a discriminator machine learning network and an optimizer based on signals transmitted over the wireless communications channel (Larish; the generator machine learning network is implicitly trained (given a ML environment, e.g. GAN) using a discriminator machine learning network and an optimizer based on signals transmitted over the wireless communications channel [¶ 0077-0079 and ¶ 0085-0086]; wherein, one or more generators are configured to accept input [¶ 0090-0091 and ¶ 0093], as illustrated within Fig. 5B and Fig. 6A; moreover, a training set of input is utilized to train the GAN [¶ 0077-0080]);
performing one or more encodings of the data to generate a radio frequency signal (Larish; performing one or more implicit encodings (given GAN) of the data to generate a radio frequency signal [¶ 0085-0086 and ¶ 0091], as illustrated within Fig. 5B and Fig. 6A; moreover, the communication between one or more generators and the discriminator [¶ 0092-0094]);
transmitting the radio frequency signal over the wireless communications channel (Larish; transmitting the radio frequency signal over the wireless communications channel [¶ 0085-0086 and ¶ 0091], as illustrated within Fig. 5B and Fig. 6A);
receiving one or more communication metrics based on transmitting the radio frequency signal over the wireless communications channel (Larish; receiving one or more communication metrics based on transmitting the radio frequency signal over the wireless communications channel [¶ 0085-0086 and ¶ 0093-0094]); and
updating the generator machine learning network based on the received one or more communication metrics (Larish; updating the generator machine learning network [¶ 0085-0086] based on the received one or more communication metrics [¶ 0091 and ¶ 0093-0094]).
Larish fails to explicitly disclose training the generator machine learning network using a discriminator machine learning network.
However, O’Shea teaches machine-learning training using a generator and discriminator (O’Shea; ML training using a generator/encoder and a discriminator/decoder [Page 4, Col. 2, ¶ 2 to Page 5, Col. 2, ¶ 1], as illustrated within Fig. 2; wherein radio communication signals are generated [id.]).
Larish and O’Shea are considered to be analogous art because both pertain to generating and/or managing data in relation with utilizing a machine learning model, wherein one or more computerized units process data through a neural network.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Larish, to incorporate machine-learning training using a generator and discriminator (as taught by O’Shea), in order to provide optimized processing of information in relation with data transfer (O’Shea; [Abstract, Page 1, Col. 1, ¶ 1 to Page 1, Col. 1, ¶ 2, and Page 11, Col. 2, ¶ 2]).
Regarding claim 23, Larish in view of O’Shea further discloses the system of claim 22, wherein receiving the one or more communication metrics based on transmitting the radio frequency signal over the wireless communications channel (Larish; [as addressed within parent claim(s)]) comprises:
receiving data indicating at least one of construction error, power consumption, or delay corresponding to transmitting the radio frequency signal over the wireless communications channel (Larish; receiving data indicating at least one of construction error, power consumption, or delay corresponding to transmitting the radio frequency signal over the wireless communications channel [¶ 0092-0095]; moreover, optimization parameter [¶ 0085-0086 and ¶ 0091]).
Regarding claim 24, Larish in view of O’Shea further discloses the system of claim 22, wherein receiving the one or more communication metrics based on transmitting the radio frequency signal over the wireless communications channel (Larish; [as addressed within the parent claim(s)]) comprises:
receiving error feedback corresponding to transmitting the radio frequency signal over the wireless communications channel (Larish; receiving error feedback (i.e. backpropagation in relation with error handling) corresponding to transmitting the radio frequency signal over the wireless communications channel [¶ 0092-0095]; moreover, optimization corresponds to error/reward feedback [¶ 0085-0086 and ¶ 0091]), and wherein updating the generator machine learning network based on the received one or more communication metrics (Larish; [as addressed within the parent claim(s)]) comprises:
updating the generator machine learning network based on the received error feedback (Larish; updating the generator machine learning network based on the received error feedback [¶ 0091 and ¶ 0093-0094]).
Regarding claim 25, Larish in view of O’Shea further discloses the system of claim 24, wherein receiving the error feedback corresponding to transmitting the radio frequency signal over the wireless communications channel (Larish; [as addressed within the parent claim(s)]) comprises:
receiving data indicating the error feedback using at least one of a communications bus or protocol message (Larish; receiving data [¶ 0085-0086] indicating the error feedback (i.e. backpropagation) using at least one of a communications bus or protocol message (corresponding to a gradient technique) [¶ 0093-0094]).
O’Shea further teaches machine-learning operations associated with encoding and decoding (O’Shea; ML operations associated with encoding and decoding [Page 3, Col. 1, ¶ 1 to Page 4, Col.1, ¶ 3 and Page 4, Col. 2, ¶ 2 to Page 5, Col. 2, ¶ 1]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Larish as modified by O’Shea, to incorporate machine-learning operations associated with encoding and decoding (as taught by O’Shea), in order to provide an optimized process to minimize loss of information in relation with data transfer (O’Shea; [Abstract, Page 1, Col. 1, ¶ 1 to Page 1, Col. 1, ¶ 2, and Page 11, Col. 2, ¶ 2]).
Regarding claim 26, Larish in view of O’Shea further discloses the system of claim 22, wherein training the generator machine learning network comprises:
updating the generator machine learning network using the discriminator machine learning network communicably coupled to the optimizer (Larish; updating the generator machine learning network [¶ 0085-0086] using the discriminator machine learning network communicably coupled to the optimizer [¶ 0093-0094]), wherein the optimizer is configured to process decision information indicating a determination performed by the discriminator machine learning network (Larish; the optimizer is configured to process decision information indicating a determination [¶ 0091 and ¶ 0093-0094] performed by the discriminator machine learning network [¶ 0085-0086]).
Regarding claim 27, Larish in view of O’Shea further discloses the system of claim 26, wherein the optimizer is configured to process decision information indicating a determination performed by the discriminator machine learning network using one or more iterative optimization techniques (Larish; the optimizer is configured to process decision information indicating a determination performed by the discriminator machine learning network using one or more iterative optimization techniques (i.e. backpropagation) [¶ 0085-0086 and ¶ 0093-0094]).
Regarding claim 28, Larish in view of O’Shea further discloses the system of claim 27, wherein the one or more iterative optimization techniques include a gradient descent or optimization algorithm (Larish; the one or more iterative optimization techniques [as addressed within the parent claim(s)] implicitly include a gradient descent or optimization algorithm given backpropagation in relation with error handling [¶ 0085-0086 and ¶ 0093-0094]).
O’Shea further teaches optimization techniques include a stochastic gradient descent (SGD) or Adam optimization algorithm (O’Shea; optimization techniques include a SGD or Adam optimization algorithm [Page 4, Col. 2, ¶ 2 to Page 5, Col. 2, ¶ 1]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Larish as modified by O’Shea, to incorporate optimization techniques including a stochastic gradient descent (SGD) or Adam optimization algorithm (as taught by O’Shea), in order to provide an optimized process to minimize loss of information in relation with data transfer (O’Shea; [Abstract, Page 1, Col. 1, ¶ 1 to Page 1, Col. 1, ¶ 2, and Page 11, Col. 2, ¶ 2]).
Regarding claim 29, Larish in view of O’Shea further discloses the system of claim 22, wherein the decision information indicates a result of the discriminator machine learning network processing (i) a first set of information corresponding to information transmitted across the wireless communications channel and (ii) a second set of information corresponding to data (Larish; the decision information indicates a result of the discriminator machine learning network processing (i) a first set of information corresponding to information transmitted across the wireless communications channel and (ii) a second set of information corresponding to data [¶ 0085-0086]; wherein, one or more processed (input and output) signals correspond to 1st and 2nd information [¶ 0091-0094]) generated by at least one of (a) the generator machine learning network or (b) sampling from a target information source (Larish; generated by at least one of (a) the generator machine learning network or (b) sampling from a target information source [¶ 0092-0094]; moreover, utilizing a ML generator/decoder [¶ 0085-0086]).
Regarding claim 30, Larish in view of O’Shea further discloses the system of claim 29, wherein the first set of information corresponding to information transmitted across the wireless communications channel comprises data generated by the generator machine learning network (Larish; the first set of information corresponding to information transmitted across the wireless communications channel comprises data generated by the generator machine learning network [¶ 0085]; wherein, one or more generators are configured to accept input [¶ 0090-0091 and ¶ 0093]).
Regarding claim 31, Larish in view of O’Shea further discloses the system of claim 29, wherein the first set of information corresponding to information transmitted across the wireless communications channel is an altered version of information generated by the generator machine learning network that is obtained by processing the information using either a real or simulated communications channel (Larish; the first set of information corresponding to information transmitted across the wireless communications channel is an altered version (i.e. backpropagation) of information generated by the generator machine learning network that is obtained by processing the information using either a real or simulated communications channel (i.e. virtualized network) [¶ 0085-0086 and ¶ 0093-0094]).
Regarding claim 32, Larish in view of O’Shea further discloses the system of claim 22, wherein performing one or more encodings of the data to generate the radio frequency signal (Larish; [as addressed within the parent claim(s)]) comprises:
generating a signal using a time-frequency modulation basis (Larish; generating a signal using a time-frequency modulation basis [¶ 0019-0020 and ¶ 0026-0028]).
Regarding claim 33, the rejection of claim 33 is addressed within the rejection of claim 22, due to the similarities claim 33 and claim 22 share; therefore, refer to the rejection of claim 22 regarding the rejection of claim 33. Although claim 33 and claim 22 may not be identical, it is reasonable to reject claim 33 based on the prior art teachings and rationale within the rejection of claim 22.
Regarding claim 34, the rejection of claim 34 is addressed within the rejection of claim 23, due to the similarities claim 34 and claim 23 share; therefore, refer to the rejection of claim 23 regarding the rejection of claim 34.
Regarding claim 35, the rejection of claim 35 is addressed within the rejection of claim 24, due to the similarities claim 35 and claim 24 share; therefore, refer to the rejection of claim 24 regarding the rejection of claim 35.
Regarding claim 36, the rejection of claim 36 is addressed within the rejection of claim 25, due to the similarities claim 36 and claim 25 share; therefore, refer to the rejection of claim 25 regarding the rejection of claim 36.
Regarding claim 37, the rejection of claim 37 is addressed within the rejection of claim 26, due to the similarities claim 37 and claim 26 share; therefore, refer to the rejection of claim 26 regarding the rejection of claim 37.
Regarding claim 38, the rejection of claim 38 is addressed within the rejection of claim 27, due to the similarities claim 38 and claim 27 share; therefore, refer to the rejection of claim 27 regarding the rejection of claim 38.
Regarding claim 39, the rejection of claim 39 is addressed within the rejection of claim 28, due to the similarities claim 39 and claim 28 share; therefore, refer to the rejection of claim 28 regarding the rejection of claim 39.
Regarding claim 40, the rejection of claim 40 is addressed within the rejection of claim 29, due to the similarities claim 40 and claim 29 share; therefore, refer to the rejection of claim 29 regarding the rejection of claim 40.
Regarding claim 41, the rejection of claim 41 is addressed within the rejection of claim 22, due to the similarities claim 41 and claim 22 share; therefore, refer to the rejection of claim 22 regarding the rejection of claim 41; however, the subject matter/limitations not addressed by claim 22 is/are addressed below.
Larish discloses one or more non-transitory computer storage media encoded with computer program instructions that when executed by one or more computers cause the one or more computers to perform operations (Larish; one or more non-transitory computer storage media encoded with computer program instructions that when executed by one or more computers cause the one or more computers to perform operations [¶ 0060-0061 and ¶ 0066]).
(further refer to the rejection of claim 22)
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Refer to PTO-892, Notice of References Cited, for a listing of analogous art.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Charles Lloyd Beard whose telephone number is (571) 272-5735. The examiner can normally be reached Monday - Friday, 8:00 AM - 5:00 PM EST, alternate Fridays.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Tammy Goddard, can be reached at (571) 272-7773. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
CHARLES LLOYD BEARD
Primary Examiner
Art Unit 2616
/CHARLES L BEARD/Primary Examiner, Art Unit 2611