DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 15-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claims do not fall within at least one of the four categories of patent-eligible subject matter because they are directed to a signal per se.
The United States Patent and Trademark Office (USPTO) is obliged to give claims their broadest reasonable interpretation consistent with the specification during proceedings before the USPTO. See In re Zletz, 893 F.2d 319 (Fed. Cir. 1989) (during patent examination the pending claims must be interpreted as broadly as their terms reasonably allow). The broadest reasonable interpretation of a claim drawn to a computer readable medium (also called machine readable medium and other such variations) typically covers forms of non-transitory tangible media and transitory propagating signals per se in view of the ordinary and customary meaning of computer readable media, particularly when the specification is silent. See MPEP 2111.01. When the broadest reasonable interpretation of a claim covers a signal per se, the claim must be rejected under 35 U.S.C. 101 as covering non-statutory matter. See In re Nuijten, 500 F.3d 1346, 1356-57 (Fed. Cir. 2007) (transitory embodiments are not directed to statutory subject matter).
The specification states that the term "machine-readable medium" shall also be taken to include any medium, other than a transitory medium such as a carrier wave, that is capable of storing or encoding a set of instructions for execution by the machine that cause the machine to perform any one or more of the methodologies of the present invention, and that "storing computer-readable instructions" shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media. This language allows "computer readable storage medium" to read on both non-transitory storage media and transitory storage media; the claims are therefore rejected under the broadest reasonable interpretation as being directed to a transitory propagating signal.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1, 3, 7-8, 10, 14-15, and 17 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Karnagel et al. (US 2020/0118036 A1).
Re Claims 1, 8 & 15, Karnagel teaches a processor comprising:
one or more circuits to cause one or more neural networks to be trained in parallel based, at least in part, on two or more randomly selected, similarly-sized portions of one or more datasets. (Karnagel; FIG. 1-5; Background, ¶ [0013]-[0043], [0125]-[0141]; The embodiments detail the parallel training of neural networks on randomly selected, similarly sized datasets.)
Re Claims 3, 10 & 17, Karnagel discloses the processor of claim 1, wherein the one or more circuits further cause the one or more datasets to be randomly split into two or more subsets of training data samples. (Karnagel; FIG. 1; ¶ [0013]-[0023], [0047]-[0079]; The datasets can be broken into variously sized training-related subsets.)
Re Claims 7 & 14, Karnagel discloses the processor of claim 1, wherein the two or more randomly selected, similarly-sized portions contain training data samples of a similar number of sequences. (Karnagel; FIG. 1-3; ¶ [0013]-[0023], [0140]; Randomly selected, similarly sized training data samples of sequences.)
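For context, the partitioning recited in claims 1, 3, and 7 (random splitting of a dataset into similarly sized portions for parallel training) can be sketched as follows. This is an illustrative sketch only; the function name and code are hypothetical and are not drawn from Karnagel or from the application under examination.

```python
import random

def split_into_similar_portions(dataset, num_portions, seed=0):
    """Randomly shuffle a dataset, then split it into portions whose
    sizes differ by at most one sample."""
    rng = random.Random(seed)
    shuffled = list(dataset)
    rng.shuffle(shuffled)
    # Round-robin slicing keeps the portion sizes nearly equal, so each
    # parallel worker receives a similarly-sized, randomly drawn portion.
    return [shuffled[i::num_portions] for i in range(num_portions)]

portions = split_into_similar_portions(range(10), 3)
print([len(p) for p in portions])  # [4, 3, 3]
```

Each resulting portion could then be handed to a separate training processor; the round-robin slicing guarantees the sizes differ by at most one regardless of the shuffle.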
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 2, 9 & 16 are rejected under 35 U.S.C. 103 as being unpatentable over Karnagel et al. (US 2020/0118036 A1) and further in view of DAVID et al. (US 2022/0012595 A1).
Re Claims 2, 9 & 16, Karnagel discloses the processor of claim 1, wherein the one or more neural networks are trained in parallel on two or more training processors, and (Karnagel; FIG. 1-5; Background, ¶ [0013]-[0043], [0125]-[0141]; The embodiments detail the parallel training of neural networks on randomly selected, similarly sized datasets.)
Karnagel does not explicitly suggest wherein each of the two or more training processors updates at least a part of the one or more neural networks using at least one of the two or more randomly selected similarly-sized portions.
However, in analogous art, DAVID teaches wherein each of the two or more training processors updates at least a part of the one or more neural networks using at least one of the two or more randomly selected similarly-sized portions. (DAVID; FIG. 1; Background, Summary, ¶ [0017]-[0027], [0040]-[0054]; The updating of neural networks using randomly selected datasets.)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify Karnagel in view of DAVID to update neural networks in order to train and generate new neural networks based on training data sets. (DAVID, Abstract)
Claims 4-6, 11-13, and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Karnagel et al. (US 2020/0118036 A1) and further in view of Kida (US 2019/0065989 A1).
Re Claims 4, 11 & 18, Karnagel discloses the processor of claim 3, yet does not explicitly suggest wherein the one or more circuits further cause the training data samples in each of the two or more subsets to be further split into two or more portions based on sample sizes, wherein each of the two or more portions contains a set of training data samples of a similar size.
However, in analogous art, Kida teaches wherein the one or more circuits further cause the training data samples in each of the two or more subsets to be further split into two or more portions based on sample sizes, (Kida; FIG. 1-12; ¶ [0018]-[0028], [0038]-[0048]; The embodiments detail comparable methodology, such as splitting training data samples and classifying them by size.)
wherein each of the two or more portions contains a set of training data samples of a similar size. (Kida; FIG. 1-12; ¶ [0018]-[0028], [0038]-[0048]; The embodiments detail comparable methodology, such as splitting training data samples and classifying them by size.)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify Karnagel in view of Kida to split subsets of data in order to train machine learning models by dividing training data samples. (Kida, Abstract)
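The size-based splitting recited in claims 4, 11, and 18 can be illustrated with a short sketch. The code is hypothetical and is not drawn from Kida; sample "size" is taken here, by assumption, to mean sequence length.

```python
def split_by_sample_size(samples, num_portions):
    """Sort samples by size (here, sequence length), then slice the sorted
    list into contiguous portions so each portion holds similarly sized
    samples."""
    ordered = sorted(samples, key=len)
    base, extra = divmod(len(ordered), num_portions)
    portions, start = [], 0
    for i in range(num_portions):
        end = start + base + (1 if i < extra else 0)  # spread any remainder
        portions.append(ordered[start:end])
        start = end
    return portions

print(split_by_sample_size(["a", "bb", "ccc", "dddd", "ee", "f"], 3))
# [['a', 'f'], ['bb', 'ee'], ['ccc', 'dddd']]
```

Because the samples are sorted before slicing, each resulting portion groups samples of similar size, mirroring the classification-by-size limitation.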
Re Claims 5, 12 & 19, Karnagel-Kida discloses the processor of claim 4, wherein the one or more circuits further cause the two or more portions within each subset to be ranked according to their corresponding sample sizes. (Kida; FIG. 1-12; ¶ [0018]-[0028], [0038]-[0058]; The embodiments detail comparable methodology, such as the classification (ranking) of data samples according to sample size.)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify Karnagel in view of Kida to split subsets of data in order to train machine learning models by dividing training data samples. (Kida, Abstract)
Re Claims 6, 13 & 20, Karnagel-Kida discloses the processor of claim 5, wherein at least one of the two or more randomly selected, similarly-sized portions is sampled corresponding to a ranking among portions within a first subset, and (Kida; FIG. 1-12; ¶ [0018]-[0028], [0038]-[0058]; The embodiments detail comparable methodology, such as randomly sized, classified subsets of sample data.)
at least another of the two or more randomly selected, similarly-sized portions is sampled corresponding to a same ranking among portions within a second subset. (Kida; FIG. 1-12; ¶ [0018]-[0028], [0038]-[0058]; The embodiments detail comparable methodology, such as randomly sized, classified subsets of sample data.)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify Karnagel in view of Kida to split subsets of data in order to train machine learning models by dividing training data samples. (Kida, Abstract)
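The ranking and rank-matched sampling recited in claims 5-6, 12-13, and 19-20 can be sketched as follows. The code is an illustrative, hypothetical reading of the claim language and is not drawn from Kida; portion "size" is assumed to be the total length of the samples in the portion.

```python
import random

def sample_same_rank(subset_a, subset_b, seed=0):
    """Rank the portions within each subset by total sample size, then draw
    the portion holding the same rank from each subset."""
    def ranked(subset):
        return sorted(subset, key=lambda portion: sum(len(s) for s in portion))

    ranked_a, ranked_b = ranked(subset_a), ranked(subset_b)
    # Pick one rank at random and take that rank's portion from both subsets,
    # so the two drawn portions share the same size ranking.
    r = random.Random(seed).randrange(min(len(ranked_a), len(ranked_b)))
    return ranked_a[r], ranked_b[r]

subset_a = [["aa", "bbb"], ["c"], ["dddd", "ee", "f"]]   # totals: 5, 1, 7
subset_b = [["x"], ["yyyy"], ["zz", "ww"]]               # totals: 1, 4, 4
pa, pb = sample_same_rank(subset_a, subset_b)
```

Under this reading, the two drawn portions always occupy the same position in their respective size rankings, matching the "same ranking" limitation of claims 6, 13, and 20.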
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHRISTOPHER B ROBINSON whose telephone number is (571)270-0702. The examiner can normally be reached M-F 7:00-3:00 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Nicholas R Taylor can be reached at 571-272-3889. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CHRISTOPHER B ROBINSON/ Primary Examiner, Art Unit 2443