DETAILED ACTION
In view of the Appeal Brief filed on 12/08/25, PROSECUTION IS HEREBY REOPENED.
See the complete summary of options available to Applicant in view of the reopening of prosecution in the conclusion of this Office action. As correctly noted by Applicant in the Appeal Brief dated 12/08/25, claims 1-8 and 21-24 were canceled in the amendment filed 02/04/25 (Appeal Brief p. 17). Furthermore, in said amendment, claims 9-20 were amended. However, the final rejection dated 05/09/25 continued the rejections of the claim set as filed 09/30/24.
Examiner further notes Applicant’s improper manner of making amendments in the amendment filed 02/04/25, wherein numerous instances of improperly made amendments are present throughout the claims. See 37 CFR 1.121(c). See, e.g., the following comparison of representative claim 9 as filed 09/30/24:
An artificial neural network structure in a computer system, the neural network comprising:
one or more feed-forward layers; and
one or more Softmax layers coupled to receive an input vector from the one or more feed-forward layers;
at least one of the Softmax layers configured to compute an unnormalized Softmax vector from the [[an ]]input vector comprising elements x by:
computing an integer vector maximum localmax of the input vector; and
determining 2^(x-localmax) for each element x.
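For context, the computation recited in claim 9 as filed 09/30/24 (an unnormalized base-2 Softmax) might be sketched as follows. This is an illustrative assumption only; apart from localmax and x, which follow the claim language, the function and variable names are not drawn from the record.

```python
def unnormalized_softmax_base2(input_vector):
    """Sketch of the claimed computation: compute the integer vector
    maximum ("localmax") of the input vector, then determine
    2^(x - localmax) for each element x."""
    localmax = max(input_vector)  # integer vector maximum of the input vector
    return [2 ** (x - localmax) for x in input_vector]

print(unnormalized_softmax_base2([1, 3, 2]))  # [0.25, 1, 0.5]
```

Because each element is offset by localmax before exponentiation, the largest element always maps to 2^0 = 1 and no exponent is positive, which bounds the unnormalized outputs to (0, 1].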
with claim 9 as filed 02/04/25:
A computer system comprising a non-transitory [[An ]]artificial neural network structure, the computer system comprising:
one or more processors;
the non-transitory artificial neural network structure comprising:
one or more feed-forward layers; and
one or more Softmax layers coupled to the one or more feed-forward layers;
at least one of the Softmax layers configured to operate the one or more processors to generate an unnormalized Softmax vector from the input vector by:
raising elements of the input vector to powers of two; and
computing an integer vector maximum of the input vector.
Applicant is reminded to amend the claims in accordance with 37 CFR 1.121, wherein an amendment to a claim must rewrite the entire claim with all changes (additions and deletions) indicated.
Claim Objections
Claim 15 is objected to because of the following informalities.
Claim 15, line 1, appears to contain a typographical error: there is no space in “Thecomputer”.
Appropriate correction is required.
Specification
The specification is objected to as failing to provide proper antecedent basis for the claimed subject matter. See 37 CFR 1.75(d)(1) and MPEP § 608.01(o). Correction of the following is required. Claims 9 and 19 recite, with further recitation in the dependent claims: “A computer system comprising a non-transitory artificial neural network structure”. The specification discloses only non-transitory machine readable media comprising machine-executable instructions. See specification [0041]. There is no disclosure of a non-transitory artificial neural network structure.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 9-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claim 9 recites “A computer system comprising a non-transitory artificial neural network structure, the computer system comprising: one or more processors; the non-transitory artificial neural network structure comprising…”. It is unclear what is meant by a non-transitory artificial neural network structure. The specification discloses only non-transitory machine readable media comprising machine-executable instructions. See specification [0041]. It is unclear whether the non-transitory artificial neural network structure refers to the structure comprising machine readable media, which is unclaimed, or whether there is some unknown structure of the artificial neural network that is non-transitory.
Furthermore the preamble recites two instances of the computer system comprising: “A computer system comprising a non-transitory artificial neural network structure” and “the computer system comprising: one or more processors”. This renders the claim indefinite, as it is unclear where the preamble begins and ends, and it is unclear what the relationship is between the non-transitory artificial neural network structure and the one or more processors. Claims 10-18 inherit the same deficiency as claim 9 based on dependence. Claim 19 recites substantially the same preamble limitations as claim 9 and is rejected for the same reasons. Claim 20 inherits the same deficiency as claim 19 based on dependence.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 9-20 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
Regarding claim 9, under the Alice framework Step 1, claim 9 falls within the four statutory categories of patentable subject matter identified by 35 USC 101: a process, machine, manufacture, or composition of matter.
Under the Alice framework Step 2A prong 1, claim 9 recites mathematical concepts, namely mathematical calculations including raising elements of a vector to powers of two and computing a maximum of a vector. Specifically, the claim recites the following mathematical calculations:
generate an unnormalized Softmax vector from the input vector by:
raising elements of the input vector to powers of two; and
computing an integer vector maximum of the input vector.
See for example [0040], which further describes these mathematical calculations in terms of equations. For these reasons claim 9 recites mathematical concepts.
Under the Alice framework Step 2A prong 2 analysis, the claim recites the following additional elements: a computer system comprising a non-transitory artificial neural network structure, the computer system comprising: one or more processors; the non-transitory artificial neural network structure comprising: one or more feed-forward layers; at least one of the Softmax layers configured to operate the one or more processors. The claim does no more than merely “apply” the mathematical concepts in a computer system comprising one or more processors. Furthermore, the computer system comprising a non-transitory artificial neural network structure comprising one or more feed-forward layers, and one or more SoftMax layers coupled to the one or more feed-forward layers, merely generally links the use of the mathematical concepts to a particular technological environment or field of use: the field of use being an artificial neural network, and the particular technological environment being the one or more feed-forward layers and the one or more SoftMax layers coupled to the one or more feed-forward layers. The claim does not provide a specific structure or specifically limit these additional elements in a meaningful way beyond the generically recited feed-forward and SoftMax layers. For these reasons, claim 9 does not integrate the abstract idea into a practical application.
Under the Alice Framework Step 2B analysis, the claim, considered individually and as an ordered combination, does not include additional elements that are sufficient to amount to significantly more than the abstract idea. The claim does no more than merely generally link the use of the mathematical concepts to a particular technological environment or field of use. Furthermore, artificial neural networks comprising one or more feed-forward layers and one or more SoftMax layers coupled to the one or more feed-forward layers are well understood, routine, and conventional. See, e.g., M. Gormley, Neural Networks, 10-601B Introduction to Machine Learning, Carnegie Mellon School of Computer Science, 2016 (hereinafter “Gormley”), which teaches an overview of neural network layer architectures including feed-forward layers and one SoftMax layer (p. 58). See also Applicant’s arguments, p. 7, describing a SoftMax layer as a common structural component of various types of neural networks. Furthermore, receiving an input vector is well understood, routine, and conventional activity. See MPEP 2106.05(d)(II). For these reasons, claim 9 does not amount to significantly more than the abstract idea.
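For illustration only, the conventional architecture discussed above (one or more feed-forward layers with a SoftMax layer coupled to receive their output) can be sketched as below. The weights, sizes, and names are arbitrary assumptions, not taken from the record or the cited references, and a standard normalized base-e SoftMax is used rather than the claimed base-2 variant.

```python
import math

def feed_forward(x, weights, bias):
    # One generic dense layer: y_j = sum_i x_i * w_ij + b_j
    return [sum(xi * wij for xi, wij in zip(x, col)) + b
            for col, b in zip(weights, bias)]

def softmax(v):
    # Conventional (normalized, base-e) SoftMax over a vector
    m = max(v)                        # subtract max for numerical stability
    exps = [math.exp(x - m) for x in v]
    s = sum(exps)
    return [e / s for e in exps]

# Hypothetical two-input, two-output example
x = [1.0, 2.0]
weights = [[0.5, -0.5], [0.25, 0.75]]
bias = [0.0, 0.1]
probs = softmax(feed_forward(x, weights, bias))
```

The SoftMax outputs are positive and sum to one, which is why a SoftMax layer is conventionally coupled to a feed-forward layer to convert its raw outputs into a probability distribution.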
Claims 10-11 are rejected for at least the reasons set forth with respect to claim 9. Under the Step 2A prong 1 analysis, claims 10-11 further mathematically limit claim 9. Under the Step 2A prong 2 and Step 2B analyses, claims 10-11 further include the additional element of the one or more processors configured to operate to perform further mathematical calculations. This further limitation merely recites “apply it” in a computer.
Claim 12 is rejected for at least the reasons set forth with respect to claim 9. Under the Step 2A prong 1 analysis, claim 12 further mathematically limits claim 9. Under the Step 2A prong 2 and Step 2B analysis, claim 12 includes the following further additional elements: utilize a plurality of processing elements to perform further mathematical calculations. This further limitation merely recites “apply it” in a computer.
Claims 13-16 are rejected for at least the reasons set forth with respect to claim 12. Claims 13-16 merely further mathematically limit the mathematical concepts of claim 12. Claims 13-16 include no further additional elements that would require further analysis under Step 2A prong 2 and Step 2B beyond those recited in claim 12.
Claims 17 and 18 are rejected for at least the reasons set forth with respect to claim 9. Claims 17 and 18 further mathematically limit claim 9 and include the following further additional element: performing the mathematical calculations in a single execution loop.
Under the Step 2A prong 2 analysis, performing mathematical calculations in an execution loop constitutes insignificant extra-solution activity. Furthermore, the execution of these mathematical calculations in a single execution loop is a direct result of the mathematical calculations, which only require one loop to perform. See figure 7A versus figures 7B and 7C, which include only one “for” loop to compute the powers and the sum of the powers of two. For these reasons, claims 17 and 18 do not integrate the abstract idea into a practical application.
Under the Step 2B analysis, the use of an execution loop is well understood, routine, and conventional activity. See Patterson, Ch. 2, p. 92, describing the use of loops. For these reasons, claims 17 and 18 do not amount to significantly more than the abstract idea.
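A single-execution-loop computation of the kind discussed for claims 17 and 18, in which the powers of two and their running sum are accumulated in one “for” loop, might be sketched as follows. This is an illustrative assumption, not a reproduction of figures 7B-7C, and the function name is hypothetical.

```python
def powers_and_sum(input_vector):
    """Compute 2^(x - localmax) for each element x and the sum of those
    powers in a single execution loop."""
    localmax = max(input_vector)      # integer vector maximum
    powers, total = [], 0
    for x in input_vector:            # single "for" loop
        p = 2 ** (x - localmax)
        powers.append(p)
        total += p                    # running sum accumulated in the same pass
    return powers, total
```

Because each power depends only on its own element and localmax, the per-element powers and their sum can be produced in one pass rather than two.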
Regarding claim 19, under the Alice framework Step 1, claim 19 falls within the four statutory categories of patentable subject matter identified by 35 USC 101: a process, machine, manufacture, or composition of matter.
Under the Alice framework Step 2A prong 1, claim 19 recites mathematical concepts, namely mathematical calculations including raising elements of a vector to powers of two and computing a maximum of a vector. Specifically, the claim recites the following mathematical calculations:
generate an unnormalized Softmax vector from an input vector comprising x elements by:
raising elements of the input vector to powers of two; and
computing an integer vector maximum of the input vector.
See for example [0040], which further describes these mathematical calculations in terms of equations. For these reasons claim 19 recites mathematical concepts.
Under the Alice framework Step 2A prong 2 analysis, the claim recites the following additional elements: a transformer artificial neural network structure in a computer system, the transformer artificial neural network structure comprising: a self-attention layer; and an encoder-decoder attention layer; each of the self-attention layer and the encoder-decoder attention layer comprising a SoftMax layer configured to operate the one or more processors. The claim does no more than merely generally link the use of the mathematical concepts to a particular technological environment or field of use: the field of use being a transformer artificial neural network, and the particular technological environment being the self-attention layer and the encoder-decoder attention layer, each comprising a SoftMax layer. The claim does not provide a specific structure or specifically limit these additional elements in a meaningful way beyond the generically recited layers. For these reasons, claim 19 does not integrate the abstract idea into a practical application.
Under the Alice Framework Step 2B analysis, the claim, considered individually and as an ordered combination, does not include additional elements that are sufficient to amount to significantly more than the abstract idea. The claim does no more than merely generally link the use of the mathematical concepts to a particular technological environment or field of use. Furthermore, transformer artificial neural networks comprising a self-attention layer and an encoder-decoder attention layer, each comprising a SoftMax layer, are well understood, routine, and conventional. See, e.g., Z. Hu et al., Lecture 16: Building Blocks of Deep Learning, overview of CNNs, RNNs, and attention, Carnegie Mellon University, 2019 (hereinafter “Hu”), and L.P. Morency, Tutorial on Multimodal Machine Learning, MultiComp Lab, Carnegie Mellon University, 2017 (hereinafter “Morency”), which teach overviews of neural networks including self-attention layers, encoder-decoder attention layers, and layers comprising SoftMax layers (Hu, Attention Mechanisms, Transformer section; Morency, Autoencoder slide showing encoder-decoder layers, attention model for machine translation slide). See also Applicant’s arguments, p. 7, describing the SoftMax layer as a common structural component. For these reasons, claim 19 does not amount to significantly more than the abstract idea.
Claim 20 is rejected for at least the reasons set forth with respect to claim 19. Claim 20 further mathematically limits claim 19 and includes the following further additional element: performing the mathematical calculations in a single execution loop.
Under the Step 2A prong 2 analysis, performing mathematical calculations in an execution loop constitutes insignificant extra-solution activity. Furthermore, the execution of these mathematical calculations in a single execution loop is a direct result of the mathematical calculations, which only require one loop to perform. See figure 7A versus figures 7B and 7C, which include only one “for” loop to compute the powers and the sum of the powers of two. For these reasons, claim 20 does not integrate the abstract idea into a practical application.
Under the Step 2B analysis, the use of an execution loop is well understood, routine, and conventional activity. See Patterson, Ch. 2, p. 92, describing the use of loops. For these reasons, claim 20 does not amount to significantly more than the abstract idea.
Allowable Subject Matter
For the reasons set forth in the office action dated 01/25/24, claims 9-20 would be allowable if rewritten to overcome the rejections under 35 USC 101.
Conclusion
To avoid abandonment of the application, appellant must exercise one of the following two options:
(1) file a reply under 37 CFR 1.111 (if this Office action is non-final) or a reply under 37 CFR 1.113 (if this Office action is final); or,
(2) initiate a new appeal by filing a notice of appeal under 37 CFR 41.31 followed by an appeal brief under 37 CFR 41.37. The previously paid notice of appeal fee and appeal brief fee can be applied to the new appeal. If, however, the appeal fees set forth in 37 CFR 41.20 have been increased since they were previously paid, then appellant must pay the difference between the increased fees and the amount previously paid.
A Supervisory Patent Examiner (SPE) has approved of reopening prosecution by signing at the end of this action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to EMILY E LAROCQUE, whose telephone number is (469) 295-9289. The examiner can normally be reached 10:00am - 12:00pm and 2:00pm - 8:00pm ET, M-F.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor Andrew Caldwell can be reached on 571-272-3702. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/EMILY E LAROCQUE/ Primary Examiner, Art Unit 2182
/ANDREW CALDWELL/ Supervisory Patent Examiner, Art Unit 2182