DETAILED ACTION
This Office Action is in response to Applicant's Response filed on 01/07/2026 for the above-identified application.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/07/2026 has been entered.
Response to Amendment
The amendment filed on 01/07/2026 has been entered.
Claims 21, 23, 26, 30, 32, 35, 39, and 40 are amended. Claims 21-40 are pending in the application.
Claim Objection
Claim 40 is objected to because of the following informalities: This claim recites “the softmax output”, which has no antecedent basis. Appropriate correction is required.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 21-40 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1
Claims 21-29 are directed to a method, Claims 30-38 are directed to a system, and Claims 39-40 are directed to a computer-readable medium. Thus, the claims fall within one of the statutory categories (process, machine, article of manufacture) and are eligible under Step 1.
Step 2A Prong 1
Independent Claims
Claims 21, 30, and 39 recite:
processing an input sequence to generate a respective encoder hidden state for each input in the input sequence - under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mental process (observation, evaluation, and judgment that can be practically performed in the human mind or by a human using pen and paper) of evaluating an input sequence to generate a respective representation (encoder hidden state) for each input in the input sequence.
generating a plurality of output sequences that have different numbers of outputs than each other, the generating comprising, for each position in an output sequence of the plurality of output sequences: generating, based on the respective encoder hidden state for at least one input in the input sequence, a respective output for the each position in the output sequence; and determining, using the respective output whether to (i) select a token from a fixed vocabulary of possible tokens as an output for the output sequence at the each position in the output sequence, or (ii) select a designated token that indicates that the each position is a last position in the output sequence - under its broadest reasonable interpretation in light of the specification, this limitation encompasses a mental process (observation, evaluation, and judgment that can be practically performed in the human mind or by a human using pen and paper) of writing down a plurality of output sequences by calculating a particular output (respective output) for each evaluated position in the output sequence using the representation (encoder hidden state) and determining, based on the calculated output, whether to select a word/token from a fixed vocabulary of words as the output at the evaluated position or to select a designated word/token indicating that the evaluated position is the last position in the output sequence.
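For context only, the generate-and-terminate loop recited in these limitations can be sketched as follows. This is an illustrative toy model, not Applicant's disclosed implementation; the vocabulary, hidden size, random weights, and function names are all assumptions introduced solely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["a", "b", "c"]   # fixed vocabulary of possible tokens
H = 4                     # hidden size (arbitrary choice)

def encode(inputs):
    """Toy 'encoder': produce one hidden state per input via a random projection."""
    W = rng.standard_normal((H, H))
    return [np.tanh(W @ x) for x in inputs]

def decode(enc_states, max_len=10):
    """Toy 'decoder': at each position, score the vocabulary plus a designated
    end token, then either select a vocabulary token or end the sequence."""
    W_out = rng.standard_normal((len(VOCAB) + 1, H))  # last row scores the end token
    state = enc_states[-1]        # condition generation on the encoder states
    out = []
    for _ in range(max_len):
        scores = W_out @ state    # the 'respective output' for this position
        idx = int(np.argmax(scores))
        if idx == len(VOCAB):     # designated end token selected: last position
            break
        out.append(VOCAB[idx])
        state = np.tanh(state + rng.standard_normal(H))  # toy state update
    return out

enc = encode([rng.standard_normal(H) for _ in range(5)])
seq = decode(enc)
```

Because termination is decided per position by the end-token score rather than by a fixed length, different inputs can yield output sequences of different lengths.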
Accordingly, these claims recite an abstract idea that falls under the "mental processes" grouping.
Step 2A Prong 2
Independent Claims
Additional elements
Claims 21, 30, and 39:
using an encoder neural network, using a decoder neural network, generated by the encoder neural network, generated by the decoder neural network - these limitations are recited at a high level of generality such that they amount to no more than merely using a neural network as a tool to execute the abstract idea (see MPEP § 2106.05(f)). These limitations can also be viewed as generally linking the judicial exception to the technological environment of neural networks (see MPEP § 2106.05(h)).
Claim 21:
computer-implemented method - this limitation is recited at a high level of generality such that it amounts to no more than mere instructions to apply the abstract idea on a generic computer (see MPEP § 2106.05(f)). This limitation can also be viewed as generally linking the use of a judicial exception to the field of generic computers (see MPEP § 2106.05(h)).
Claim 30:
system comprising one or more computers and one or more storage devices storing instructions that when executed by the one or more computers cause the one or more computers to perform operations - this limitation is recited at a high level of generality such that it amounts to no more than mere instructions to apply the abstract idea on a generic computer (see MPEP § 2106.05(f)). This limitation can also be viewed as generally linking the use of a judicial exception to the field of generic computers (see MPEP § 2106.05(h)).
Claim 39:
one or more non-transitory computer storage media storing instructions that when executed by one or more computers cause the one or more computers to perform operations - this limitation is recited at a high level of generality such that it amounts to no more than mere instructions to apply the abstract idea on a generic computer (see MPEP § 2106.05(f)). This limitation can also be viewed as generally linking the use of a judicial exception to the field of generic computers (see MPEP § 2106.05(h)).
Accordingly, these additional elements do not integrate the judicial exception into a practical application because they do not impose any meaningful limits on practicing the abstract idea. These claims are directed to the abstract idea.
Step 2B
Independent Claims
Additional elements
Claims 21, 30, and 39:
using an encoder neural network, using a decoder neural network, generated by the encoder neural network, generated by the decoder neural network - these limitations are recited at a high level of generality such that they amount to no more than merely using a neural network as a tool to execute the abstract idea (see MPEP § 2106.05(f)). These limitations can also be viewed as generally linking the judicial exception to the technological environment of neural networks (see MPEP § 2106.05(h)).
Claim 21:
computer-implemented method - this limitation is recited at a high level of generality such that it amounts to no more than mere instructions to apply the abstract idea on a generic computer (see MPEP § 2106.05(f)). This limitation can also be viewed as generally linking the use of a judicial exception to the field of generic computers (see MPEP § 2106.05(h)).
Claim 30:
system comprising one or more computers and one or more storage devices storing instructions that when executed by the one or more computers cause the one or more computers to perform operations - this limitation is recited at a high level of generality such that it amounts to no more than mere instructions to apply the abstract idea on a generic computer (see MPEP § 2106.05(f)). This limitation can also be viewed as generally linking the use of a judicial exception to the field of generic computers (see MPEP § 2106.05(h)).
Claim 39:
one or more non-transitory computer storage media storing instructions that when executed by one or more computers cause the one or more computers to perform operations - this limitation is recited at a high level of generality such that it amounts to no more than mere instructions to apply the abstract idea on a generic computer (see MPEP § 2106.05(f)). This limitation can also be viewed as generally linking the use of a judicial exception to the field of generic computers (see MPEP § 2106.05(h)).
Accordingly, these additional elements do not amount to significantly more than the judicial exception. As such, these claims are patent ineligible.
Step 2A Prong 1
Dependent Claims
Claims 22 and 31:
the respective output for the each position comprises a softmax output for the each position - these limitations merely further define the mental process of generating the respective output in terms of mathematical relationships.
Claims 23, 32, and 40:
generating the softmax output for the each position comprises, for an initial position in the output sequence: processing a predetermined initial output to generate an initial decoder hidden state; generating, from the initial decoder hidden state and the respective encoder hidden state for each input in the input sequence, a first softmax output for the initial position, wherein the first softmax output comprises a respective first output score for each token from the fixed vocabulary of possible tokens; and selecting a first token according to the respective first output score for each token as an output at the initial position in the output sequence - these limitations encompass mathematical relationships and mental processes corresponding to manipulations associated with the softmax output.
Claims 25 and 34:
processing the predetermined initial output to generate the initial decoder hidden state comprises: initializing an internal state of the decoder neural network to the encoder hidden state for a last input in the input sequence; and processing the predetermined initial output to update the initialized internal state to the initial decoder hidden state - these limitations encompass a mental process of generating the initial decoder hidden state representation, initializing the internal state of a network to reflect the representation of the last input, and evaluating and updating the initial state based on a predetermined initial output, which is evaluating and judging that can be practically performed in the human mind or by a human using pen and paper.
Claims 26 and 35:
generating the softmax output for the each position comprises, for a subsequent position after the initial position in the output sequence: processing an output at a preceding position in the output sequence to generate a decoder hidden state for the output at the preceding position; generating, from the decoder hidden state for the output at the preceding position and the respective encoder hidden states for each input in the input sequence, a second softmax output for the subsequent position, wherein the second softmax output comprises a respective second output score for each token from the fixed vocabulary of possible tokens; and selecting a second token according to the respective second output score for each token as an output at the subsequent position in the output sequence - these limitations encompass mathematical relationships and mental processes corresponding to manipulations associated with the softmax output.
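For context on the softmax-based limitations addressed above, a minimal numerical sketch follows. The four-token vocabulary and the score values are hypothetical, introduced only to illustrate the mathematical relationship; this is not Applicant's disclosed implementation:

```python
import numpy as np

def softmax(z):
    """Normalize raw scores into a probability distribution."""
    z = z - z.max()          # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hypothetical raw scores for a four-token fixed vocabulary at one position.
scores = np.array([2.0, 0.5, -1.0, 0.1])
probs = softmax(scores)      # the 'softmax output': one score per token
chosen = int(np.argmax(probs))  # select the token with the highest score
```

Because softmax is monotonic, selecting the highest softmax score is equivalent to selecting the highest raw score; the normalization matters when the scores are treated as probabilities, e.g. when combining them across positions.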
Claims 27 and 36:
the softmax output for the each position is generated - these limitations encompass mathematical relationships, mathematical calculations, and mental processes corresponding to manipulations associated with the softmax output.
Claims 28 and 37:
when the token from the fixed vocabulary of possible tokens is selected, determining that an other output should be generated; and when the designated token is selected, determining that the other output should not be generated - these limitations encompass a mental process of evaluating and judging that can be practically performed in the human mind or by a human using a pen and paper.
Claims 29 and 38:
generating the plurality of output sequences comprises generating the plurality of output sequences using a beam search technique, and wherein the method further comprises: determining a respective sequence score for each output sequence in the plurality of output sequences; and selecting an output sequence having a highest sequence score as a final output sequence for the input sequence - these limitations encompass mathematical relationships/calculations and mental processes corresponding to manipulations associated with the beam search technique.
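For context on the beam search limitation addressed above, a minimal sketch follows. The step function, beam width, and toy token probabilities are assumptions for illustration only, not Applicant's disclosed implementation:

```python
import numpy as np

def beam_search(step_fn, start, width=2, max_len=3):
    """Keep the `width` highest-scoring partial sequences at each step and
    return the sequence with the highest total log-probability score."""
    beams = [([start], 0.0)]
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for tok, logp in step_fn(seq):      # (next token, log-probability)
                candidates.append((seq + [tok], score + logp))
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:width]              # prune to the top `width` beams
    return max(beams, key=lambda b: b[1])       # highest sequence score wins

# Hypothetical next-token distribution: token 1 is always more likely.
def step_fn(seq):
    return [(0, np.log(0.4)), (1, np.log(0.6))]

best_seq, best_score = beam_search(step_fn, start=-1)
```

With this toy distribution the search keeps multiple candidate sequences alive at each step, then selects the single sequence whose summed log-probability (sequence score) is highest, mirroring the claimed determine-and-select steps.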
Thus, the claims recite abstract ideas that fall under the "mental processes" and/or "mathematical concepts" groupings.
Step 2A Prong 2
Dependent Claims
Additional elements
Claims 23, 32, and 40:
using the decoder neural network; generated by the encoder neural network - these limitations are recited at a high level of generality such that they amount to no more than merely using a neural network as a tool to execute the abstract idea (see MPEP § 2106.05(f)). These limitations can also be viewed as generally linking the judicial exception to the technological environment of neural networks (see MPEP § 2106.05(h)).
Claims 24 and 33:
the encoder neural network and the decoder neural network are each a respective recurrent neural network - these limitations are recited at a high level of generality such that they amount to no more than merely using a neural network as a tool to execute the abstract idea (see MPEP § 2106.05(f)). These limitations can also be viewed as generally linking the judicial exception to the technological environment of neural networks (see MPEP § 2106.05(h)).
Claims 25 and 34:
using the decoder neural network - these limitations are recited at a high level of generality such that they amount to no more than merely using a neural network as a tool to execute the abstract idea (see MPEP § 2106.05(f)). These limitations can also be viewed as generally linking the judicial exception to the technological environment of neural networks (see MPEP § 2106.05(h)).
Claims 26 and 35:
using the decoder neural network; generated by the encoder neural network - these limitations are recited at a high level of generality such that they amount to no more than merely using a neural network as a tool to execute the abstract idea (see MPEP § 2106.05(f)). These limitations can also be viewed as generally linking the judicial exception to the technological environment of neural networks (see MPEP § 2106.05(h)).
Claims 27 and 36:
generated by a softmax output layer of the decoder neural network - these limitations are recited at a high level of generality such that they amount to no more than merely using a neural network as a tool to execute the abstract idea (see MPEP § 2106.05(f)). These limitations can also be viewed as generally linking the judicial exception to the technological environment of neural networks (see MPEP § 2106.05(h)).
Accordingly, these additional elements do not integrate the judicial exception into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claims are directed to the abstract idea.
Step 2B
Dependent Claims
Additional elements
Claims 23, 32, and 40:
using the decoder neural network; generated by the encoder neural network - these limitations are recited at a high level of generality such that they amount to no more than merely using a neural network as a tool to execute the abstract idea (see MPEP § 2106.05(f)). These limitations can also be viewed as generally linking the judicial exception to the technological environment of neural networks (see MPEP § 2106.05(h)).
Claims 24 and 33:
the encoder neural network and the decoder neural network are each a respective recurrent neural network - these limitations are recited at a high level of generality such that they amount to no more than merely using a neural network as a tool to execute the abstract idea (see MPEP § 2106.05(f)). These limitations can also be viewed as generally linking the judicial exception to the technological environment of neural networks (see MPEP § 2106.05(h)).
Claims 25 and 34:
using the decoder neural network - these limitations are recited at a high level of generality such that they amount to no more than merely using a neural network as a tool to execute the abstract idea (see MPEP § 2106.05(f)). These limitations can also be viewed as generally linking the judicial exception to the technological environment of neural networks (see MPEP § 2106.05(h)).
Claims 26 and 35:
using the decoder neural network; generated by the encoder neural network - these limitations are recited at a high level of generality such that they amount to no more than merely using a neural network as a tool to execute the abstract idea (see MPEP § 2106.05(f)). These limitations can also be viewed as generally linking the judicial exception to the technological environment of neural networks (see MPEP § 2106.05(h)).
Claims 27 and 36:
generated by a softmax output layer of the decoder neural network - these limitations are recited at a high level of generality such that they amount to no more than merely using a neural network as a tool to execute the abstract idea (see MPEP § 2106.05(f)). These limitations can also be viewed as generally linking the judicial exception to the technological environment of neural networks (see MPEP § 2106.05(h)).
Accordingly, these additional elements do not amount to significantly more than the judicial exception. As such, the claims are patent ineligible.
Response to Arguments
Claim Objection: Applicant’s amendments did not overcome all of the claim objections previously set forth.
35 U.S.C. §101: In the remarks, Applicant argues that the present Specification describes technical improvements that can be achieved by the claimed subject matter with respect to the operation of a neural network that converts an input sequence into an output sequence. The disclosed technology allows a neural network system to effectively generate an output that is a pointer into an input sequence, regardless of the number of inputs in the input sequence. Thus, an output sequence that consists of inputs from an input sequence can effectively be generated. Moreover, the neural network system can effectively generate such output sequences for input sequences of varying lengths.
Examiner respectfully disagrees with Applicant’s arguments.
Firstly, Examiner notes that the features upon which Applicant relies as an improvement - i.e., generating an output that is a pointer into an input sequence, generating an output sequence that consists of inputs from an input sequence, and generating output sequences for input sequences of varying lengths - are not recited in the rejected claims. The claims merely involve the steps of evaluating an input sequence to generate a respective representation; calculating a respective output for each evaluated position in the output sequence using the representation; and generating output sequences by determining whether to select a designated word/token as the output at the last position in the output sequence. These steps do not reflect generating an output that is a pointer into an input sequence, generating an output sequence that consists of inputs from an input sequence, or generating output sequences for input sequences of varying lengths. Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993). Additionally, according to the Background section of the specification, selecting output from a "fixed vocabulary" is a conventional approach to generating an output sequence.
Secondly, as analyzed in detail under the Step 2A, Prong 1 portion of the 35 U.S.C. 101 rejections above, evaluating an input sequence to generate a respective representation; calculating a particular output (respective output) for each evaluated position in the output sequence using the representation (encoder hidden state); and generating output sequences while determining whether to select a designated word/token as the output at the last position in the output sequence are mental processes involving observation, evaluation, and judgment that can be practically performed in the human mind or by a human using pen and paper, such as a user evaluating an input sequence, writing down a plurality of output sequences, and making a judgment regarding the end position in each sequence. The system and neural networks are recited at a high level of generality such that they amount to no more than merely using neural networks and a generic computer as tools to execute the abstract idea. Therefore, these additional elements do not integrate the judicial exception into a practical application and do not amount to significantly more than the judicial exception. Examiner notes that "the judicial exception alone cannot provide the improvement" (see MPEP § 2106.05(a)), and that "an improvement in the abstract idea itself is not an improvement in technology" (see MPEP § 2106.05(a)(II)). The recited claims involve, at most, an improvement to the abstract idea itself with the aid of neural networks and a generic computer.
Accordingly, Applicant’s arguments concerning the §101 rejections are not persuasive.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Applicant is required under 37 CFR 1.111(c) to consider these references fully when responding to this action.
Socher et al. (US 2017/0024645 A1) teaches: an input module 1200A, where a positional encoder 1208 implements the sentence reader 1202 and a bi-directional GRU 1206 comprises the input fusion layer 1204; each sentence encoding fi is the output of an encoding scheme taking the word tokens [w1 i, . . . , wM i i], where Mi is the length of the sentence; generates a sequence of one or more output labels responsive to the question; input sequence operation 310 trains on initial input sequences to obtain hidden states for words and sentences; the final outcome of the episodic memory is given to the answer module that is comprised of an RNN that can generate any sequence of output labels based upon the final outcome of the episodic memory and provide the sequence of output labels as the answer (see [0061], [0087], [0108], [0147]).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to REJI KARTHOLY whose telephone number is (571)272-3432. The examiner can normally be reached on Monday - Thursday from 7:30 am to 3:30 pm.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jennifer Welch, can be reached at telephone number 571-272-7212. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from Patent Center. Status information for unpublished applications is available through Patent Center for authorized users only. Should you have questions about access to Patent Center, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) Form at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form.
/REJI KARTHOLY/Primary Examiner, Art Unit 2143