Prosecution Insights
Last updated: April 19, 2026
Application No. 18/344,201

DEVICE AND METHOD WITH FLEXIBLE NEURAL NETWORK

Non-Final OA — §101, §112
Filed: Jun 29, 2023
Examiner: AKINTOLA, OLABODE
Art Unit: 3691
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Samsung Electronics Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 50% (Moderate)
Expected OA Rounds: 1-2
Median Time to Grant: 4y 2m
With Interview: 59%

Examiner Intelligence

Career Allow Rate: 50% (375 granted / 748 resolved; -1.9% vs TC avg)
Interview Lift: +9.1% (moderate) — allow rate for resolved cases with vs. without an interview
Avg Prosecution: 4y 2m typical timeline; 36 applications currently pending
Total Applications: 784 across all art units
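The headline figures above are simple arithmetic on the examiner's career counts. A quick sketch reproduces them, assuming the +9.1% interview lift is an additive percentage-point difference (the page implies but does not state this, and it rounds the results to 50% and 59%):

```python
# Reproducing the Examiner Intelligence figures from the raw counts shown
# above. Treating the +9.1% interview lift as additive is an assumption.
granted, resolved = 375, 748

allow_rate = 100 * granted / resolved   # career allow rate, %
with_interview = allow_rate + 9.1       # assumed additive percentage-point lift

print(f"Career allow rate: {allow_rate:.1f}%")      # ~50.1%
print(f"With interview:    {with_interview:.1f}%")  # ~59.2%
```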

Statute-Specific Performance

§101: 35.2% (-4.8% vs TC avg)
§103: 33.9% (-6.1% vs TC avg)
§102: 10.2% (-29.8% vs TC avg)
§112: 10.1% (-29.9% vs TC avg)
Deltas are relative to the Tech Center average estimate • Based on career data from 748 resolved cases
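As a back-of-envelope consistency check on the per-statute figures (my own arithmetic, not from the page): subtracting each "vs TC avg" delta from the examiner's rate should recover the same Tech Center baseline for every statute.

```python
# Back out the TC-average baseline implied by each statute's delta.
# (rate, delta) pairs are the percentages shown above.
stats = {
    "101": (35.2, -4.8),
    "103": (33.9, -6.1),
    "102": (10.2, -29.8),
    "112": (10.1, -29.9),
}

for statute, (rate, delta) in stats.items():
    baseline = rate - delta  # examiner rate minus delta = implied TC average
    print(f"§{statute}: implied TC avg {baseline:.1f}%")
```

All four statutes back out to the same ~40% baseline, which suggests the deltas were computed against a single Tech Center average.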

Office Action

§101 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 4-6 and 14-16 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claims 4-5, 15, and 16 recite the limitation "the operating of the layer of the neural network model" in lines 2-3. There is insufficient antecedent basis for this limitation in the claims; the claims should recite "operation."

Claim 6 recites the limitation "for the post-processing of the result value of the operation" in lines 2 and 4. There is insufficient antecedent basis for "the result value" in the claim.

Claim 14 recites the limitation "the device" in lines 2 and 4. There is insufficient antecedent basis for this limitation in the claim.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 12 and 15-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (abstract idea) without significantly more.

Analysis

Claim 12: Ineligible.

STEP 1: The claim recites a series of acts and, as such, falls within a statutory category of invention (Step 1: YES).

STEP 2A (PRONG 1): The claim is analyzed to determine whether it is directed to a judicial exception. The claim recites the limitations: storing a weight for an operation of a layer of a neural network model; generating setting information for performing the operation of the layer by the neural network model using the stored weight; receiving input data for the operation based on the generated setting information; performing the operation of the layer based on the received input data; post-processing a result value of the performing of the operation; and storing the result value of the operation. These limitations, as drafted, are processes that, under their broadest reasonable interpretation, can be performed as a mental process (that is, "observation, evaluation, judgment, opinion") or, in the alternative, as mathematical concepts. These limitations fall under the "mental processes" and/or "mathematical concepts" groupings (Step 2A1: Yes).

STEP 2A (PRONG 2): For purposes of compact prosecution, the Examiner provides an additional alternative analysis for each of the remaining steps below. Next, the claim is analyzed to determine whether it is integrated into a practical application. The claims do not recite any additional elements and so do not integrate the abstract idea into a practical application. The claim is directed to the abstract idea (Step 2A2: No). In the alternative, even if the "method" incorporates certain hardware components that constitute additional elements (which is not necessarily implied here), such components are recited at a high level of generality, i.e., as a generic computing system performing the generic computer function of processing data.
These system components are no more than mere instructions to apply the exception using generic computer components. Accordingly, they do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to the abstract idea (Step 2A2: No).

STEP 2B: Next, the claim is analyzed to determine whether there are additional claim limitations that, individually or as an ordered combination, ensure that the claim amounts to significantly more than the abstract idea (i.e., whether the claim provides an inventive concept). As discussed with respect to Step 2A2 above, the absence of additional claim elements renders this Step 2B analysis moot. In the alternative, as discussed with respect to the alternative Step 2A2 above, the additional elements in the claim amount to no more than mere instructions to apply the exception using generic computer components. The same analysis applies here in alternative Step 2B: mere instructions to apply an exception using generic computer components cannot integrate a judicial exception into a practical application at alternative Step 2A or provide an inventive concept at alternative Step 2B. Viewing the limitations as an ordered combination adds nothing beyond considering the limitations individually. When viewed either individually or as an ordered combination, with or without the alternative analysis above, the additional limitations do not amount to a claim as a whole that is significantly more than the abstract idea itself. Therefore, the claim does not amount to significantly more than the recited abstract idea (Step 2B: No). The claim is not patent eligible.
Claims 15-20 recite:

wherein the operating of the layer of the neural network model comprises: in response to a correction range value being a predetermined first value, reducing a byte size of a digital value of the operation result; and in response to the correction range value being a predetermined second value, extending the byte size of the digital value of the operation result;

wherein the operating of the layer of the neural network model comprises moving a center value of the operation based on a center value movement range value;

wherein the post-processing of the result value of the operation comprises: performing a post-processing operation of any one or any combination of any two or more of pooling, batch normalization, activation, and output result bit conversion; and storing the result value of the operation obtained by converting a result of the post-processing operation based on the setting information;

wherein the post-processing operation is performed in response to a signal value for notifying that an output value generated in a specific cycle is valid being received;

wherein the receiving of the input data for the operation comprises converting the received input data related to data in a column direction; and

wherein the receiving of the input data for the operation comprises reformatting data while reusing the data through a shift buffer.

These limitations further narrow the abstract idea but are nonetheless part of the abstract idea identified in claim 12. The additional elements, as similarly analyzed for claim 12 above, do not integrate the abstract idea into a practical application. The claimed invention as a whole also does not amount to significantly more than the abstract idea. These claims are rejected under the same rationale as claim 12, supra.
For claim 12, the Examiner strongly suggests that each step of claim 12 recite the element that carries out the respective operation, similar to claim 1, as this could potentially overcome the aforementioned 35 U.S.C. 101 rejection. For example:

"A method comprising: storing and operating, by an operation module, a weight for an operation of a layer of a neural network model; generating, by a control module, setting information for performing the operation of the layer by the neural network model using the stored weight; receiving, by an input module, input data for the operation of the layer based on the generated setting information; receiving, by a merging module, operation results of the operation of the layer from the operation module and merging the received operation results of the layer; receiving, by a post-processing module, the merged operation results of the layer from the merging module and post-processing the received merged operation results of the layer; and converting and storing, by an output stream module, the post-processed operation results based on the generated setting information."

Allowable Subject Matter

Claims 1-3 and 7-11 are allowed. Claim 13 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

The following is a statement of reasons for the indication of allowable subject matter. The closest prior art, Vivekraja et al. (USPN 12,518,167), teaches an operation module configured to store and operate a weight for an operation of a layer of a neural network model (col. 2, lines 17-25); a control module configured to generate setting information for performing the operation of the layer by the neural network model using the stored weight (col. 2, lines 17-25); and an input module configured to receive input data for the operation of the layer based on the generated setting information (col. 2, lines 17-25).
The closest prior art fails to teach, inter alia: a merging module configured to receive operation results of the operation of the layer from the operation module and merge the received operation results of the layer; a post-processing module configured to receive the merged operation results of the layer from the merging module and post-process the received merged operation results of the layer; and an output stream module configured to convert and store the post-processed operation results based on the generated setting information.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

Ioffe et al. (US 11,893,485) teaches methods, systems, and apparatus, including computer programs encoded on computer storage media, for processing inputs using a neural network system that includes a batch normalization layer.

Kim et al. (US 11,836,463) teaches a neural network device including a shift register circuit, a control circuit, and a processing circuit. The shift register circuit includes registers configured to, in each of a series of cycles, transfer stored data to a next register and store new data received from a previous register in a current register. The control circuit is configured to sequentially input data of input activations included in an input feature map into the shift register circuit in a preset order. The processing circuit, which includes crossbar array groups that receive input activations from at least one of the registers and perform a multiply-accumulate (MAC) operation on the received input activations and weights, is configured to accumulate and add at least some operation results output from the crossbar array groups over a preset number of cycles to obtain an output activation in an output feature map.

Kwon et al. (USPAP 2022/0343147) teaches a neural network apparatus including: a first processing circuit and a second processing circuit each configured to perform a vector-by-matrix multiplication (VMM) operation on a weight and an input activation; a first register configured to store an output of the first processing circuit; an adder configured to add an output of the first register and an output of the second processing circuit; a second register configured to store an output of the adder; and an input circuit configured to input the same input activation to the first processing circuit and the second processing circuit and control the two processing circuits.

Lee et al. (USPAP 2022/0092394) teaches an apparatus with neural network operations.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to OLABODE AKINTOLA, whose telephone number is (571) 272-3629. The examiner can normally be reached Mon-Fri, 8:30a-6:00p. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Abhishek Vyas, can be reached at 571-270-1836. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/OLABODE AKINTOLA/
Primary Examiner, Art Unit 3691
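The examiner's suggested amendment maps each method step of claim 12 onto a hardware module. As a reading aid only, here is a minimal sketch of that dataflow; the function name, the toy settings, and the choice of ReLU as the post-processing step are all illustrative assumptions, not the actual claimed device:

```python
# Toy sketch of the module dataflow in the examiner's suggested claim 12
# amendment. All names and operations here are illustrative assumptions.

def run_layer(weight, inputs):
    stored_weight = weight                 # operation module: store the weight
    settings = {"scale": 1}                # control module: generate setting info
    received = [settings["scale"] * x for x in inputs]  # input module: receive data
    results = [stored_weight * x for x in received]     # operation module: operate
    merged = sum(results)                  # merging module: merge operation results
    post = max(0, merged)                  # post-processing module (ReLU here)
    return post                            # output stream module: convert and store

print(run_layer(2, [1, 2, 3]))  # 2*(1+2+3) = 12
```

The point of the suggested amendment is visible even in this toy: each step is tied to a distinct structural actor rather than recited as a bare mental or mathematical act.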

Prosecution Timeline

Jun 29, 2023
Application Filed
Feb 05, 2026
Non-Final Rejection — §101, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602694
SYSTEMS AND METHODS FOR MOBILE PRE-AUTHORIZATION OF A CREDIT TRANSACTION
2y 5m to grant — Granted Apr 14, 2026
Patent 12586128
SYSTEM AND METHOD FOR SMART ORDER ROUTING AND AUTOMATIC MARKET MAKER PATH DETERMINATION IN A DECENTRALIZED MARKET
2y 5m to grant — Granted Mar 24, 2026
Patent 12586059
CHAT-BASED TRANSACTION AUTOMATION SYSTEM
2y 5m to grant — Granted Mar 24, 2026
Patent 12572901
AUTOMATED TRANSACTION HANDLING USING SOFTWARE BOTS
2y 5m to grant — Granted Mar 10, 2026
Patent 12567113
SYSTEMS AND METHODS FOR MEASURING RELATIONSHIPS BETWEEN INVESTMENTS AND OTHER VARIABLES
2y 5m to grant — Granted Mar 03, 2026
Study what changed to get past this examiner, based on the 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 50%
With Interview: 59% (+9.1%)
Median Time to Grant: 4y 2m
PTA Risk: Low
Based on 748 resolved cases by this examiner. Grant probability derived from career allow rate.
