Prosecution Insights
Last updated: April 19, 2026
Application No. 18/095,420

METHOD AND APPARATUS ENCODING/DECODING A NEURAL NETWORK FEATURE MAP

Non-Final OA: §101, §102, §103
Filed: Jan 10, 2023
Examiner: CHIUSANO, ANDREW TSUTOMU
Art Unit: 2144
Tech Center: 2100 (Computer Architecture & Software)
Assignee: Hanbat National University Industry-Academic Cooperation Foundation
OA Round: 1 (Non-Final)

Grant Probability: 55% (Moderate)
OA Rounds: 1-2
To Grant: 3y 2m
With Interview: 83%

Examiner Intelligence

Career Allow Rate: 55% (grants 55% of resolved cases; 217 granted / 392 resolved; at TC average)
Interview Lift: +28.0% among resolved cases with interview (strong)
Avg Prosecution: 3y 2m typical timeline; 22 applications currently pending
Total Applications: 414 across all art units (career history)

Statute-Specific Performance

§101: 12.7% (-27.3% vs TC avg)
§102: 10.7% (-29.3% vs TC avg)
§103: 57.4% (+17.4% vs TC avg)
§112: 13.6% (-26.4% vs TC avg)

Tech Center averages are estimates. Based on career data from 392 resolved cases.

Office Action

Rejections: §101, §102, §103
DETAILED ACTION

This Office Action is sent in response to Applicant's Communication received 1/1/2023 for application number 18/095,420. Claims 1-17 are pending.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-17 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Independent claim 1, representative of claims 16 and 17, recites:

A method of decoding a neural network feature, comprising: receiving a bitstream including an encoded feature, the encoded feature including one or more features extracted from at least one image; decoding the feature from the bitstream; and reconstructing features corresponding to a plurality of layers of a neural network based on the decoded feature.

(2A, prong 1) The underlined portions of the claim recite an abstract idea, specifically a mathematical calculation. Applicant's specification (as well as the dependent claims) states that reconstructing features requires calculating upsampled or downsampled features for a plurality of layers (see para. 0214-56 of spec. as published). The Examiner notes that although the claim does not explicitly recite an equation, the claim limitations are not merely based on a mathematical concept; "…a mathematical concept need not be expressed in mathematical symbols." See MPEP 2106.04(a)(2). Here, in order to reconstruct features corresponding to a plurality of layers of a neural network based on the decoded feature, arithmetic calculations (of upsampling or downsampling) must be performed.
(2A, prong 2) This judicial exception is not integrated into a practical application. The claims recite the additional limitations of (1) receiving and decoding a bitstream including features and (2) generic computer hardware (the apparatus units in claim 16, the non-transitory computer-readable medium in claim 17). Additional element (1) is insignificant extra-solution activity because it is mere necessary data gathering for the abstract idea; that is, this additional element merely functions to receive numerical values for use in the mathematical calculations. Additional element (2) is a mere instruction to apply the exception because it merely adds generic computer hardware to the abstract idea after the fact. Even when considered in combination with the abstract idea, the additional elements do not integrate the abstract idea into a practical application because they only add insignificant extra-solution activity and mere instructions to apply the exception to the mathematical calculations.

(2B) The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. Additional element (1) is well-understood, routine, and conventional, analogous to receiving or transmitting data over a network, e.g., using the Internet to gather data. See MPEP 2106.05(d), citing Intellectual Ventures v. Symantec, 838 F.3d 1307, 1321; 120 USPQ2d 1353, 1362 (Fed. Cir. 2016). Additional element (2) is a mere instruction to apply the exception, as explained above. Even when considered in combination with the abstract idea, the additional elements do not amount to significantly more than the abstract idea itself because they only add insignificant extra-solution activity that is well-understood, routine, and conventional and mere instructions to apply the exception to the mathematical calculations.
In other words, the claim as a whole is directed to receiving data for mathematical calculations and then performing the mathematical calculations on a computer, which does not amount to significantly more than the mathematical calculations themselves.

With respect to dependent claims 2-15, these claims add additional mathematical calculations to the mathematical calculations in the independent claims. Specifically, claims 2-7 recite that the reconstruction calculations are performed from the bottom layer to the top layer, with additional details on how to perform the calculations, and claims 8-15 recite that the reconstruction calculations are performed from the top layer to the bottom layer, with additional details on how to perform the calculations.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 8-11, and 16-17 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Ahn et al. (US 2024/0056575 A1).

In reference to claim 1, Ahn discloses a method (para. 0004) of decoding a neural network feature, comprising: receiving a bitstream including an encoded feature, the encoded feature including one or more features extracted from at least one image (bitstream of image feature map received, para. 0080-81, fig. 1); decoding the feature from the bitstream (feature map decoded, para. 0081-82, fig. 1); and reconstructing features corresponding to a plurality of layers of a neural network based on the decoded feature (decoder reconstructs feature map, para. 0082-83; the features correspond to a plurality of neural network blocks, the blocks being a layer of the neural network, para. 0088, 0152-57).

In reference to claim 8, Ahn discloses the method of claim 1, wherein the features corresponding to the plurality of layers are reconstructed according to a reconstruction order of a top-down structure (Ahn teaches reconstruction by reversing the quantization from the encoder, para. 0152-57, which is "top-down" because it starts with the output feature map, i.e., the top layer output, and dequantizes/upsamples to reconstruct the feature map).

In reference to claim 9, Ahn discloses the method of claim 8, wherein the top-down structure is a structure in which a feature corresponding to each layer is reconstructed in an order from a highest layer to a lowermost layer among the plurality of layers (Ahn teaches reconstruction by reversing the quantization from the encoder, para. 0152-57, which is "top-down" because it starts with the output feature map, i.e., the top layer output, and dequantizes/upsamples to reconstruct the feature map).
In reference to claim 10, Ahn discloses the method of claim 9, wherein reconstructing the features comprises: reconstructing, from the decoded feature, a first feature corresponding to a first layer among the plurality of layers; and reconstructing a second feature corresponding to a second layer among the plurality of layers based on at least one of the decoded feature or the first feature of the first layer, wherein the first feature of the first layer has a smaller size than the second feature of the second layer (dequantizing/decompressing/upsampling means there will be more features in lower layers as the reconstruction gets to the initial image, para. 0106-07, 0152-57).

In reference to claim 11, Ahn discloses the method of claim 10, wherein the decoded feature is set equal to the first feature corresponding to the first layer (Ahn would start with the decoded feature being the same as the first feature of the first layer, para. 0106-07, 0152-57).

In reference to claim 16, this claim is directed to an apparatus associated with the method claimed in claim 1 and is therefore rejected under a similar rationale.

In reference to claim 17, this claim is directed to a non-transitory computer-readable medium associated with the method claimed in claim 1 and is therefore rejected under a similar rationale.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 2-5 are rejected under 35 U.S.C. 103 as being unpatentable over Ahn et al. (US 2024/0056575 A1) in view of Zhao et al. (US 2024/0135491 A1).
In reference to claim 2, Ahn does not explicitly teach the method of claim 1, wherein the features corresponding to the plurality of layers are reconstructed according to a reconstruction order of a bottom-up structure. Zhao teaches the method of claim 1, wherein the features corresponding to the plurality of layers are reconstructed according to a reconstruction order of a bottom-up structure (Zhao is bottom-up reconstruction because it takes the input feature map and downsamples, fig. 2, para. 0062-82). It would have been obvious to one of ordinary skill in the art, having the teachings of Ahn and Zhao before the earliest effective filing date, to modify the top-down reconstruction as disclosed by Ahn to include the bottom-up reconstruction as taught by Zhao. One of ordinary skill in the art would have been motivated to modify the top-down reconstruction of Ahn to include the bottom-up reconstruction of Zhao because it can help reduce parameters and network complexity (Zhao, para. 0002-08).

In reference to claim 3, Ahn does not explicitly teach the method of claim 2, wherein the bottom-up structure is a structure in which a feature corresponding to each layer is reconstructed in order from a lowest layer to a highest layer among the plurality of layers. Zhao teaches the method of claim 2, wherein the bottom-up structure is a structure in which a feature corresponding to each layer is reconstructed in order from a lowest layer to a highest layer among the plurality of layers (Zhao is bottom-up reconstruction because it takes the input feature map and downsamples, fig. 2, para. 0062-82). It would have been obvious to one of ordinary skill in the art, having the teachings of Ahn and Zhao before the earliest effective filing date, to modify the top-down reconstruction as disclosed by Ahn to include the bottom-up reconstruction as taught by Zhao. One of ordinary skill in the art would have been motivated to modify the top-down reconstruction of Ahn to include the bottom-up reconstruction of Zhao because it can help reduce parameters and network complexity (Zhao, para. 0002-08).

In reference to claim 4, Ahn does not explicitly teach the method of claim 3, wherein reconstructing the features comprises: reconstructing, from the decoded feature, a first feature corresponding to a first layer among the plurality of layers; and reconstructing a second feature corresponding to a second layer among the plurality of layers based on at least one of the decoded feature or the first feature of the first layer, wherein the first feature of the first layer has a larger size than the second feature of the second layer. Zhao teaches the method of claim 3, wherein reconstructing the features comprises: reconstructing, from the decoded feature, a first feature corresponding to a first layer among the plurality of layers; and reconstructing a second feature corresponding to a second layer among the plurality of layers based on at least one of the decoded feature or the first feature of the first layer, wherein the first feature of the first layer has a larger size than the second feature of the second layer (see fig. 2: a first layer like branch 2 is larger than a second layer like branch 3). It would have been obvious to one of ordinary skill in the art, having the teachings of Ahn and Zhao before the earliest effective filing date, to modify the top-down reconstruction as disclosed by Ahn to include the bottom-up reconstruction as taught by Zhao. One of ordinary skill in the art would have been motivated to modify the top-down reconstruction of Ahn to include the bottom-up reconstruction of Zhao because it can help reduce parameters and network complexity (Zhao, para. 0002-08).
In reference to claim 5, Ahn does not explicitly teach the method of claim 4, wherein the first feature is reconstructed by upsampling the decoded feature, and wherein the upsampling is performed based on one of nearest neighbor interpolation or pixel shuffle. Zhao teaches the method of claim 4, wherein the first feature is reconstructed by upsampling the decoded feature, and wherein the upsampling is performed based on one of nearest neighbor interpolation or pixel shuffle (for branch 2, upsampling can be performed using nearest neighbor before downsampling, para. 0071, fig. 2). It would have been obvious to one of ordinary skill in the art, having the teachings of Ahn and Zhao before the earliest effective filing date, to modify the top-down reconstruction as disclosed by Ahn to include the bottom-up reconstruction as taught by Zhao. One of ordinary skill in the art would have been motivated to modify the top-down reconstruction of Ahn to include the bottom-up reconstruction of Zhao because it can help reduce parameters and network complexity (Zhao, para. 0002-08).

Claims 6-7 and 12-14 are not rejected under 35 U.S.C. § 102 or § 103.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. References C, U, V, and W (see Notice of References Cited) teach encoding and decoding feature maps.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Andrew T. Chiusano, whose telephone number is (571) 272-5231. The examiner can normally be reached M-F, 10am-6pm. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Tamara Kyle, can be reached at 571-272-4241.
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ANDREW T CHIUSANO/
Primary Examiner, Art Unit 2144

Prosecution Timeline

Jan 10, 2023
Application Filed
Mar 06, 2026
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596767: ACTIVE LEARNING DRIFT ANALYSIS AND TRAINING (2y 5m to grant; granted Apr 07, 2026)
Patent 12591771: DYNAMIC QUANTIZATION FOR ENERGY EFFICIENT DEEP LEARNING (2y 5m to grant; granted Mar 31, 2026)
Patent 12561045: CONTENT-BASED MENUS FOR TABBED USER INTERFACE (2y 5m to grant; granted Feb 24, 2026)
Patent 12547927: DETECTING ASSOCIATED EVENTS (2y 5m to grant; granted Feb 10, 2026)
Patent 12541686: METHOD AND APPARATUS WITH NEURAL ARCHITECTURE SEARCH BASED ON HARDWARE PERFORMANCE (2y 5m to grant; granted Feb 03, 2026)
Study what changed in these applications to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 55% (83% with interview, a +28.0% lift)
Median Time to Grant: 3y 2m
PTA Risk: Low

Based on 392 resolved cases by this examiner. Grant probability derived from career allow rate.
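As a sanity check on the projections above, the headline numbers are mutually consistent: the 55% baseline is the examiner's career allow rate (217 granted / 392 resolved), and the 83% figure is that baseline plus the reported +28.0 percentage-point interview lift. The sketch below reproduces this arithmetic; the variable names are illustrative assumptions, not the tool's actual model.

```python
# Sketch reproducing the dashboard's headline arithmetic.
# Figures come from the "Examiner Intelligence" panel above;
# the combination rule (simple additive lift) is an assumption.

granted = 217    # career grants ("217 granted / 392 resolved")
resolved = 392   # career resolved cases

# Baseline grant probability = career allow rate.
allow_rate = granted / resolved                # 217/392 ≈ 0.5536, shown as 55%

# Reported interview lift, treated as an additive percentage-point bump.
interview_lift = 0.28                          # "+28.0% interview lift"
with_interview = allow_rate + interview_lift   # ≈ 0.8336, shown as 83%

print(f"Baseline grant probability: {allow_rate:.0%}")      # prints 55%
print(f"With interview:            {with_interview:.0%}")   # prints 83%
```

Note that the rounded display values (55% and 83%) differ by exactly the 28-point lift only because both raw values round the same way; the tool's internal model may differ.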
