Prosecution Insights
Last updated: April 19, 2026
Application No. 18/513,581

IMAGE ENCODING AND DECODING, VIDEO ENCODING AND DECODING: METHODS, SYSTEMS AND TRAINING METHODS

Status: Final Rejection (§DP)
Filed: Nov 19, 2023
Examiner: CARTER, RICHARD BRUCE
Art Unit: 2485
Tech Center: 2400 — Computer Networks
Assignee: InterDigital VC Holdings, Inc.
OA Round: 2 (Final)

Grant Probability: 64% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 3y 1m
With Interview: 85%

Examiner Intelligence

Career Allow Rate: 64% of resolved cases (290 granted / 453 resolved; +6.0% vs TC avg)
Interview Lift: +20.9% (strong), comparing resolved cases with an interview vs. without
Typical Timeline: 3y 1m avg prosecution; 12 currently pending
Career History: 465 total applications across all art units

Statute-Specific Performance

§101: 6.1% (-33.9% vs TC avg)
§103: 60.3% (+20.3% vs TC avg)
§102: 8.2% (-31.8% vs TC avg)
§112: 11.0% (-29.0% vs TC avg)
Based on career data from 453 resolved cases; Tech Center averages are estimates.
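The "vs TC avg" deltas let the Tech Center baseline be recovered arithmetically (examiner rate minus reported delta). A quick sanity check using only the figures above; notably, all four baselines come out to 40.0%, which suggests the dashboard compares every statute against a single estimated average:

```python
# (examiner_rate, delta_vs_tc_avg) per statute, taken from the table above
stats = {
    "101": (6.1, -33.9),
    "103": (60.3, +20.3),
    "102": (8.2, -31.8),
    "112": (11.0, -29.0),
}

# Tech Center average = examiner's rate minus the reported delta
tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in stats.items()}
print(tc_avg)  # every statute maps to 40.0
```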

Office Action

§DP
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Examiner agrees with applicant's argument (on page 8) with respect to the 35 U.S.C. § 101 rejection. Therefore, the previous 35 U.S.C. § 101 rejection has been withdrawn with respect to claim 18.

Response to Amendment

3. The applicant's amendment received on 11/28/2025, in which claim 18 was amended and claims 19-20 were newly added, has been fully considered and entered, but the arguments are moot in view of the new ground(s) of rejection.

Double Patenting

4. The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory obviousness-type double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on a nonstatutory double patenting ground, provided the conflicting application or patent either is shown to be commonly owned with this application or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. Effective January 1, 1994, a registered attorney or agent of record may sign a terminal disclaimer. A terminal disclaimer signed by the assignee must fully comply with 37 CFR 3.73(b).

5. Claims 1-20 are rejected on the ground of nonstatutory obviousness-type double patenting as being unpatentable over claims 1-16 of U.S. Patent No. 11,677,948 B2. Although the conflicting claims at issue are not identical, they are not patentably distinct from each other, as the side-by-side comparison of U.S. Patent No. 11,677,948 B2 and the instant application (18/513,581) below shows. (Note: bold and underlined fonts mark features shared between the instant application and the conflicting application.)
Conflicting Application: 17/740,716 → now U.S. Patent No. 11,677,948 B2

Claim 1: A computer implemented method of training a neural network for use in lossy image or video compression, the method comprising:
(i) receiving an input image;
(ii) encoding the input image using a first neural network to produce a latent representation, and decoding the latent representation using a second neural network to produce a reconstruction of the input image;
(iii) producing a first feature map, associated with the input image;
(iv) producing a second feature map, associated with the reconstruction of the input image;
(v) evaluating a function based on differences between the first feature map and the second feature map;
(vi) evaluating a gradient of the function;
(vii) back-propagating the gradient of the function through the first neural network and the second neural network to update the weights of the first neural network and the second neural network;
(viii) repeating steps (i) to (vii) to produce a trained first neural network and a trained second neural network.
Instant Application: 18/513,581

Claim 1: A computer implemented method of training a first neural network and a second neural network, the neural networks being for use in lossy image or video compression, transmission and decoding, the method including the steps of:
(i) receiving an input training image;
(ii) encoding the input training image using the first neural network, to produce a latent representation;
(iii) quantizing the latent representation to produce a quantized latent;
(iv) using the second neural network to produce an output image from the quantized latent, wherein the output image is an approximation of the input image;
(v) evaluating a loss function based on differences between the output image and the input training image;
(vi) evaluating a gradient of the loss function;
(vii) back-propagating the gradient of the loss function through the second neural network and through the first neural network, to update weights of the second neural network and of the first neural network;
(viii) repeating steps (i) to (vii) using a set of training images, to produce a trained first neural network and a trained second neural network; and
(ix) storing the weights of the trained first neural network and of the trained second neural network;
wherein the loss function is a weighted sum of a rate term and a distortion term, wherein split quantisation is used during the evaluation of the gradient of the loss function, with a combination of two quantisation proxies for the rate term and the distortion term.

However, examiner notes that Munkberg et al. (U.S. Patent No. 11,475,542 B2) teaches the unique limitations in the instant application regarding a computer implemented method (see fig. 1B) of training a neural network (see col. 6, lines 6-7) for use in lossy image or video compression (see fig. 1A), the method comprising: (i) receiving an input image (see fig. 2G, e.g. “input image”); and (ii) encoding the input image (see fig. 2G, e.g. “input image”) using a first neural network (see fig. 1A, unit 110); and a second neural network (see fig. 2D, unit 210 or unit 220). Therefore, it would have been obvious before the effective filing date of the claimed invention for a person having ordinary skill in the art to which the claimed invention pertains to recognize the advantage of providing a video compression training method and system by modifying Besenbruch’s teachings in U.S. Patent No. 11,677,948 such that split quantisation is used during the evaluation of the gradient of the loss function, with a combination of two quantisation proxies for the rate term and the distortion term, thereby improving compression efficiency.

Allowable Subject Matter

6. The following is a statement of reasons for the indication of allowable subject matter: Claims 1-20 of the instant application would be allowable provided the obviousness-type double patenting rejection above is overcome.

Conclusion

7. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Kim et al. (US Pub. No. 2020/0364574 A1) discloses a neural network model apparatus and a compressing method of a neural network model. Abadi et al. (US Pub. No. 2019/0171929 A1) discloses encoding and reconstructing inputs using neural networks.

8. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.

9. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Richard Carter, whose telephone number is (571) 270-1220. The examiner can normally be reached M-F, 8:30 am - 5:00 pm. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jay Patel, can be reached at 571-272-2988. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/R.B.C/
Examiner, Art Unit 2485

/JAYANTI K PATEL/
Supervisory Patent Examiner, Art Unit 2485

January 8, 2026
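The training method recited in the instant claim 1 (steps (i)-(ix)) can be sketched as a toy, dependency-free training loop. This is an illustrative reconstruction, not the applicant's implementation: the scalar "networks", the squared-latent rate proxy, and every hyperparameter are invented for the demo. The point it shows is split quantisation, i.e. using two different proxies for the non-differentiable round() during the gradient evaluation: additive uniform noise for the rate term and hard rounding with a straight-through estimate for the distortion term.

```python
import random

def train_toy(steps=300, lr=0.01, lam=4.0, seed=0):
    """Toy sketch of the claimed training loop with scalar 'networks'."""
    rng = random.Random(seed)
    we, wd = 2.0, 0.4                         # weights of encoder / decoder "networks"
    loss = 0.0
    for _ in range(steps):
        x = rng.uniform(-1.0, 1.0)            # (i) input training image (scalar stand-in)
        y = we * x                            # (ii) encode to a latent representation
        # (iii) split quantisation: two proxies for the non-differentiable round()
        y_noise = y + rng.uniform(-0.5, 0.5)  # noise proxy, feeds the rate term
        y_hard = float(round(y))              # hard round, feeds the distortion term
        x_hat = wd * y_hard                   # (iv) decode the quantized latent
        # (v) loss = weighted sum of a rate term and a distortion term
        rate = y_noise ** 2                   # crude stand-in for a bitrate estimate
        dist = (x_hat - x) ** 2
        loss = rate + lam * dist
        # (vi)-(vii) manual gradients; straight-through: d(y_hard)/dy ~= 1
        g_xhat = 2.0 * lam * (x_hat - x)
        g_wd = g_xhat * y_hard
        g_y = g_xhat * wd + 2.0 * y_noise     # distortion path (STE) + rate path
        g_we = g_y * x
        we -= lr * g_we                       # back-propagated weight updates
        wd -= lr * g_wd
    return we, wd, loss                       # (ix) return ("store") trained weights
```

Calling `we, wd, final_loss = train_toy()` runs steps (i)-(vii) over a stream of random scalar "images" (step (viii)) and returns the trained weights. In practice the two proxies serve different roles: noise keeps the rate estimate differentiable and unbiased, while the straight-through hard round lets the distortion term see the quantisation the decoder will actually face.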

Prosecution Timeline

Nov 19, 2023: Application Filed
May 21, 2025: Non-Final Rejection (§DP)
Nov 28, 2025: Response Filed
Jan 07, 2026: Final Rejection (§DP) (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591126: APPARATUS AND METHODS FOR REAL-TIME IMAGE GENERATION (granted Mar 31, 2026; 2y 5m to grant)
Patent 12578567: APPARATUS AND METHODS FOR REAL-TIME IMAGE GENERATION (granted Mar 17, 2026; 2y 5m to grant)
Patent 12568224: EVC DECODING COMPLEXITY METRICS (granted Mar 03, 2026; 2y 5m to grant)
Patent 12563233: CTU-ROW BASED GEOMETRIC TRANSFORM (granted Feb 24, 2026; 2y 5m to grant)
Patent 12563173: HEAD-MOUNTED DEVICE FOR DISPLAYING PROJECTED IMAGES (granted Feb 24, 2026; 2y 5m to grant)

Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 64%
With Interview: 85% (+20.9%)
Median Time to Grant: 3y 1m
PTA Risk: Moderate
Based on 453 resolved cases by this examiner. Grant probability derived from career allow rate.
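The headline projections can be reproduced from the counts the report itself cites (290 granted / 453 resolved, +20.9-point interview lift). A minimal sketch of the apparent derivation, assuming the lift is simply added to the base rate, which matches 64% + 20.9% rounding to 85%:

```python
granted, resolved = 290, 453          # examiner's career counts (from the report)
interview_lift = 0.209                # +20.9 points with an interview

allow_rate = granted / resolved       # career allow rate
with_interview = allow_rate + interview_lift

print(f"Grant probability: {allow_rate:.0%}")      # 64%
print(f"With interview:    {with_interview:.0%}")  # 85%
```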
