Prosecution Insights
Last updated: April 19, 2026
Application No. 19/027,056

Video Decoding and Encoding

Status: Non-Final OA (§DP)
Filed: Jan 17, 2025
Examiner: SENFI, BEHROOZ M
Art Unit: 2482
Tech Center: 2400 (Computer Networks)
Assignee: Telefonaktiebolaget LM Ericsson (publ)
OA Round: 1 (Non-Final)
Grant Probability: 83% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 10m
Grant Probability with Interview: 93%

Examiner Intelligence

Career Allow Rate: 83% (858 granted / 1039 resolved) — above average, +24.6% vs TC avg
Interview Lift: +10.1% (moderate), measured over resolved cases with interview
Typical Timeline: 2y 10m average prosecution; 20 applications currently pending
Career History: 1059 total applications across all art units
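As a sanity check, the headline figures above are internally consistent: 858 grants out of 1039 resolved cases rounds to the displayed 83%, and adding the reported +10.1 percentage-point interview lift lands near the displayed 93%. A quick sketch of that arithmetic (the dashboard's exact rounding and methodology are assumptions here, not documented by the tool):

```python
# Reproducing the dashboard's headline figures from its raw counts.
# How the tool itself rounds and combines these numbers is an assumption.

granted, resolved = 858, 1039
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")   # about 82.6%, displayed as 83%

interview_lift = 0.101                          # +10.1 percentage points
print(f"With interview: {allow_rate + interview_lift:.1%}")  # about 92.7%, displayed as 93%
```

The displayed 93% is most simply the rounded 83% plus 10.1, so the half-point gap to the raw 92.7% is just rounding order.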

Statute-Specific Performance

§101: 7.4% (-32.6% vs TC avg)
§103: 42.6% (+2.6% vs TC avg)
§102: 21.1% (-18.9% vs TC avg)
§112: 9.3% (-30.7% vs TC avg)
Tech Center averages are estimates. Based on career data from 1039 resolved cases.
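The "vs TC avg" deltas can be inverted to recover the implied Tech Center baselines. Notably, every statute's implied baseline comes out to 40.0%, which suggests the deltas are computed against a single blended figure rather than true per-statute averages. The dashboard does not define what the per-statute rate measures, so treat this purely as delta arithmetic:

```python
# Back out the implied Tech Center baseline from each per-statute delta.
# The semantics of the underlying rate are not documented; this is arithmetic only.

examiner = {"101": 7.4, "103": 42.6, "102": 21.1, "112": 9.3}   # examiner rate, %
delta    = {"101": -32.6, "103": 2.6, "102": -18.9, "112": -30.7}  # vs TC avg

for statute in examiner:
    implied_tc_avg = examiner[statute] - delta[statute]
    print(f"Section {statute}: implied TC average {implied_tc_avg:.1f}%")
```

All four loop iterations print an implied TC average of 40.0%.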

Office Action

Rejection type: §DP (nonstatutory double patenting)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Double Patenting

2. The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

3. A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting, provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159; see MPEP § 2146 et seq. for applications not subject to those provisions. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

4. The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

5. The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

6. Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over patented claims 1-20 of U.S. Patent No. 12,262,020; patented claims 1-20 of U.S. Patent No. 11,831,878; and patented claims 1-3, 5-6, 8-10, and 14-15 of U.S. Patent No. 11,284,080, either alone or in combination. Although the claims at issue are not identical, the scope and novel features of the claims are the same and directed to the same invention, and are not patentably distinct from each other. It is noted that the claims of the instant application are broader than the corresponding patented claims. For example (instant application 19/027,056 mapped against the patented claims):

Instant claim 1: A method for decoding, from a bitstream, coded sample data of an encoded representation of a video sequence, comprising: receiving non-sample data for at least a first picture and a second picture; storing the non-sample data in a decoder memory that comprises a decoded picture buffer (DPB); receiving first coded sample data for the first picture; decoding the first coded sample data using the non-sample data to produce first decoded sample data; storing the first decoded sample data in the DPB; removing the first decoded sample data from the DPB; receiving second coded sample data for the second picture; keeping the non-sample data in the decoder memory based on a storage type indicator; decoding the second coded sample data using the non-sample data to produce second decoded sample data; and storing the second decoded sample data in the DPB.

US 12,262,020, claim 1: A method for decoding, from a bitstream, coded sample data of an encoded representation of a video sequence, comprising: receiving non-sample data for at least a first picture and a second picture; storing the non-sample data in a decoder memory that comprises a decoded picture buffer (DPB); receiving first coded sample data for the first picture; decoding the first coded sample data using the non-sample data to produce first decoded sample data; storing the first decoded sample data in the DPB; receiving second coded sample data for the second picture; removing the first decoded sample data from the DPB; keeping the non-sample data in the decoder memory based on an obtained storage type indicator; decoding the second coded sample data using the non-sample data to produce second decoded sample data; and storing the second decoded sample data in the DPB, wherein the non-sample data comprises Context Adaptive Binary Arithmetic Coding (CABAC) probability states, adaptive loop filtering (ALF) filter types, ALF parameters, ALF coefficients, coding tree structure parameters, scaling matrices, and/or quantization parameter (QP) modification parameters.

Instant claim 2: The method of claim 1, further comprising identifying, in the decoder memory, the non-sample data among non-sample data of different previously decoded pictures of the video sequence based on information retrieved from the bitstream.

US 12,262,020, claim 2: identical to instant claim 2.

Instant claim 3: The method of claim 2, further comprising: retrieving the information from the bitstream; and obtaining, based on the information retrieved from the bitstream, an index to a list of indicators to different non-sample data from the different previously decoded pictures, wherein identifying the non-sample data comprises identifying, in the decoder memory, the non-sample data among the non-sample data of the different previously decoded pictures based on the index.

US 12,262,020, claim 3: identical to instant claim 3.

Instant claim 4: The method of claim 1, further comprising: after decoding the second coded sample data using the non-sample data, removing the non-sample data from the decoder memory.

US 12,262,020, claim 4: identical to instant claim 4.

Instant claim 5: The method of claim 1, wherein the first decoded sample data comprises luma values, and the non-sample data does not comprise any luma values.

US 12,262,020, claim 5: The method of claim 1, wherein the first decoded sample data comprises luma values. US 11,831,878, claim 1: … said non-sample data does not comprise any luma value.

Instant claim 6: The method of claim 1, further comprising: marking the first decoded sample data as one of used for reference and unused for reference and marking the non-sample data as one of used for reference and unused for reference, wherein data marked as used for reference indicates that the data is available as reference data by a subsequently decoded picture of the video sequence and data marked as unused for reference indicates that the data is not available as reference data by any subsequently decoded picture of the video sequence, wherein removing the first decoded sample data comprises removing the first decoded sample data from the decoder memory if the first decoded sample data is marked as unused for reference and a previously decoded picture is not needed for output.

US 12,262,020, claim 6: identical to instant claim 6.

Instant claim 7: A method for encoding sample data, comprising: storing in an encoder memory non-sample data for at least a first picture and a second picture, the encoder memory comprising a decoded picture buffer (DPB); encoding first sample data for the first picture to produce first coded sample data; decoding the first coded sample data using the non-sample data to produce first decoded sample data; storing the first decoded sample data in the DPB; removing the first decoded sample data from the DPB; encoding second sample data for the second picture to produce second coded sample data; keeping the non-sample data in the decoder memory based on a storage type indicator; decoding the second coded sample data using the non-sample data to produce second decoded sample data; and storing the second decoded sample data in the DPB.

US 12,262,020, claim 7: A method for encoding sample data, comprising: storing in an encoder memory non-sample data for at least a first picture and a second picture, the encoder memory comprising a decoded picture buffer (DPB); encoding first sample data for the first picture to produce first coded sample data; decoding the first coded sample data using the non-sample data to produce first decoded sample data; storing the first decoded sample data in the DPB; encoding second sample data for the second picture to produce second coded sample data; removing the first decoded sample data from the DPB; keeping the non-sample data in the decoder memory based on a storage type indicator; decoding the second coded sample data using the non-sample data to produce second decoded sample data; and storing the second decoded sample data in the DPB, wherein the non-sample data comprises Context Adaptive Binary Arithmetic Coding (CABAC) probability states, adaptive loop filtering (ALF) filter types, ALF parameters, ALF coefficients, coding tree structure parameters, scaling matrices, and/or quantization parameter (QP) modification parameters.

Instant claim 8: The method of claim 7, further comprising: after decoding the second coded sample data using the non-sample data, removing the non-sample data from the decoder memory.

US 12,262,020, claim 8: identical to instant claim 8.

Instant claim 9: The method of claim 7, wherein the first decoded sample data comprises luma values, and the non-sample data does not comprise any luma values.

US 12,262,020, claim 9: The method of claim 7, wherein the first decoded sample data comprises luma values. US 11,831,878, claim 1: … said non-sample data does not comprise any luma value.

Instant claim 10: The method of claim 7, further comprising: marking the first decoded sample data as one of used for reference and unused for reference and marking the non-sample data as one of used for reference and unused for reference, wherein data marked as used for reference indicates that the data is available as reference data by a subsequently encoded picture of the video sequence and data marked as unused for reference indicates that the data is not available as reference data by any subsequently encoded picture of the video sequence, wherein removing the first decoded sample data comprises removing the first decoded sample data from the DPB if the first decoded sample data is marked as unused for reference and a previously reconstructed picture is not needed for output.

US 12,262,020, claim 10: identical to instant claim 10.

Instant claim 11: A decoder comprising: a processor; and a memory comprising instructions executable by the processor, wherein the decoder is configured to perform a method comprising: receiving non-sample data for at least a first picture and a second picture; storing the non-sample data in a decoder memory that comprises a decoded picture buffer (DPB); receiving first coded sample data for the first picture; decoding the first coded sample data using the non-sample data to produce first decoded sample data; storing the first decoded sample data in the DPB; removing the first decoded sample data from the DPB; receiving second coded sample data for the second picture; keeping the non-sample data in the decoder memory based on a storage type indicator; decoding the second coded sample data using the non-sample data to produce second decoded sample data; and storing the second decoded sample data in the DPB.

US 12,262,020, claim 11: A decoder comprising: a processor; and a memory comprising instructions executable by the processor, wherein the decoder is configured to perform a method comprising: receiving non-sample data for at least a first picture and a second picture; storing the non-sample data in a decoder memory that comprises a decoded picture buffer (DPB); receiving first coded sample data for the first picture; decoding the first coded sample data using the non-sample data to produce first decoded sample data; storing the first decoded sample data in the DPB; receiving second coded sample data for the second picture; removing the first decoded sample data from the DPB; keeping the non-sample data in the decoder memory based on an obtained storage type indicator; decoding the second coded sample data using the non-sample data to produce second decoded sample data; and storing the second decoded sample data in the DPB, wherein the non-sample data comprises Context Adaptive Binary Arithmetic Coding (CABAC) probability states, adaptive loop filtering (ALF) filter types, ALF parameters, ALF coefficients, coding tree structure parameters, scaling matrices, and/or quantization parameter (QP) modification parameters.

Instant claim 12: The decoder of claim 11, wherein the method further comprises identifying, in the decoder memory, the non-sample data among non-sample data of different previously decoded pictures of the video sequence based on information retrieved from the bitstream.

US 12,262,020, claim 12: identical to instant claim 12.

Instant claim 13: The decoder of claim 12, wherein the method further comprises: retrieving the information from the bitstream; and obtaining, based on the information retrieved from the bitstream, an index to a list of indicators to different non-sample data from the different previously decoded pictures, wherein identifying the non-sample data comprises identifying, in the decoder memory, the non-sample data among the non-sample data of the different previously decoded pictures based on the index.

US 12,262,020, claim 13: identical to instant claim 13.

Instant claim 14: The decoder of claim 11, wherein the method further comprises: after decoding the second coded sample data using the non-sample data, removing the non-sample data from the decoder memory.

US 12,262,020, claim 14: identical to instant claim 14.

Instant claim 15: The decoder of claim 11, wherein the first decoded sample data comprises luma values, and the non-sample data does not comprise any luma values.

US 12,262,020, claim 15: The decoder of claim 11, wherein the first decoded sample data comprises luma values. US 11,831,878, claim 1: … said non-sample data does not comprise any luma value.

Instant claim 16: The decoder of claim 11, wherein the method further comprises: marking the first decoded sample data as one of used for reference and unused for reference and marking the non-sample data as one of used for reference and unused for reference, wherein data marked as used for reference indicates that the data is available as reference data by a subsequently decoded picture of the video sequence and data marked as unused for reference indicates that the data is not available as reference data by any subsequently decoded picture of the video sequence, wherein removing the first decoded sample data comprises removing the first decoded sample data from the decoder memory if the first decoded sample data is marked as unused for reference and a previously decoded picture is not needed for output.

US 12,262,020, claim 16: identical to instant claim 16.

Instant claim 17: An encoder comprising: a processor; and a memory comprising instructions executable by the processor, wherein the encoder is configured to perform a method comprising: storing in an encoder memory non-sample data for at least a first picture and a second picture, the encoder memory comprising a decoded picture buffer (DPB); encoding first sample data for the first picture to produce first coded sample data; decoding the first coded sample data using the non-sample data to produce first decoded sample data; storing the first decoded sample data in the DPB; removing the first decoded sample data from the DPB; encoding second sample data for the second picture to produce second coded sample data; keeping the non-sample data in the decoder memory based on a storage type indicator; decoding the second coded sample data using the non-sample data to produce second decoded sample data; and storing the second decoded sample data in the DPB.

US 12,262,020, claim 17: An encoder comprising: a processor; and a memory comprising instructions executable by the processor, wherein the encoder is configured to perform a method comprising: storing in an encoder memory non-sample data for at least a first picture and a second picture, the encoder memory comprising a decoded picture buffer (DPB); encoding first sample data for the first picture to produce first coded sample data; decoding the first coded sample data using the non-sample data to produce first decoded sample data; storing the first decoded sample data in the DPB; encoding second sample data for the second picture to produce second coded sample data; removing the first decoded sample data from the DPB; keeping the non-sample data in the decoder memory based on a storage type indicator; decoding the second coded sample data using the non-sample data to produce second decoded sample data; and storing the second decoded sample data in the DPB, wherein the non-sample data comprises Context Adaptive Binary Arithmetic Coding (CABAC) probability states, adaptive loop filtering (ALF) filter types, ALF parameters, ALF coefficients, coding tree structure parameters, scaling matrices, and/or quantization parameter (QP) modification parameters.

Instant claim 18: The encoder of claim 17, wherein the method further comprises: after decoding the second coded sample data using the non-sample data, removing the non-sample data from the decoder memory.

US 12,262,020, claim 18: identical to instant claim 18.

Instant claim 19: The encoder of claim 17, wherein the first decoded sample data comprises luma values, and the non-sample data does not comprise any luma values.

US 12,262,020, claim 19: The encoder of claim 17, wherein the first decoded sample data comprises luma values.

Instant claim 20: The encoder of claim 17, wherein the method further comprises: marking the first decoded sample data as one of used for reference and unused for reference and marking the non-sample data as one of used for reference and unused for reference, wherein data marked as used for reference indicates that the data is available as reference data by a subsequently encoded picture of the video sequence and data marked as unused for reference indicates that the data is not available as reference data by any subsequently encoded picture of the video sequence, wherein removing the first decoded sample data comprises removing the first decoded sample data from the DPB if the first decoded sample data is marked as unused for reference and a previously reconstructed picture is not needed for output.

US 12,262,020, claim 20: identical to instant claim 20.

In view of the above, allowing claims 1-20 of the instant application would result in an unjustified or improper timewise extension of the "right to exclude" granted by a patent. See In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993).

Contact Information

7. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Behrooz Senfi, whose telephone number is (571) 272-7339. The examiner can normally be reached Monday-Friday, 10:00-6:00. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Christopher Kelley, can be reached at (571) 272-7331. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/BEHROOZ M SENFI/
Primary Examiner, Art Unit 2482
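The independent claims at issue recite a specific DPB-management pattern: a picture's decoded samples are evicted from the DPB while the associated non-sample data (CABAC states, ALF parameters, and so on) is retained or discarded according to a signalled storage type indicator, so a later picture can still be decoded against it. A minimal Python sketch of that bookkeeping, for orientation only (all names such as DecoderMemory, evict_picture, KEEP are hypothetical, not any real codec API):

```python
# Illustrative sketch of the claimed DPB bookkeeping. All identifiers here are
# hypothetical; no real decoder implementation or standard is being quoted.

from dataclasses import dataclass, field

KEEP = "keep"        # hypothetical values of the "storage type indicator"
DISCARD = "discard"

@dataclass
class DecoderMemory:
    dpb: dict = field(default_factory=dict)          # picture id -> decoded samples
    non_sample: dict = field(default_factory=dict)   # key -> (data, storage type)

    def evict_picture(self, pic_id: str, ns_key: str) -> None:
        """Remove a picture's decoded samples from the DPB; keep or drop the
        associated non-sample data per its storage type indicator."""
        self.dpb.pop(pic_id, None)
        _, storage_type = self.non_sample[ns_key]
        if storage_type != KEEP:
            del self.non_sample[ns_key]

def decode_picture(coded_bits: str, non_sample: str) -> str:
    # Stand-in for real entropy decoding / reconstruction.
    return f"decoded({coded_bits} using {non_sample})"

mem = DecoderMemory()
# Receive non-sample data (e.g. CABAC states, ALF params) covering both pictures.
mem.non_sample["ns0"] = ("CABAC states + ALF params", KEEP)

# First picture: decode using the non-sample data, store, then evict samples.
mem.dpb["pic1"] = decode_picture("pic1-bits", mem.non_sample["ns0"][0])
mem.evict_picture("pic1", "ns0")   # samples removed, non-sample data retained

# Second picture: reuse the retained non-sample data.
mem.dpb["pic2"] = decode_picture("pic2-bits", mem.non_sample["ns0"][0])
```

The double-patenting issue is visible even in this toy form: the instant claims cover the keep/evict pattern for any non-sample data, while the patented claims add the wherein clause pinning non-sample data to CABAC states, ALF parameters, scaling matrices, and the like.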

Prosecution Timeline

Jan 17, 2025
Application Filed
Mar 17, 2026
Non-Final Rejection — §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12581050: OPTICAL ASSEMBLIES FOR MACHINE VISION CALIBRATION (granted Mar 17, 2026; 2y 5m to grant)
Patent 12574493: DISPLAY DEVICE (granted Mar 10, 2026; 2y 5m to grant)
Patent 12568287: GENERATING THREE-DIMENSIONAL VIDEOS BASED ON TEXT USING MACHINE LEARNING MODELS (granted Mar 03, 2026; 2y 5m to grant)
Patent 12563170: INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND DISPLAY DEVICE (granted Feb 24, 2026; 2y 5m to grant)
Patent 12556676: IMAGE SENSOR, CAMERA AND IMAGING SYSTEM WITH TWO OR MORE FOCUS PLANES (granted Feb 17, 2026; 2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 83%
With Interview: 93% (+10.1%)
Median Time to Grant: 2y 10m
PTA Risk: Low
Based on 1039 resolved cases by this examiner. Grant probability derived from career allow rate.
