Prosecution Insights
Last updated: April 19, 2026
Application No. 18/784,695

IMAGE PROCESSING APPARATUS AND METHOD

Non-Final OA §DP
Filed: Jul 25, 2024
Examiner: ABRISHAMKAR, KAVEH
Art Unit: 2494
Tech Center: 2400 (Computer Networks)
Assignee: Sony Group Corporation
OA Round: 1 (Non-Final)

Grant Probability: 78% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 3m
With Interview: 95%

Examiner Intelligence

Career Allow Rate: 78% (797 granted / 1020 resolved), +20.1% vs TC average
Interview Lift: +16.9% (strong), measured across resolved cases with an interview
Typical Timeline: 3y 3m average prosecution, 27 applications currently pending
Career History: 1047 total applications across all art units

Statute-Specific Performance

§101: 12.4% (-27.6% vs TC avg)
§103: 39.7% (-0.3% vs TC avg)
§102: 22.4% (-17.6% vs TC avg)
§112: 9.6% (-30.4% vs TC avg)
Comparison baseline: Tech Center average estimate • Based on career data from 1020 resolved cases
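The per-statute deltas let you back out the comparison baseline. A quick sketch, assuming (as the "vs TC avg" labels suggest, though the dashboard does not document its formula) that each delta is simply the examiner's rate minus the Tech Center average:

```python
# Recover the implied Tech Center average from each statute row above,
# assuming delta = examiner_rate - tc_average (an inference from the
# "vs TC avg" labels, not a documented formula).
rows = {
    "§101": (12.4, -27.6),
    "§103": (39.7, -0.3),
    "§102": (22.4, -17.6),
    "§112": (9.6, -30.4),
}
for statute, (rate, delta) in rows.items():
    tc_avg = round(rate - delta, 1)
    print(f"{statute}: examiner {rate}% vs TC average {tc_avg}%")
```

Under that assumption every row implies the same 40.0% baseline, which suggests a single Tech Center estimate is applied across all four statutes.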

Office Action

§DP
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

1. This action is in response to the communication filed on July 25, 2024. Claims 1-20 were originally received for consideration. No preliminary amendments to the claims have been received.

2. Claims 1-20 are currently pending consideration.

Information Disclosure Statement

3. Initialed and dated copies of Applicant's IDS (form 1449), received on 7/25/2024, 11/11/2024, and 1/16/2025, are attached to this Office Action.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission.
For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

3. Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of U.S. Patent No. 12,081,761. Although the claims at issue are not identical, they are not patentably distinct from each other because the claims of the '761 patent render obvious the claims of the present application. The present application differs from the claims of the '761 patent in that the claims of the present application are directed towards the decoder-side operations while the '761 Patent is directed towards the encoder-side operations (see the claims of each reproduced below). The present application discloses generating syntax values and deriving transform coefficients by context decoding and bypass decoding, whereas the claims of the '761 patent derive these values by context encoding and bypass encoding. However, both sets of claims are directed towards the same underlying inventive concept of using context-coded bins for processing a target block, employing an upper limit value, and switching to bypass encoding/decoding if the current number of the context-coded bins is greater than the upper limit value. Therefore, the claims of the present application are rendered obvious by the claims of the '761 Patent, as the claims of the present application are the expected decoder-side counterparts of the encoder-side operations of the '761 Patent.

Present Application

1. A non-transitory computer-readable storage medium storing instructions which when executed by circuitry perform a method, the method comprising: setting an upper limit value of a number of context-coded bins that can be allocated to a processing target block, comprising a plurality of sub-blocks, on a basis of a size of the processing target block and a number of context-coded bins per sub-block; processing a sub-block of the processing target block; in a case that a current number of the context-coded bins is less than or equal to the upper limit value, deriving syntax element values for some levels of transform coefficients by context decoding and derive remaining syntax elements by bypass decoding; and in a case that the current number of the context-coded bins is greater than the upper limit value, deriving the syntax elements for the levels of the transform coefficients by bypass decoding.

2. The non-transitory computer-readable storage medium of claim 1, the method comprising: deriving a syntax element value for the processed sub-block so as not to exceed the upper limit value of the number of context-coded bins that can be allocated in an entire upper block.

3. The non-transitory computer-readable storage medium of claim 2, the method comprising: determining whether all sub-blocks of the processing target block have been processed.

4. The non-transitory computer-readable storage medium of claim 2, the method comprising: decoding the syntax element value derived and generate coded data.

5. The non-transitory computer-readable storage medium of claim 1, wherein the processing target block is a Coding Unit.

6. The non-transitory computer-readable storage medium of claim 1, wherein the processing target block is a Transform Unit.

7. An image processing apparatus comprising: circuitry configured to decode coded data and generate a syntax element value; set an upper limit value of a number of context-coded bins that can be allocated to a processing target block, comprising a plurality of sub-blocks, on a basis of a size of the processing target block and a number of context-coded bins per sub-block; derive coefficient data corresponding to image data regarding the processing target block by using the syntax element value that is generated; in a case that a current number of the context-coded bins is less than or equal to the upper limit value, derive syntax element values for some levels of transform coefficients by context decoding and derive remaining syntax elements by bypass decoding; and in a case that the current number of the context-coded bins is greater than the upper limit value, derive the syntax elements for the levels of the transform coefficients by bypass decoding.

8. The image processing apparatus of claim 7, wherein the circuitry is configured to derive a syntax element value for the processed sub-block so as not to exceed the upper limit value of the number of context-coded bins that can be allocated in an entire upper block.

9. The image processing apparatus of claim 8, wherein the circuitry is configured to determine whether all sub-blocks of the processing target block have been processed.

10. The image processing apparatus of claim 8, wherein the circuitry is configured to decode the syntax element value derived and generate coded data.

11. The image processing apparatus of claim 7, wherein the processing target block is a Coding Unit.

12. The image processing apparatus of claim 7, wherein the processing target block is a Transform Unit.

13. An image processing method comprising: decoding coded data and generate a syntax element value; setting an upper limit value of a number of context-coded bins that can be allocated to a processing target block, comprising a plurality of sub-blocks, on a basis of a size of the processing target block and a number of context-coded bins per sub-block; deriving coefficient data corresponding to image data regarding the processing target block by using the syntax element value that is generated; in a case that a current number of the context-coded bins is less than or equal to the upper limit value, deriving syntax element values for some levels of transform coefficients by context decoding and derive remaining syntax elements by bypass decoding; and in a case that the current number of the context-coded bins is greater than the upper limit value, deriving the syntax elements for the levels of the transform coefficients by bypass decoding.

14. The image processing method of claim 13, wherein the processing target block is a Coding Unit.

15. A non-transitory computer-readable storage medium storing instructions which when executed by circuitry perform a method, the method comprising: setting an upper limit value of a number of context-coded bins that can be allocated to a processing target block, comprising a plurality of sub-blocks, on a basis of a size of the processing target block and a number of context-coded bins per sub-block; processing a sub-block of the processing target block; in a case that a current number of the context-coded bins is less than or equal to the upper limit value, deriving syntax element values for some levels of transform coefficients from context-coded bins and derive remaining syntax elements from bypass-coded bins; and in a case that the current number of the context-coded bins is greater than the upper limit value, deriving the syntax elements for the levels of the transform coefficients from bypass-coded bins.

16. The non-transitory computer-readable storage medium of claim 15, wherein the processing target block is a Coding Unit.

17. An image processing apparatus comprising: circuitry configured to decode coded data and generate a syntax element value; set an upper limit value of a number of context-coded bins that can be allocated to a processing target block, comprising a plurality of sub-blocks, on a basis of a size of the processing target block and a number of context-coded bins per sub-block; derive coefficient data corresponding to image data regarding the processing target block by using the syntax element value that is generated; in a case that a current number of the context-coded bins is less than or equal to the upper limit value, derive syntax element values for some levels of transform coefficients from context-coded bins and derive remaining syntax elements from bypass-coded bins; and in a case that the current number of the context-coded bins is greater than the upper limit value, derive the syntax elements for the levels of the transform coefficients from bypass-coded bins.

18. The image processing apparatus of claim 17, wherein the processing target block is a Coding Unit.

19. An image processing method comprising: decoding coded data and generate a syntax element value; setting an upper limit value of a number of context-coded bins that can be allocated to a processing target block, comprising a plurality of sub-blocks, on a basis of a size of the processing target block and a number of context-coded bins per sub-block; deriving coefficient data corresponding to image data regarding the processing target block by using the syntax element value that is generated; in a case that a current number of the context-coded bins is less than or equal to the upper limit value, deriving syntax element values for some levels of transform coefficients from context-coded bins and derive remaining syntax elements from bypass-coded bins; and in a case that the current number of the context-coded bins is greater than the upper limit value, deriving the syntax elements for the levels of the transform coefficients from bypass-coded bins.

20. The image processing method of claim 19, wherein the processing target block is a Coding Unit.

U.S. Patent 12,081,761

1. An image processing apparatus comprising: circuitry configured to set an upper limit value of a number of context-coded bins that can be allocated to a processing target block, comprising a plurality of sub-blocks, on a basis of a size of the processing target block and a number of context-coded bins per sub-block; process a sub-block of the processing target block; in a case that a current number of the context-coded bins is less than or equal to the upper limit value, derive syntax element values for some levels of transform coefficients by context encoding and derive remaining syntax elements by bypass encoding; and in a case that the current number of the context-coded bins is greater than the upper limit value, derive the syntax elements for the levels of the transform coefficients by bypass encoding.

2. The image processing apparatus of claim 1, wherein the circuitry is further configured to determine whether all sub-blocks of the processing target block have been processed.

3. The image processing apparatus of claim 1, wherein the circuitry is further configured to encode the syntax element values derived and generate coded data.

4. The image processing apparatus of claim 1, wherein the processing target block is a Coding Unit.

5. The image processing apparatus of claim 1, wherein the processing target block is a Transform Unit.

6. The image processing apparatus of claim 1, wherein the upper limit value of a number of context coded bins of sub-blocks in the processing target block is 28.

7. An image processing method comprising: setting an upper limit value of a number of context-coded bins that can be allocated to a processing target block, comprising a plurality of sub-blocks, on a basis of a size of the processing target block and a number of context-coded bins per sub-block; processing a sub-block of the processing target block; in a case that a current number of the context-coded bins is less than or equal to the upper limit value, deriving syntax element values for some levels of transform coefficients by context encoding and derive remaining syntax elements by bypass encoding; and in a case that the current number of the context-coded bins is greater than the upper limit value, deriving the syntax elements for the levels of the transform coefficients by bypass encoding.

8. The image processing method of claim 7, further comprising: deriving a syntax element value for the processed sub-block so as not to exceed the upper limit value of the number of context-coded bins that can be allocated in an entire upper block.

9. The image processing method of claim 8, further comprising: determining whether all sub-blocks of the processing target block have been processed.

10. The image processing method of claim 8, further comprising: encoding the syntax element value derived and generate coded data.

11. The image processing method of claim 7, wherein the processing target block is a Coding Unit.

12. The image processing method of claim 7, wherein the processing target block is a Transform Unit.

13. A non-transitory computer-readable storage medium storing instructions which when executed by circuitry perform a method, the method comprising: setting an upper limit value of a number of context-coded bins that can be allocated to a processing target block, comprising a plurality of sub-blocks, on a basis of a size of the processing target block and a number of context-coded bins per sub-block; processing a sub-block of the processing target block; in a case that a current number of the context-coded bins is less than or equal to the upper limit value, deriving syntax element values for some levels of transform coefficients by context encoding and derive remaining syntax elements by bypass encoding; and in a case that the current number of the context-coded bins is greater than the upper limit value, deriving the syntax elements for the levels of the transform coefficients by bypass encoding.

14. The non-transitory computer-readable storage medium of claim 13, the method comprising: deriving a syntax element value for the processed sub-block so as not to exceed the upper limit value of the number of context-coded bins that can be allocated in an entire upper block.

15. The non-transitory computer-readable storage medium of claim 14, the method comprising: determining whether all sub-blocks of the processing target block have been processed.

16. The non-transitory computer-readable storage medium of claim 14, the method comprising: encoding the syntax element value derived and generate coded data.

17. The non-transitory computer-readable storage medium of claim 13, wherein the processing target block is a Coding Unit.

18. The non-transitory computer-readable storage medium of claim 13, wherein the processing target block is a Transform Unit.

19. An image processing apparatus comprising: circuitry configured to decode coded data and generate a syntax element value; set an upper limit value of a number of context-coded bins that can be allocated to a processing target block, comprising a plurality of sub-blocks, on a basis of a size of the processing target block and a number of context-coded bins per sub-block; derive coefficient data corresponding to image data regarding the processing target block by using the syntax element value that is generated; in a case that a current number of the context-coded bins is less than or equal to the upper limit value, derive syntax element values for some levels of transform coefficients by context encoding and derive remaining syntax elements by bypass encoding; and in a case that the current number of the context-coded bins is greater than the upper limit value, derive the syntax elements for the levels of the transform coefficients by bypass encoding.

20. An image processing method comprising: decoding coded data and generate a syntax element value; setting an upper limit value of a number of context-coded bins that can be allocated to a processing target block, comprising a plurality of sub-blocks, on a basis of a size of the processing target block and a number of context-coded bins per sub-block; deriving coefficient data corresponding to image data regarding the processing target block by using the syntax element value that is generated; in a case that a current number of the context-coded bins is less than or equal to the upper limit value, deriving syntax element values for some levels of transform coefficients by context encoding and derive remaining syntax elements by bypass encoding; and in a case that the current number of the context-coded bins is greater than the upper limit value, deriving the syntax elements for the levels of the transform coefficients by bypass encoding.
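The shared inventive concept recited on both sides of the comparison can be sketched in a few lines. This is an illustrative toy model only, not the claimed implementation or any codec specification: the function names, the one-context-bin-per-level accounting, and the default budget of 28 bins per sub-block (taken from '761 claim 6) are assumptions for exposition.

```python
def set_upper_limit(num_subblocks: int, bins_per_subblock: int = 28) -> int:
    """Upper limit of context-coded bins allocatable to the processing
    target block, set from its size (number of sub-blocks) and a
    per-sub-block bin budget, as in claim 1 of each claim set."""
    return num_subblocks * bins_per_subblock

def choose_coding_modes(levels, upper_limit: int):
    """For each transform-coefficient level, pick context coding while the
    running count of context-coded bins is within the limit, then fall back
    to bypass coding once it is exceeded (the claims' two "in a case that"
    branches). Assumes, purely for illustration, one context bin per level."""
    used = 0
    modes = []
    for _level in levels:
        if used <= upper_limit:
            modes.append("context")  # some syntax elements context-coded
            used += 1
        else:
            modes.append("bypass")   # remaining elements bypass-coded
    return modes, used
```

With a tiny budget, e.g. `choose_coding_modes(range(6), upper_limit=3)`, the early levels come out `"context"` and the tail `"bypass"`, mirroring the fallback behavior that both the decoder-side and encoder-side claims recite; whether the bins are produced by encoding or consumed by decoding, the budget logic is the same, which is the premise of the NSDP rejection.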
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to KAVEH ABRISHAMKAR whose telephone number is (571) 272-3786. The examiner can normally be reached M-F 9-5:30.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jung Kim, can be reached at 571-272-3804. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/KAVEH ABRISHAMKAR/
Primary Examiner, Art Unit 2494
03/23/2026

Prosecution Timeline

Jul 25, 2024
Application Filed
Mar 23, 2026
Non-Final Rejection — §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12598086: TOKENIZED INDUSTRIAL AUTOMATION SOFTWARE
2y 5m to grant · Granted Apr 07, 2026
Patent 12598216: SMALL-FOOTPRINT ENDPOINT DATA LOSS PREVENTION
2y 5m to grant · Granted Apr 07, 2026
Patent 12585761: SYSTEM AND METHOD FOR COMBINING CYBER-SECURITY THREAT DETECTIONS AND ADMINISTRATOR FEEDBACK
2y 5m to grant · Granted Mar 24, 2026
Patent 12585771: LEARNED CONTROL FLOW MONITORING AND ENFORCEMENT OF UNOBSERVED TRANSITIONS
2y 5m to grant · Granted Mar 24, 2026
Patent 12579280: SYSTEMS AND METHODS FOR VULNERABILITY SCANNING OF DEPENDENCIES IN CONTAINERS
2y 5m to grant · Granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 78%
With Interview: 95% (+16.9%)
Median Time to Grant: 3y 3m
PTA Risk: Low
Based on 1020 resolved cases by this examiner. Grant probability derived from career allow rate.
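The headline projections are consistent with simple arithmetic on the examiner's career figures. A minimal sketch, assuming (the dashboard does not state its exact formula) that grant probability is career grants divided by resolved cases, with the stated interview lift added on top:

```python
# Assumed derivation of the projection numbers from the career stats above.
granted, resolved = 797, 1020   # examiner's career history
interview_lift = 0.169          # stated +16.9% lift with an interview

allow_rate = granted / resolved                 # ~0.781
grant_probability = round(allow_rate * 100)     # displayed as 78%
with_interview = round((allow_rate + interview_lift) * 100)  # displayed as 95%
```

Both rounded values match the displayed 78% and 95%, so the projections appear to be direct restatements of the career allow rate rather than application-specific modeling.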
