Prosecution Insights
Last updated: April 19, 2026
Application No. 18/387,125

SAMPLE ADAPTIVE OFFSET PARAMETER ESTIMATION FOR IMAGE AND VIDEO CODING

Final Rejection — §103
Filed: Nov 06, 2023
Examiner: FEREJA, SAMUEL D
Art Unit: 2487
Tech Center: 2400 — Computer Networks
Assignee: Texas Instruments Incorporated
OA Round: 4 (Final)

Grant Probability: 75% (Favorable)
Expected OA Rounds: 5-6
Time to Grant: 2y 8m
Grant Probability With Interview: 86%

Examiner Intelligence

Career Allow Rate: 75% — above average (458 granted / 614 resolved; +16.6% vs TC avg)
Interview Lift: +11.8% — moderate lift on resolved cases with interview
Avg Prosecution: 2y 8m typical timeline (66 currently pending)
Total Applications: 680 across all art units (career history)

Statute-Specific Performance

§101: 3.6% (-36.4% vs TC avg)
§103: 64.1% (+24.1% vs TC avg)
§102: 13.8% (-26.2% vs TC avg)
§112: 7.9% (-32.1% vs TC avg)
Based on career data from 614 resolved cases; Tech Center averages are estimates.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application is being examined under the pre-AIA first to invent provisions.

Status of the Claims

Claims 1-5, 7-15, and 17-22 are pending. Claims 6 and 16 are canceled.

Response to Arguments / Amendments

Rejections under 35 U.S.C. § 103: Applicant's arguments have been fully considered, but they are not persuasive; see the discussion below.

Applicant argues that Esenlik in view of Kenji fails to describe "replicating a pixel value across a slice boundary, much less 'replicating a value for a first pixel from a value for a second pixel, ... wherein the first pixel is across a slice boundary in an image from the second pixel,' and replicating a value for a first pixel from a value for a diagonally adjacent second pixel" as recited in amended claim 1.

As to the above argument, Esenlik discloses replicating a value for a first pixel from a value for a second pixel by categorizing a pixel into a category out of different categories according to its immediate neighborhood and applying to the pixel a category-dependent offset accordingly ([0098], FIG. 3), in which the first pixel is diagonally adjacent to the second pixel, employing categorization of the pixels in the neighborhood corresponding to a pixel such as pixel "c": four of them, namely 401 to 404, are one-dimensional patterns and two of them, namely 405 and 406, are two-dimensional patterns ([0098], FIG. 3, 401, 402, 403, 404, 405, and 406). Esenlik further discloses processing where the first pixel is across a slice boundary, using filter grid displacements with a frame partitioned into multiple slices ([0163]-[0168], FIGS. 28 to 31, [0170]).

In addition to Esenlik, Kenji teaches the first pixel is across a slice boundary in an image from the second pixel, using an arrangement in which the first adaptive filter for boundary 183 performs filter processing straddling slices on pixels to be processed which are near the slice boundary and regarding which pixels of the neighboring slice are included in the surrounding pixels, i.e., the first adaptive filter for boundary 183 performs filter processing using the pixels of the current slice and the neighboring slice, with a method such as shown in A ([0127], FIG. 7); the second adaptive filter for boundary 184 performs filter processing by generating dummy data as necessary, using the pixels of the current slice alone ([0126], FIG. 7).

[Image: media_image1.png]

The subject matter below was also supported in the provisional application of Esenlik, 61/498,841: "The basic concept of AO is to categorize pixels into different categories, and the pixel value offset between reconstructed pixels and original pixels of each category is calculated for compensation. It can reduce mean square error between reconstructed pixels and original pixels by adding an offset on reconstructed pixels. One of the adaptive offset methods is called Edge offset (EO) that classifies all pixels of a partition into multiple categories by comparing with neighboring pixels and compensates the average offset in each category. EO patterns are [shown in] Fig. 2"

[Image: media_image2.png]

It should be further noted that Applicant has not presented any specific arguments with regard to the rejections of the dependent claims. Accordingly, Examiner maintains the rejection in view of the above.

Claim Rejections - 35 USC § 103

The following is a quotation of pre-AIA 35 U.S.C. 103(a), which forms the basis for all obviousness rejections set forth in this Office action:

(a) A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102 of this title, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. Patentability shall not be negatived by the manner in which the invention was made.

Claims 1-5, 7-15, and 17-22 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Esenlik et al. (US 20140328413 A1, hereinafter Esenlik) in view of Kenji et al. (US 20120121188, hereinafter Kenji).

Regarding Claim 1, Esenlik discloses a device comprising a non-transitory computer readable medium including instructions that, when executed, cause one or more processors to: replicate a value for a first pixel from a value for a second pixel ([0098], FIG. 3, categorizing a pixel into a category out of different categories according to its immediate neighborhood and applying to the pixel a category-dependent offset accordingly), wherein the first pixel is diagonally adjacent to the second pixel ([0098], FIG. 3, 401, 402, 403, 404, 405, and 406 corresponding to a pixel "c" and pixels in its neighborhood, which are employed for categorization: four of them, namely 401 to 404, are one-dimensional patterns and two of them, namely 405 and 406, are two-dimensional patterns), and compare a value of a third pixel to the replicated value of the first pixel ([0098], FIG. 3, classifying all pixels of a partition or an image area into multiple categories by comparing them with neighboring pixels and compensating the average offset according to each category), wherein the third pixel is across the slice boundary from the first pixel ([0163]-[0168], FIGS. 28 to 31, filter grid displacements with a frame partitioned into multiple slices); and filter the third pixel in response to comparing the value of the third pixel to the replicated value of the first pixel ([0098], FIG. 4, six different example patterns 401, 402, 403, 404, 405, and 406 corresponding to a pixel "c" and pixels in its neighborhood, which are employed for categorization: four of them, namely 401 to 404, are one-dimensional patterns and two of them, namely 405 and 406, are two-dimensional patterns).

Esenlik further discloses processing where the first pixel is across a slice boundary, using filter grid displacements with a frame partitioned into multiple slices ([0163]-[0168], FIGS. 28 to 31, [0170]). However, Esenlik does not explicitly disclose wherein the first pixel is across a slice boundary in an image from the second pixel.

Kenji teaches the first pixel is across a slice boundary in an image from the second pixel ([0126], FIG. 7, first adaptive filter for boundary 183 performs filter processing straddling slices on pixels to be processed which are near the slice boundary and regarding which pixels of the neighboring slice are included in the surrounding pixels, i.e., the first adaptive filter for boundary 183 performs filter processing using the pixels of the current slice and the neighboring slice, with a method such as shown in A; [0127], FIG. 7, second adaptive filter for boundary 184 performs filter processing by generating dummy data as necessary, using the pixels of the current slice alone).

[Image: media_image1.png]

Therefore, it would have been obvious to one of ordinary skill in the art at the time of the invention to modify the teaching of the first pixel being across a slice boundary in an image from the second pixel, as taught by Kenji ([0126]), into the encoding and decoding system of Esenlik in order to provide systems for suppressing deterioration of the effects of filter processing due to local control of filter processing when encoding or when decoding (Kenji, [0013]).

Regarding Claim 2, Esenlik in view of Kenji discloses the device of claim 1. Esenlik discloses wherein the first pixel is adjacent to the third pixel ([0098], FIG. 3, six different example patterns 401, 402, 403, 404, 405, and 406 corresponding to a pixel "c" and pixels in its neighborhood, which are employed for categorization: four of them, namely 401 to 404, are one-dimensional patterns and two of them, namely 405 and 406, are two-dimensional patterns).

Regarding Claim 3, Esenlik in view of Kenji discloses the device of claim 1. Esenlik discloses the first pixel is vertically adjacent to the third pixel ([0098], FIG. 3, 401 to 406 corresponding to a pixel "c" and pixels in its neighborhood, which are employed for categorization, as above).

Regarding Claim 4, Esenlik in view of Kenji discloses the device of claim 1. Esenlik discloses the first pixel is horizontally adjacent to the third pixel ([0098], FIG. 3, 401 to 406 corresponding to a pixel "c" and pixels in its neighborhood, which are employed for categorization, as above).

[Image: media_image3.png]

Regarding Claim 5, Esenlik in view of Kenji discloses the device of claim 1. Esenlik discloses the first pixel is diagonally adjacent to the third pixel ([0098], FIG. 3, 401 to 406 corresponding to a pixel "c" and pixels in its neighborhood, which are employed for categorization, as above).

Regarding Claim 7, Esenlik in view of Kenji discloses the device of claim 1. Esenlik discloses wherein the second pixel is inside a coding unit, wherein the third pixel is inside the coding unit, and wherein the first pixel is not inside the coding unit ([0118], FIG. 13, filtering pipeline for LCU with four filtering regions (LCU1, LCU2, LCU3, and LCU4), each filtering region (LCU) having its own SAO parameter set: filter set 2, and SAO sets 2 for the top (horizontal) boundary of LCU3).

[Image: media_image4.png]

Regarding Claim 8, Esenlik in view of Kenji discloses the device of claim 1. Esenlik discloses wherein the instructions are executable by the one or more processors for further causing the one or more processors to replicate a value for a fourth pixel from a value for a fifth pixel, wherein the fourth pixel is across the slice boundary from the fifth pixel ([0163]-[0168], FIGS. 28 to 31, filter grid displacements with a frame partitioned into multiple slices individually decoded), wherein the instructions to compare the value of the third pixel to the replicated value of the first pixel comprise instructions to compare the value of the third pixel to the replicated value of the first pixel and to the replicated value of the fourth pixel ([0098], Table 410, five categories to which pixel "c" may belong when considering one of the one-dimensional patterns (masks) 401 to 404, in particular when considering the samples in the shaded (in FIG. 3) positions relative to sample "c"), and wherein the instructions to filter the third pixel comprise instructions to filter the third pixel in response to comparing the value of the third pixel to the replicated value of the first pixel and to the replicated value of the fourth pixel ([0098], FIG. 4, six different example patterns 401 to 406 corresponding to a pixel "c" and pixels in its neighborhood, which are employed for categorization, as above).

Regarding Claim 9, Esenlik in view of Kenji discloses the device of claim 1. Esenlik discloses wherein the instructions to compare the value of the third pixel to the replicated value of the first pixel comprise instructions to compare the value of the third pixel to the replicated value of the first pixel and to a value of a fourth pixel, wherein the third pixel is inside a coding unit, wherein the fourth pixel is inside the coding unit, and wherein the first pixel is not inside the coding unit ([0118], FIG. 13, filtering pipeline for LCU with four filtering regions (LCU1, LCU2, LCU3, and LCU4), each filtering region (LCU) having its own SAO parameter set: filter set 2, and SAO sets 2 for the top (horizontal) boundary of LCU3), and wherein the instructions to filter the third pixel comprise instructions to filter the third pixel in response to comparing the value of the third pixel to the replicated value of the first pixel and to the value of the fourth pixel ([0098], FIG. 4, six different example patterns 401 to 406 corresponding to a pixel "c" and pixels in its neighborhood, which are employed for categorization, as above).

[Image: media_image4.png]

Regarding Claim 10, Esenlik in view of Kenji discloses the device of claim 1. Esenlik discloses wherein the instructions to filter the third pixel comprise instructions to filter the third pixel using an edge offset sample adaptive offset filter in response to comparing the value of the third pixel to the replicated value of the first pixel ([0098], FIG. 3, Sample Adaptive Offset (SAO) processing such as edge offset (EO) classifies all pixels of a partition or an image area into multiple categories by comparing them with neighboring pixels and compensates the average offset according to each category).

Regarding Claims 11-15 and 17-20: method claims 11-15 and 17-20 recite use of the corresponding device claimed in claims 1-5 and 7-10, and the rejections of those claims are incorporated herein for the same reasons as used above.

Regarding Claims 21-22: system claims 21-22 recite use of the corresponding device claimed in claims 1 and 3-4, and the rejections of those claims are incorporated herein for the same reasons as used above.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. US 20120294353 A1: Sample adaptive offset (SAO), where each LCU uses its own SAO parameters.

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Samuel D Fereja, whose telephone number is (469) 295-9243. The examiner can normally be reached 8AM-5PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, DAVID CZEKAJ, can be reached at (571) 272-7327. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SAMUEL D FEREJA/
Primary Examiner, Art Unit 2487
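The edge-offset classification the rejection cites repeatedly (pixel "c" compared against two neighbors along one of four 1-D patterns, with five categories per Esenlik's Table 410) can be sketched in Python. This is a generic illustration of HEVC-style SAO edge offset combined with replicate padding at slice boundaries, not code from Esenlik, Kenji, or the application; `img`, `slice_id`, and all function names are hypothetical.

```python
def sign(x):
    return (x > 0) - (x < 0)

def eo_category(a, c, b):
    # Five HEVC-style edge-offset categories for pixel c with neighbors a, b:
    # 1 = local valley, 2/3 = concave/convex edge, 4 = local peak, 0 = none.
    return {-2: 1, -1: 2, 0: 0, 1: 3, 2: 4}[sign(c - a) + sign(c - b)]

def neighbor(img, slice_id, y, x, dy, dx):
    # When the neighbor falls outside the frame or in a different slice,
    # replicate the current pixel's value instead. This is one common padding
    # choice; the claim at issue recites a specific diagonal replication that
    # this generic sketch does not reproduce.
    ny, nx = y + dy, x + dx
    if not (0 <= ny < len(img) and 0 <= nx < len(img[0])) \
            or slice_id[ny][nx] != slice_id[y][x]:
        return img[y][x]
    return img[ny][nx]

def classify(img, slice_id, y, x, pattern=((0, -1), (0, 1))):
    # Default pattern is the horizontal 1-D mask; the vertical mask is
    # ((-1, 0), (1, 0)) and the diagonals are ((-1, -1), (1, 1)) and
    # ((-1, 1), (1, -1)), analogous to patterns 401-404.
    (dy0, dx0), (dy1, dx1) = pattern
    a = neighbor(img, slice_id, y, x, dy0, dx0)
    b = neighbor(img, slice_id, y, x, dy1, dx1)
    return eo_category(a, img[y][x], b)
```

For example, a pixel darker than both horizontal neighbors classifies as a local valley (category 1), while a pixel on the frame edge compares against its own replicated value on the missing side; a category-dependent offset would then be added to the reconstructed sample.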

Prosecution Timeline

Nov 06, 2023 — Application Filed
Aug 23, 2024 — Non-Final Rejection — §103
Nov 27, 2024 — Response Filed
Feb 26, 2025 — Final Rejection — §103
Jun 02, 2025 — Request for Continued Examination
Jun 06, 2025 — Response after Non-Final Action
Aug 15, 2025 — Non-Final Rejection — §103
Nov 18, 2025 — Response Filed
Jan 24, 2026 — Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597264 — Method for Calibrating an Assistance System of a Civil Motor Vehicle
2y 5m to grant; granted Apr 07, 2026
Patent 12598318 — METHOD AND SYSTEM-ON-CHIP FOR PERFORMING MEMORY ACCESS CONTROL WITH LIMITED SEARCH RANGE SIZE DURING VIDEO ENCODING
2y 5m to grant; granted Apr 07, 2026
Patent 12593018 — SYSTEM AND METHOD FOR CONTROLLING PERCEPTUAL THREE-DIMENSIONAL ELEMENTS FOR DISPLAY
2y 5m to grant; granted Mar 31, 2026
Patent 12593036 — METHOD AND APPARATUS FOR PROCESSING VIDEO SIGNAL
2y 5m to grant; granted Mar 31, 2026
Patent 12591123 — METHOD FOR DETERMINING SLOPE OF SLIDE IN SLIDE SCANNING DEVICE, METHOD FOR CONTROLLING SLIDE SCANNING DEVICE AND SLIDE SCANNING DEVICE USING THE SAME
2y 5m to grant; granted Mar 31, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 75%
With Interview: 86% (+11.8%)
Median Time to Grant: 2y 8m
PTA Risk: High
Based on 614 resolved cases by this examiner. Grant probability derived from career allow rate.
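The headline figures above can be reproduced from the career numbers on this page. A minimal check, assuming the interview lift is additive in percentage points (an assumption; the page does not state how the with-interview figure is derived):

```python
granted, resolved = 458, 614            # career totals from this page
allow_rate = granted / resolved         # ~0.746

print(round(allow_rate * 100))          # 75 -> "Grant Probability"

interview_lift = 11.8                   # percentage points, from "Interview Lift"
print(round(allow_rate * 100 + interview_lift))  # 86 -> "With Interview"
```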
