Prosecution Insights
Last updated: April 19, 2026
Application No. 19/048,596

INVERSE PRE-FILTER FOR IMAGE AND VIDEO COMPRESSION

Status: Non-Final OA (§102)
Filed: Feb 07, 2025
Examiner: TARKO, ASMAMAW G
Art Unit: 2482
Tech Center: 2400 (Computer Networks)
Assignee: Tencent America LLC
OA Round: 1 (Non-Final)
Grant Probability: 72% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 0m
Grant Probability With Interview: 81%

Examiner Intelligence

Career Allow Rate: 72% (above average; 284 granted / 395 resolved; +13.9% vs TC avg)
Interview Lift: +9.3% (moderate; allow rate for resolved cases with an interview vs. without)
Typical Timeline: 3y 0m average prosecution; 24 applications currently pending
Career History: 419 total applications across all art units
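As a rough illustration of how the headline figures above relate, here is a back-of-the-envelope sketch. It assumes the +9.3% interview lift is additive in percentage points; the variable names are ours, not the tool's actual schema.

```python
# Back-of-the-envelope check of the examiner statistics shown above.
# Assumes the +9.3% interview lift is additive in percentage points.
granted, resolved = 284, 395
allow_rate = granted / resolved                 # ~0.719, displayed as 72%
interview_lift = 0.093                          # +9.3 percentage points
with_interview = allow_rate + interview_lift    # ~0.812, displayed as 81%
print(f"Career allow rate: {allow_rate:.1%}")   # prints "Career allow rate: 71.9%"
print(f"With interview:    {with_interview:.1%}")
```

Under that additive assumption, the displayed 72%, +9.3%, and 81% figures are mutually consistent after rounding.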

Statute-Specific Performance

§101: 3.4% (-36.6% vs TC avg)
§103: 58.2% (+18.2% vs TC avg)
§102: 23.9% (-16.1% vs TC avg)
§112: 4.4% (-35.6% vs TC avg)
Tech Center averages are estimates. Based on career data from 395 resolved cases.
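Purely as an arithmetic check, the Tech Center averages implied by the table above can be back-calculated as the examiner percentage minus the displayed delta. This is our own reading of the displayed figures; as the caption notes, the TC averages are estimates.

```python
# Back-calculating the implied Tech Center averages from the statute
# table above (examiner % minus displayed delta). Estimates only.
examiner = {"101": 3.4, "103": 58.2, "102": 23.9, "112": 4.4}
delta = {"101": -36.6, "103": 18.2, "102": -16.1, "112": -35.6}
tc_avg = {k: round(examiner[k] - delta[k], 1) for k in examiner}
print(tc_avg)  # every implied TC average here works out to 40.0
```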

Office Action (§102)

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 05/28/2025 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by LIM et al. (US 20220248006 A1, hereinafter “LIM”).

Regarding claim 1.
LIM discloses a method of video decoding performed at a computing system having memory and one or more processors (0008), the method comprising: receiving a video bitstream comprising a current picture (0008 and 0184; Figure 2; “[0008] In the present disclosure, provided is a video decoding method includes acquiring adaptation parameter sets including an adaptive in-loop filter (ALF) set including a plurality of ALFs, determining an adaptation parameter set applied to a current picture or slice and including an ALF set applied to the current picture or slice, from among the adaptive parameter sets, ...”, “[0184] The decoding apparatus 200 may receive a bitstream output from the encoding apparatus 100. The decoding apparatus 200 may receive a bitstream stored in a computer readable recording medium, or may receive a bitstream that is streamed through a wired/wireless transmission medium. ...”); obtaining a filtered current picture by applying at least one in-loop filter to the current picture (0279 and 0301; Figures 6-7; “[0279] … the in-loop filtering method includes deblocking filtering, sample adaptive offset (SAO), bilateral filtering, and adaptive in-loop filtering, etc.” and “[0301] … since intra prediction or motion compensation is performed on the subsequent picture to be encoded/decoded by referring to a reconstructed picture that has undergone the adaptive in-loop filtering, coding efficiency of the subsequent picture as well as the coding efficiency of the current picture that has undergone in-loop filtering ...”); and obtaining a reconstructed current picture by applying an inverse pre-filter to the filtered current picture (0274, 0800-0801 and 1031; Figures 6, 59 and 61; “[0274] Inverse mapping in a dynamic range may be performed for a luma component reconstructed through intra prediction or inter prediction before in-loop filtering. The dynamic range may be divided into 16 equal pieces and the mapping function for each piece may be signaled. … In-loop filtering, reference picture storage, and motion compensation are performed in an inverse mapped region, and a prediction block generated through inter prediction is converted into a mapped region via mapping using the mapping function, and then used for generating the reconstructed block. ...”, “[0800] … slice_alf_aps_id_luma[i] may refer to an adaptation parameter set for adaptive in-loop filtering among the adaptation parameter sets referred to in the picture header or the slice header. In addition, the picture header or the slice header may include coding information indicating an adaptation parameter set for adaptive in-loop filtering applicable to the picture or the slice. Here, the temporal layer identifier of the adaptation parameter set having adaptation_parameter_set_id indicated by the coding information may be less than or equal to the temporal layer identifier of the current picture.”, and “[1031] In addition, a luma component picture sample array alfPictureL changed and reconstructed after adaptive in-loop filtering is performed may be output according to an adaptive in-loop filtering process. ...”).

Regarding claim 2.

LIM discloses the method of claim 1, wherein an output of the in-loop filter, without any inverse pre-filtering, is used for in-loop filtering (0274 and 0643; Figures 6 and 37; “[0274] ... In-loop filtering, reference picture storage, and motion compensation are performed in an inverse mapped region, and a prediction block generated through inter prediction is converted into a mapped region via mapping using the mapping function, and then used for generating the reconstructed block.
However, since the intra prediction is performed in the mapped region, the prediction block generated via the intra prediction may be used for generating the reconstructed block without mapping/inverse mapping.”, “[0643] In addition, filtering target samples include an unavailable sample located outside the CTU or CTB boundaries, at least one of the deblocking filtering, the adaptive sample offset, and the adaptive in-loop filtering is not performed on the unavailable sample and the unavailable sample is used as it is for the filtering.”).

Regarding claim 3.

LIM discloses the method of claim 1, wherein the reconstructed current picture output by the inverse pre-filter is used for in-loop filtering (0174-0176; Figure 1; “[0174] … the encoding apparatus 100 may reconstruct or decode the encoded current image, ...”, “[0176] A reconstructed block may pass through the filter unit 180. The filter unit 180 may apply at least one of a deblocking filter, a sample adaptive offset (SAO), and an adaptive in-loop filter (ALF) to a reconstructed sample, a reconstructed block or a reconstructed image. The filter unit 180 may be called as an in-loop filter.”).

Regarding claim 4.

LIM discloses the method of claim 1, wherein the at least one in-loop filter comprises a plurality of in-loop filters, and wherein the inverse pre-filter is arranged after a first in-loop filter of the plurality of in-loop filters and before a second in-loop filter of the plurality of in-loop filters (0094, 0291-0292, 0303, 0678 and 1256; Figures 1, 6-8, 37 and 92-93; “[0094] … acquiring adaptation parameter sets including an adaptive in-loop filter (ALF) set including a plurality of ALFs, determining an adaptation parameter set applied to a current picture or slice and including an ALF set applied to the current picture or slice, ...”, “[0291] … the in-loop filtering may be performed such that the adaptive in-loop filtering, the sample adaptive offset, and the deblocking filtering are sequentially applied …”, “[0292] … a decoded picture refers to the output from the in-loop filtering or post-processing filtering performed on a reconstructed picture composed of reconstructed blocks ...”, “[0303] … at least one of the filtering methods is performed as post-processing filtering after a decoding process is performed. ... When the Wiener filter is used after the decoding process, the Wiener filter is applied to a reconstructed/decoded picture before the reconstructed/decoded picture is output (i.e., displayed). ...”, “[1256] … with respect to the samples that have undergone at least one of the deblocking filtering, the sample adaptive offset, and the bi-directional filtering among the reconstructed/decoded samples within the current picture, the adaptive in-loop filtering is performed on the reconstructed/decoded samples within the current picture using L filters without performing the block classification. Here, L is a positive integer.”).

Regarding claim 5.
LIM discloses the method of claim 1, wherein the inverse pre-filter is a fixed filter having fixed parameters (0655; Figure 37; “[0655] … the filter information include at least one piece of information selected from among information on whether luminance component filtering is performed, .. information on whether CU-based filtering is performed, information on whether a filter of a previous reference picture is used, a filter index of a previous reference picture, information on whether a fixed filter is used for a block class index, index information for a fixed filter, ...”).

Regarding claim 6.

LIM discloses the method of claim 1, wherein one or more coefficients for the inverse pre-filter are signaled in the video bitstream (0172-0173, 0298 and 0729; FIGS. 56-61; “[0172] A coding parameter may include information (flag, index, etc.) such as syntax element that is encoded in an encoder and signaled to a decoder, and information derived when performing encoding or decoding. … information of whether or not a secondary transform is used, a primary transform index, a secondary transform index, information of whether or not a residual signal is present, a coded block pattern, a coded block flag (CBF), a quantization parameter, a quantization parameter residue, a quantization matrix, whether to apply an intra loop filter, an intra loop filter coefficient, an intra loop filter tab, an intra loop filter shape/form, whether to apply a deblocking filter, a deblocking filter coefficient, a deblocking filter tab, a deblocking filter strength, a deblocking filter shape/form, whether to apply an adaptive sample offset, an adaptive sample offset value, an adaptive sample offset category, an adaptive sample offset type, whether to apply an adaptive in-loop filter, an adaptive in-loop filter coefficient, an adaptive in-loop filter tab, an adaptive in-loop filter shape/form, … a last significant coefficient flag, a coded flag for a unit of a coefficient group, a position of the last significant coefficient, a flag for whether a value of a coefficient is larger than 1, a flag for whether a value of a coefficient is larger than 2, a flag for whether a value of a coefficient is larger than 3, information on a remaining coefficient value, a sign information, a reconstructed luma sample, a reconstructed chroma sample, a residual luma sample, a residual chroma sample, a luma transform coefficient …”, “[0173] Herein, signaling the flag or index may mean that a corresponding flag or index is entropy encoded and included in a bitstream by an encoder, and may mean that the corresponding flag or index is entropy decoded from a bitstream by a decoder.”, “[0298] The filter type means a filter selected from among a Wiener filter, a low-pass filter, a high-pass filter, a linear filter, a non-linear filter, and a bidirectional filter.”, “[0729] FIGS. 56 to 61 show examples of syntax element information necessary for adaptive in-loop filtering. At least one of syntax elements necessary for adaptive in-loop filter may be entropy-encoded/decoded in at least one of a parameter set, a header, a brick, a CTU or a CU.”).

Regarding claim 7.

LIM discloses the method of claim 6, wherein the inverse pre-filter is a Wiener filter (0298-0303; Figures 7-55; “[0298] The filter type means a filter selected from among a Wiener filter, a low-pass filter, a high-pass filter, a linear filter, a non-linear filter, and a bidirectional filter.”).

Regarding claim 8.

LIM discloses the method of claim 6, wherein the inverse pre-filter is an adaptive loop filter (ALF) (0283; Figure 6; “[0283] … in-loop filtering means adaptive in-loop filtering. ...”).

Regarding claim 9.

LIM discloses the method of claim 8, wherein the inverse pre-filter uses a same set of parameters as another ALF that is applied to one or more blocks of the current picture (0094 and 0736; Figures 1 and 56-61, “[0094] … a video decoding method includes acquiring adaptation parameter sets including an adaptive in-loop filter (ALF) set including a plurality of ALFs, determining an adaptation parameter set applied to a current picture or slice and including an ALF set applied to the current picture or slice, from among the adaptive parameter sets, determining an adaptation parameter set applied to a current coding tree block (CTB) and including an ALF set applied to the current CTB included in the current picture or slice, from the adaptation parameter set applied to the current picture or slice, and filtering the current CTB based on the ALF set of the determined adaptation parameter set applied to the current CTB, … ”, “[0736] … when the syntax element for adaptive in-loop filtering is entropy-encoded/decoded in an adaptation parameter set, adaptive in-loop filtering may be performed using a syntax element for adaptive in-loop filtering having the same syntax element value in the unit referring to the same adaptation parameter set.”).

Regarding claim 10.

LIM discloses the method of claim 8, wherein the inverse pre-filter uses a first set of parameters that is different than a second set of parameters used by another ALF that is applied to one or more blocks of the current picture (0094 and 0293-0294; Figures 1 and 7-55; “[0294] The in-loop filtering may be performed on a decoded picture that has undergone at least one of the in-loop filtering methods. For example, when at least one of the in-loop filtering methods is performed on a decoded picture which has undergone at least one of the other in-loop filtering methods, parameters used for the latter filtering method may be changed, and then the former filtering with the changed parameters may be performed ...”).
Regarding claim 11.

LIM discloses the method of claim 10, further comprising parsing, from the video bitstream, the first set of parameters (0094 and 0124-0125; Figure 1; “[0094] In the present disclosure, provided is a video decoding method includes acquiring adaptation parameter sets including an adaptive in-loop filter (ALF) set including a plurality of ALFs, determining an adaptation parameter set applied to a current picture or slice and including an ALF set applied to the current picture …”, “[0124] Parameter Set: corresponds to header information among a configuration within a bitstream. At least one of a video parameter set, a sequence parameter set, a picture parameter set, and an adaptation parameter set may be included in a parameter set. ...”, “[0125] Parsing: may mean determination of a value of a syntax element by performing entropy decoding, or may mean the entropy decoding itself.”).

Regarding claim 12.

LIM discloses the method of claim 1, wherein the inverse pre-filter comprises a rescaling filter (0145 and 0275; Figures 1 and 6; “[0275] … residual block may be converted into an inverse mapped region by performing scaling on the chroma … by scaling the residual block using the derived value, the residual block may be switched to the inverse mapped region. Then, chroma component block restoration, intra prediction, inter prediction, in-loop filtering, and reference picture storage may be performed in the inverse mapped area.”).

Regarding claim 13.

Encoding method claim 13 is drawn to the reverse of the corresponding decoding method claimed in claim 1. Therefore, encoding method claim 13 corresponds to decoding claim 1 and is rejected for the same reasons of anticipation as set forth above, in that they are used together.

Regarding claims 14-15.

Encoding method claims 14-15 are drawn to the reverse of the corresponding decoding methods claimed in claims 5-6. Therefore, encoding method claims 14-15 correspond to decoding claims 5-6 and are rejected for the same reasons of anticipation as set forth above.

Regarding claims 16-17.

Encoding method claims 16-17 are drawn to the reverse of the corresponding decoding methods claimed in claims 8-9. Therefore, encoding method claims 16-17 correspond to decoding claims 8-9 and are rejected for the same reasons of anticipation as set forth above.

Regarding claim 18.

LIM discloses the method of claim 13, further comprising signaling, via a video bitstream, whether to apply the inverse pre-filter at a decoding component (0172-0173; FIGS. 56-61; “[0172] A coding parameter may include information (flag, index, etc.) such as syntax element that is encoded in an encoder and signaled to a decoder, and information derived when performing encoding or decoding. … whether to apply an adaptive in-loop filter, …”, “[0173] Herein, signaling the flag or index may mean that a corresponding flag or index is entropy encoded and included in a bitstream by an encoder, and may mean that the corresponding flag or index is entropy decoded from a bitstream by a decoder.”).

Regarding claim 19.

LIM discloses the method of claim 13, further comprising signaling, via a video bitstream, the encoded current picture (0174 and 0257; Figures 56-61; “[0174] … an encoded current image may be used as a reference image for another image that is processed afterwards. Accordingly, the encoding apparatus 100 may reconstruct or decode the encoded current image, or store the reconstructed or decoded image as a reference image in reference picture buffer 190.”, “[0257] The bitstream may include a reference picture index indicating a reference picture. The reference picture index may be entropy-encoded by the encoding apparatus 100 and then signaled as a bitstream to the decoding apparatus 200. The decoding apparatus 200 may generate a prediction block of the decoding target block based on the derived motion vector and the reference picture index information.”).

Regarding claim 20.

Claim 20 is directed to a non-transitory computer-readable storage medium (CRM) storing a bitstream generated by an encoding method. The claim does not recite that the CRM contains executable instructions that, when executed, implement the encoding method. The bitstream is a product produced by the encoding method. Therefore, the claims are not limited to the recited steps, only to the structure implied by the steps. (See MPEP §2113 - Product-by-Process claims.) Hence, the recited encoding method steps are given patentable weight only as to structures in the bitstream that are implied by the steps. To be given patentable weight, the CRM and the bitstream (i.e., descriptive material) must be in a functional relationship. A functional relationship can be found where the descriptive material performs some function with respect to the CRM with which it is associated. See MPEP §2111.05(I)(A). When a claimed “computer-readable medium merely serves as a support for information or data, no functional relationship exists”. MPEP §2111.05(III). The CRM storing the claimed bitstream in claim 20 merely serves as a support for the bitstream and provides no functional relationship between the stored bitstream and the CRM. Therefore, the bitstream structure, whose scope is implied by the method steps, is non-functional descriptive material and given no patentable weight. MPEP §2111.05(III). Thus, the claim scope is just a storage medium storing data and is anticipated by LIM, which recites a storage medium storing a bitstream (0044, 0727 and 1316; storing the encoded bitstream on a storage medium).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ASMAMAW TARKO whose telephone number is (571)272-9205.
The examiner can normally be reached Monday-Friday, 9:00 AM-5:00 PM EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chris Kelley, can be reached at (571) 272-7331. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ASMAMAW G TARKO/
Patent Examiner, Art Unit 2482
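For orientation only, the decoding flow recited in claim 1 (receive a bitstream carrying a current picture, apply at least one in-loop filter, then apply an inverse pre-filter) can be sketched as below. The filter bodies are placeholders we invented for illustration; they are not the application's or LIM's actual algorithms.

```python
# Illustrative sketch of the claim-1 pipeline; filter internals are
# placeholders, not the claimed or prior-art algorithms.
import numpy as np

def in_loop_filter(picture: np.ndarray) -> np.ndarray:
    """Stand-in for deblocking/SAO/ALF: simple 1-2-1 horizontal smoothing."""
    padded = np.pad(picture, ((0, 0), (1, 1)), mode="edge")
    return (padded[:, :-2] + 2 * padded[:, 1:-1] + padded[:, 2:]) / 4.0

def inverse_pre_filter(picture: np.ndarray, gain: float = 1.0) -> np.ndarray:
    """Stand-in inverse pre-filter: undoes an assumed encoder-side scaling."""
    return picture / gain

def decode_picture(current_picture: np.ndarray) -> np.ndarray:
    filtered = in_loop_filter(current_picture)   # "obtaining a filtered current picture"
    return inverse_pre_filter(filtered)          # "obtaining a reconstructed current picture"
```

The anticipation dispute is over where the inverse pre-filter sits relative to the in-loop filters (compare claims 2-4), not over either filter's internals.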
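Claims 6-7 recite an inverse pre-filter whose coefficients are signaled in the bitstream and which is a Wiener filter. As generic background only (not the application's or LIM's design), FIR Wiener taps are the least-squares solution of the normal equations R w = p, which can be sketched as:

```python
# Generic 1-D Wiener tap solve: find FIR taps w minimizing the mean-squared
# error between (w convolved with the degraded signal) and the original.
# Background illustration only, not the claimed filter.
import numpy as np

def wiener_taps(degraded: np.ndarray, original: np.ndarray, n_taps: int = 5) -> np.ndarray:
    n = len(degraded)
    # Biased autocorrelation of the degraded signal at lags 0..n_taps-1.
    r = np.array([np.dot(degraded[: n - k], degraded[k:]) for k in range(n_taps)]) / n
    # Cross-correlation between the original and the degraded signal.
    p = np.array([np.dot(original[k:], degraded[: n - k]) for k in range(n_taps)]) / n
    # Normal equations: Toeplitz autocorrelation matrix R times taps = p.
    R = np.array([[r[abs(i - j)] for j in range(n_taps)] for i in range(n_taps)])
    return np.linalg.solve(R, p)
```

In the claimed arrangement, an encoder would derive such taps and signal them in the bitstream (claim 6) for the decoder to apply; that signaling detail is outside this sketch.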

Prosecution Timeline

Feb 07, 2025: Application Filed
Feb 14, 2026: Non-Final Rejection (§102)
Mar 26, 2026: Interview Requested
Apr 15, 2026: Examiner Interview Summary
Apr 15, 2026: Applicant Interview (Telephonic)

Precedent Cases

Applications granted by the same examiner with similar technology

Patent 12529288: SYSTEMS AND METHODS FOR ESTIMATING RIG STATE USING COMPUTER VISION. Granted Jan 20, 2026 (2y 5m to grant).
Patent 12511768: METHOD AND APPARATUS FOR DEPTH IMAGE ENHANCEMENT. Granted Dec 30, 2025 (2y 5m to grant).
Patent 12506865: SYSTEMS AND METHODS FOR REDUCING A RECONSTRUCTION ERROR IN VIDEO CODING BASED ON A CROSS-COMPONENT CORRELATION. Granted Dec 23, 2025 (2y 5m to grant).
Patent 12498482: CAMERA APPARATUS. Granted Dec 16, 2025 (2y 5m to grant).
Patent 12469164: VEHICLE EXTERNAL DETECTION DEVICE. Granted Nov 11, 2025 (2y 5m to grant).
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 72% (81% with interview, +9.3%)
Median Time to Grant: 3y 0m
PTA Risk: Low
Based on 395 resolved cases by this examiner. Grant probability derived from career allow rate.
