Prosecution Insights
Last updated: April 19, 2026
Application No. 18/651,397

INDIRECTIVE TEMPORAL FILTERING FOR ADAPTIVE JOINED PARAMETER SMOOTHING

Non-Final OA — §102, §112

Filed: Apr 30, 2024
Examiner: CAMMARATA, MICHAEL ROBERT
Art Unit: 2667
Tech Center: 2600 — Communications
Assignee: Samsung Electronics Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 70% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 4m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 70%, above average (213 granted / 305 resolved; +7.8% vs TC avg)
Interview Lift: +35.9%, a strong lift among resolved cases with an interview
Avg Prosecution: 2y 4m (46 currently pending)
Total Applications: 351 across all art units
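The headline figures above can be reproduced from the counts shown; a minimal sketch (the with/without-interview split behind the +35.9% lift is not given on this page, so only the career rate and its implied TC baseline are computed):

```python
# Reproduce the examiner's headline figures from the counts shown above.
granted, resolved = 213, 305

allow_rate = granted / resolved      # career allow rate
tc_avg = allow_rate - 0.078          # implied TC average, from the +7.8% delta

print(f"Career allow rate: {allow_rate:.1%}")   # the dashboard rounds this to 70%
print(f"Implied TC average: {tc_avg:.1%}")
```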

Statute-Specific Performance

§101: 4.5% (-35.5% vs TC avg)
§103: 45.8% (+5.8% vs TC avg)
§102: 21.1% (-18.9% vs TC avg)
§112: 24.6% (-15.4% vs TC avg)
Tech Center averages are estimates • Based on career data from 305 resolved cases
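As a consistency check, the four per-statute rate/delta pairs above each imply the same Tech Center baseline; a quick sketch:

```python
# Each (examiner rate, delta vs TC avg) pair shown above implies a TC baseline.
stats = {
    "§101": (4.5, -35.5),
    "§103": (45.8, +5.8),
    "§102": (21.1, -18.9),
    "§112": (24.6, -15.4),
}
for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta            # recover the implied TC average, in percent
    print(f"{statute}: examiner {rate}% vs TC avg ~{tc_avg:.1f}%")
# All four pairs recover the same ~40.0% Tech Center baseline.
```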

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Specification

The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed. The following title is suggested: Temporal Smoothness Filter With Percentile And Threshold Control To Reduce Frame Delay.

Drawings

The drawings are objected to because Figs. 2A-C, 3A-C, 4A-C, 5A-C, and 9 are too small in scale to clearly illustrate the inventive features, including the images and graphs; see also the subscripts of Figs. 6-9, all of which are poor-quality reproductions that do not have satisfactory reproduction characteristics, contrary to 37 CFR 1.84(l). In addition, Fig. 6 should be designated by a legend such as --Prior Art-- because only that which is old is illustrated, as per [0038] of the instant published application.

Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d).
If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

In general, the claims have been abstracted from the disclosure to such an extent that they have lost clear meaning. In other words, divorcing the claims of nearly all context, such as to what the percentiles refer and what an indirect input may mean, results in the claims being highly indefinite and unclear.

In more detail, claim 1 recites “utilizing indirect input including percentiles to control one or more filter coefficients of the adaptive filter.” The term “indirect input” is unclear because it is not understood how indirect input may differ from direct input. Furthermore, it is not understood from the claim what is meant by this indirect input including “percentiles”. Moreover, to what do the “percentiles” refer; percentiles of what entity are being recited?
Claim 1 also recites “controlling, based on the percentiles at different percentages, at least two of the one or more filter coefficients together to synchronize smoothing for the image output”. What is meant by “percentiles at different percentages”, and how does that relate to a filter control scheme based thereon? The term “synchronize smoothing” is also unclear, as it is not specified what is being synchronized. In further regard to claim 1, what is being filtered? No input is claimed; only the “image output” is recited, without the filter having any clearly recited input.

Independent claims 8 and 15 are parallel to claim 1 and employ the same indefinite language outlined above.

Claims 2, 9, and 16 recite “wherein the adaptive filter is configured to adapt the one or more filter coefficients based on a frame feature difference to reduce frame delay”, but there is no claimed entity that determines the “frame feature difference”. What are these frames, and how do they relate to the unclaimed (see above) input image signal? What is the frame delay, and how does it relate to anything else in the claim? These issues are compounded by a distinct lack of any input to the filter.

Claims 3, 10, and 17 recite “wherein the indirect input of percentiles to control the one or more filter coefficients avoids direct control of the one or more filter coefficients by at least one of a filter input or a filter output”. See the rejection of claim 1 above regarding indirect control. It is also not understood what is meant by indirect input of percentiles and how such “indirect input” results in “avoiding” direct control.

Claims 5, 12, and 19 recite “wherein upon the frame feature difference exceeding a threshold, the IIR filter with a first coefficient value is used to reduce sudden changes in curve parameters”.
Claims 6, 13, and 20 recite “wherein upon the frame feature difference being less than the threshold, the IIR filter with a second coefficient value is used to provide that the curve parameters change to reduce the frame delay”. Again, there is no claim element that determines a frame feature difference, and it is unclear what this frame feature difference may involve since, for example, no input is claimed. The “curve parameters” are also wholly divorced from any context, leaving their meaning wholly unclear. It is also unclear to recite properties of the IIR filter without having some element actually change or alter the IIR filter. The theme of failing to recite any input signal also continues in these claims by obtusely referring to the IIR filter being “used” in an unclear context of reducing sudden changes in unclear curve parameters of an unnamed signal having no clear relation to anything in the claim.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Nadernejad {E. Nadernejad, C. Mantel, N. Burini and S.
Forchhammer, "Flicker reduction in LED-LCDs with local backlight," 2013 IEEE 15th International Workshop on Multimedia Signal Processing (MMSP), Pula, Italy, 2013, pp. 312-316, doi: 10.1109/MMSP.2013.6659307}.

Claim 1

In regards to claim 1, Nadernejad discloses a computer-implemented method {see section III Experiments, in which the algorithm was implemented on a computer using software to assess the performance of the method on 8 Full-HD video sequences, with Fig. 5 illustrating the results thereof} comprising:

applying an adaptive filter in a conversion function that provides temporal smoothness for image output {see Section II Flicker Reduction Using IIR Filter, which reduces flicker artifacts by temporally smoothing the change in the backlight image};

utilizing indirect input including percentiles to control one or more filter coefficients of the adaptive filter {Section II, including equations 2-4, while noting that maximum and average values are percentiles and are used to control/affect the filter coefficients}; and

controlling, based on the percentiles at different percentages, at least two of the one or more filter coefficients together to synchronize smoothing for the image output {Section II, including equations 2-4, which control, based on the percentiles (max, avg) at different percentages, including thresholds TH1 and TH2 that are used to shape the transfer function (1) of the IIR filter, such that based on the difference between max and avg (equation 4) and the thresholds TH1, TH2, the filter coefficients (e.g., b0, b1, a1, a2) are controlled to “synchronize” the temporal smoothing with the video frames. As best as can be determined (see 112(b) rejection), synchronize may refer to temporal filtering that is aligned/synchronized with the video frames to prevent screen flicker}.
Claim 2

In regards to claim 2, Nadernejad discloses wherein the adaptive filter is configured to adapt the one or more filter coefficients based on a frame feature difference to reduce frame delay {see section II, also discussing reducing backlight lag (frame delay) along with reducing flicker using the adaptive temporal IIR filter}.

Claim 3

In regards to claim 3, Nadernejad discloses wherein the indirect input of percentiles to control the one or more filter coefficients avoids direct control of the one or more filter coefficients by at least one of a filter input or a filter output {as best as can be determined based on the unclear language (see 112(b) rejection), see section II, including IIR filter control by the same type of percentiles as broadly and indefinitely recited}.

Claim 4

In regards to claim 4, Nadernejad discloses wherein the adaptive filter comprises an infinite impulse response (IIR) filter {section II clearly discloses an IIR filter}.

Claims 5-7

Nadernejad discloses (claim 5) wherein upon the frame feature difference exceeding a threshold, the IIR filter with a first coefficient value is used to reduce sudden changes in curve parameters, (claim 6) wherein upon the frame feature difference being less than the threshold, the IIR filter with a second coefficient value is used to provide that the curve parameters change to reduce the frame delay, and (claim 7) wherein the first coefficient value is less than the second coefficient value, the image output comprises video frames provided to a display device. {see Section II, including equations 2-4, which control, based on the percentiles (max, avg) at different percentages, including thresholds TH1 and TH2 that are used to shape the transfer function (1) of the IIR filter, such that based on the difference between max and avg (equation 4) and the thresholds TH1, TH2, the filter coefficients (e.g., b0, b1, a1, a2) are controlled to “synchronize” the temporal smoothing with the video frames.
Further as to claims 5 and 6, filter parameters b0 and b1 correspond to the first and second coefficients, the values of which are used to reduce sudden changes in curve parameters (smoothing to reduce flicker) and to reduce frame delay, depending upon their respective values in comparison with the thresholds TH1 and TH2. Further as to claim 7, the first coefficient value may be less than the second coefficient value depending upon the frame difference and thresholds TH1 and TH2, while noting that the output of the temporally filtered backlight luminance image is used to drive an edge-lit display device displaying processed video frames, as per Section III Experiments}.

Claims 8-14 and 15-20

The rejection of method claims 1-7 above applies mutatis mutandis to the corresponding limitations of non-transitory processor-readable medium claims 8-14 and apparatus claims 15-20, respectively, while noting that the rejection above cites to both device and method disclosures. For the high-level memory/processor limitations of claims 15-20 and the computer-readable storage medium storing program limitations of claims 8-14, see section III Experiments, in which the algorithm was implemented on a computer (with processor) using processor-readable software to assess the performance of the method on 8 Full-HD video sequences, with Fig. 5 illustrating the results thereof.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Kiser {Kiser, Chris, et al. "Real time automated tone mapping system for HDR video." IEEE International Conference on Image Processing. Vol. 134. IEEE Orlando, 2012} discloses an IIR filter with percentiles (Lmax, Lav and Lmin). Equation 1a is an exponential decay of the IIR filter expressed in equations 3a-c to reduce flicker by temporally smoothing video frames. See section 3.2, copied below.
[Image: reproduction of Kiser section 3.2, not included here]

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Michael R Cammarata, whose telephone number is (571) 272-0113. The examiner can normally be reached M-Th 7am-5pm EST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Matthew Bella, can be reached at 571-272-7778. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MICHAEL ROBERT CAMMARATA/
Primary Examiner, Art Unit 2667
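The threshold-switched IIR smoothing the claims recite (a heavier coefficient above a threshold to suppress sudden changes, a lighter one below it to reduce frame delay) can be sketched as follows. This is an illustrative reconstruction only: the coefficient values, the frame feature (max minus average luminance), and the blending between regimes are assumptions for illustration, not equations from Nadernejad's paper or the application.

```python
def smooth_backlight(frames, th1=0.1, th2=0.3, a_slow=0.9, a_fast=0.5):
    """Threshold-switched first-order IIR temporal smoothing.

    frames: sequence of (max_luma, avg_luma) pairs, one per video frame.
    All parameter values are illustrative assumptions.
    """
    out, prev = [], None
    for max_l, avg_l in frames:
        x = avg_l                      # quantity being smoothed (assumption)
        diff = max_l - avg_l           # "frame feature difference" proxy
        if prev is None:
            y = x                      # initialize on the first frame
        elif diff > th2:
            # Large spread: smooth heavily to suppress sudden changes (flicker).
            y = a_slow * prev + (1 - a_slow) * x
        elif diff < th1:
            # Small spread: track the input quickly to reduce backlight lag.
            y = a_fast * prev + (1 - a_fast) * x
        else:
            # In between: blend linearly between the two regimes.
            t = (diff - th1) / (th2 - th1)
            a = a_fast + t * (a_slow - a_fast)
            y = a * prev + (1 - a) * x
        out.append(y)
        prev = y
    return out
```

The design choice mirrors the claim language: the same percentile-derived feature both selects the regime and sets the coefficient, so flicker suppression and lag reduction trade off against each other frame by frame.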

Prosecution Timeline

Apr 30, 2024
Application Filed
Feb 24, 2026
Non-Final Rejection — §102, §112
Mar 20, 2026
Interview Requested
Mar 26, 2026
Applicant Interview (Telephonic)
Mar 26, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602797
RECONSTRUCTION OF BODY MOTION USING A CAMERA SYSTEM
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12586171
METHODS AND SYSTEMS FOR GRADING DEVICES
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12579597
Point Group Data Synthesis Apparatus, Non-Transitory Computer-Readable Medium Having Recorded Thereon Point Group Data Synthesis Program, Point Group Data Synthesis Method, and Point Group Data Synthesis System
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12579835
INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND COMPUTER-READABLE RECORDING MEDIUM FOR DISTINGUISHING OBJECT AND SHADOW THEREOF IN IMAGE
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12567283
FACIAL RECOGNITION DATABASE USING FACE CLUSTERING
Granted Mar 03, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 70%
With Interview: 99% (+35.9%)
Median Time to Grant: 2y 4m
PTA Risk: Low
Based on 305 resolved cases by this examiner. Grant probability derived from career allow rate.
