Prosecution Insights
Last updated: April 19, 2026
Application No. 18/567,324

METHOD, DEVICE, AND MEDIUM FOR VIDEO PROCESSING

Status: Non-Final OA (§103)
Filed: Dec 05, 2023
Examiner: UHL, LINDSAY JANE KILE
Art Unit: 2481
Tech Center: 2400 — Computer Networks
Assignee: Bytedance Inc.
OA Round: 3 (Non-Final)

Grant Probability: 80% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 4m
Grant Probability with Interview: 89%

Examiner Intelligence

Career Allow Rate: 80% — above average (324 granted / 404 resolved; +22.2% vs TC avg)
Interview Lift: +8.7% (moderate) among resolved cases with interview
Avg Prosecution: 2y 4m (typical timeline)
Total Applications: 442 across all art units (38 currently pending)

Statute-Specific Performance

§101:  3.7%  (-36.3% vs TC avg)
§103: 65.4%  (+25.4% vs TC avg)
§102:  8.7%  (-31.3% vs TC avg)
§112: 10.3%  (-29.7% vs TC avg)
Tech Center averages are estimates • Based on career data from 404 resolved cases

Office Action

§103
DETAILED ACTION

This Office Action is in response to the amendment filed on December 16, 2025. Claims 51-55, 57, and 59-71 are pending and are examined.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The amendments made to original claims 51, 69, and 70 and the cancellation of claim 58 have been fully considered. In light of these amendments, the previous double-patenting rejection is withdrawn.

Response to Argument

Applicant's arguments and amendments received December 16, 2025 have been fully considered. With regard to 35 U.S.C. § 103, Applicant argues that the cited prior art fails to disclose wherein, during the process of the optical flow based coding method, a first value is subtracted from a first sample or pixel in a first prediction block of the video and a second value is subtracted from a second sample or pixel in a second prediction block of the video unit. Specifically, Applicant argues that Xiu at most recites that the horizontal and vertical gradients at each sample position are calculated as in equation 13, and that horizontal and vertical gradient values are calculated based on prediction samples, sample differences between first and second prediction samples, and intermediate BDOF derivation parameters, but does not recite the three subtractions required by the amended claims. Examiner respectfully disagrees. The independent claims, as amended, require that during the optical flow, (1) a first value is subtracted from a first sample or pixel in a first prediction block of the video, (2) a second value is subtracted from a second sample or pixel in a second prediction block of the video unit, and (3) a difference of the first sample or pixel and the second sample or pixel is determined.
Xiu also discloses, in equation 13, the calculation of gradients, which includes the subtraction of I(i-1, j), i.e., a value, from I(i+1, j), i.e., a sample or pixel in a prediction block (see ¶72, describing that I(k)(i, j) is a sample value at coordinate (i, j) of the prediction signal – i.e., I(i+1, j) would be understood to be a sample or pixel at coordinate (i+1, j) in a prediction block). Xiu discloses that these gradients are calculated “based on the first prediction samples I(0)(i, j)” and “the second prediction samples I(1)(i, j)” (see ¶144), i.e., the calculations of equation 13 are done for both the first and second prediction samples. In other words, for the first prediction sample I(0)(i, j) in a first prediction block, a first value I(0)(i-1, j) is subtracted from a first sample I(0)(i+1, j), and for the second prediction sample I(1)(i, j) in a second prediction block, a second value I(1)(i-1, j) is subtracted from a second sample I(1)(i+1, j). In addition, Xiu states that gradient values based on “sample differences between the first prediction samples I(0)(i, j) and the second prediction samples I(1)(i, j)” are derived (see ¶144). In other words, Xiu discloses the determination of a difference of the first sample or pixel and the second sample or pixel. Accordingly, all three limitations, as amended, are disclosed. If Applicant would like to be more specific about the optical flow calculations to differentiate from the cited prior art, Applicant is encouraged to do so. Applicant's arguments are directed to newly amended language, which is addressed below. See the rejection below for how the art of record reads on the newly amended language as well as the examiner's interpretation of the cited art in view of the presented claim set.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 51-55, 57, 59-63, and 68-71 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Publication No. 2017/0094305 (“Li”) in view of U.S. Patent Publication No. 2022/0239943 (“Xiu”), which corresponds to priority applications dated in 2019 and 2020.

With respect to claim 51, Li discloses the invention substantially as claimed, including: A method of video processing, comprising: determining, during a conversion between a video unit and a bitstream of the video, information on applying an optical flow based coding method to the video unit based on illuminance information of the video unit (see Figs. 
7-8, items 182, 212, ¶¶6, 21, 74, 102, 123, 137-139, 149-150, describing determining, during encoding or decoding, i.e., a conversion between a video unit and a bitstream of the video unit, information on applying BIO (an optical flow based coding method) to the video unit based on whether the block occurs in a region of illumination change, i.e., illuminance information of the video unit), wherein the illuminance information comprises an illuminance change of the video unit and the illuminance change is included in a process of the optical flow based coding method (see citations above, describing that the illuminance information is whether the block occurs in a region of illumination change, i.e., illuminance change of the video unit, and that the BIO method may be altered based on the presence of this illumination/illuminance change, e.g., by applying special processes during BIO or by using illumination compensation in the calculation of gradients for BIO), and … ; and performing the conversion based on the information (see citations and arguments with respect to elements above and Figs. 7-8, items 184, 186, 214, 216, describing that the motion compensation process of encoding/decoding, i.e., the conversion, is performed (e.g., by performing BIO or not) based on the information on applying optical flow). Li does not explicitly disclose wherein during the process of the optical flow based coding method, a first value is subtracted from a first sample or pixel in a first prediction block of the video, a second value is subtracted from a second sample or pixel in a second prediction block of the video unit, and a difference of the first sample or pixel and the second sample or pixel is determined. 
However, in the same field of endeavor, Xiu discloses that it was known for BIO/BDOF/PROF to include the subtraction of a first value from a first sample/pixel in a first prediction block, the subtraction of a second value from a second sample/pixel in a second prediction block, and the determination of a difference of the first sample/pixel and the second sample/pixel, i.e.: wherein during the process of the optical flow based coding method, a first value is subtracted from a first sample or pixel in a first prediction block of the video, a second value is subtracted from a second sample or pixel in a second prediction block of the video unit, and a difference of the first sample or pixel and the second sample or pixel is determined (see citations and arguments with respect to claim 51 above, ¶¶114, 144, equation 13, showing and describing that during BIO/BDOF/PROF, i.e., during the process of the optical flow based coding method, it was known to subtract a first value, e.g., I(i-1, j) for I(0), from a first sample or pixel, e.g., I(i+1, j) for I(0), in a first prediction block I(0), subtract a second value, e.g., I(i-1, j) for I(1), from a second sample or pixel, e.g., I(i+1, j) for I(1), in a second prediction block I(1), and determine sample differences between the first samples/pixels I(0) and the second samples/pixels I(1)). At the time of filing, one of ordinary skill would have been familiar with bi-directional optical flow and its use of gradient, including how to calculate it and have understood that, as evidenced by Xiu, one way to calculate gradient would include determining x and y gradients at each sample position by subtracting values and determining the difference between samples as shown in equation 13 and paragraph 144. 
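The three subtractions at issue can be sketched in a few lines of Python. This is an illustrative reconstruction only, not code from Li or Xiu: the block contents are random placeholders, and the normalizing right-shift that the actual BDOF derivation applies to the gradient difference is omitted for clarity.

```python
import numpy as np

def horizontal_gradient(pred, i, j):
    """Equation-13-style horizontal gradient at position (i, j):
    the sample at column i-1 is subtracted from the sample at
    column i+1 (the BDOF spec's right-shift is omitted here)."""
    return int(pred[j, i + 1]) - int(pred[j, i - 1])

# Two hypothetical 4x4 prediction blocks standing in for I(0) and I(1)
rng = np.random.default_rng(0)
pred0 = rng.integers(0, 256, size=(4, 4))
pred1 = rng.integers(0, 256, size=(4, 4))

i, j = 1, 1  # an interior sample position

# Subtraction 1: a first value subtracted from a first sample in I(0)
grad0 = horizontal_gradient(pred0, i, j)
# Subtraction 2: a second value subtracted from a second sample in I(1)
grad1 = horizontal_gradient(pred1, i, j)
# Subtraction 3: difference between the first and second prediction samples
sample_diff = int(pred0[j, i]) - int(pred1[j, i])

print(grad0, grad1, sample_diff)
```

The vertical gradient works the same way along rows; the point of the sketch is only that each gradient is itself a subtraction within one prediction block, while the sample difference is a subtraction across the two blocks.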
Accordingly, to one of ordinary skill in the art at the time of filing, doing so in the coding system of Li to determine the gradient for BIO would have represented nothing more than the combination of prior art elements according to predictable results and/or the simple substitution of one known element for another to obtain predictable results. Therefore, it would have been obvious to one having ordinary skill in the art at the time of filing to include a mechanism for determining x and y gradients at each sample position by subtracting a first value from a first sample or pixel in a first prediction block of the video, a second value from a second sample or pixel in a second prediction block of the video unit, and by determining a difference of the first sample or pixel and the second sample or pixel in the BIO coding system of Li as taught by Xiu.

With respect to claim 52, Li discloses the invention substantially as claimed. As detailed above, Li in view of Xiu discloses all the elements of independent claim 51. Li/Xiu additionally discloses: wherein the information on applying the optical flow based coding method comprises at least one of: whether to apply the optical flow based coding method to the video unit, or how to apply optical flow based coding method to the video unit (see citations and arguments with respect to claim 51 above, describing that the information on applying BIO/the optical flow based coding method comprises whether to apply BIO/the optical flow based coding method to the block and how to apply it, e.g., whether to apply it differently). The reasons for combining the cited prior art with respect to claim 51 also apply to claim 52.

With respect to claim 53, Li discloses the invention substantially as claimed. As detailed above, Li in view of Xiu discloses all the elements of independent claim 51. 
Li/Xiu additionally discloses: wherein determining the information on applying the optical flow based coding method to the video unit comprises: determining whether an illuminance change of the video unit occurs; and in response to that the illuminance change occurs, determining that the optical flow based coding method is not applied to the video unit (see citations and arguments with respect to claim 51 above, describing that determining the information on applying BIO/the optical flow based coding method comprises determining whether an illumination/illuminance change of the block/region occurs, and in response, determining that BIO/the optical flow based coding method is not applied in a region of illumination change). The reasons for combining the cited prior art with respect to claim 51 also apply to claim 53.

With respect to claim 54, Li discloses the invention substantially as claimed. As detailed above, Li in view of Xiu discloses all the elements of dependent claim 53. Li/Xiu additionally discloses: wherein a syntax element in the bitstream indicates whether the illuminance change occurs, or wherein determining whether the illuminance change of the video unit occurs comprises: determining whether the illuminance change of the video unit occurs based on at least one of: whether a coding tool is applied to the video unit, or how the coding tool is applied to the video unit (see citations and arguments with respect to claim 51 above, including Li ¶¶102, 123, 139, describing that a syntax element in the bitstream, e.g., IC_flag, may indicate whether the illuminance change occurs). The reasons for combining the cited prior art with respect to claim 51 also apply to claim 54.

With respect to claim 55, Li discloses the invention substantially as claimed. As detailed above, Li in view of Xiu discloses all the elements of independent claim 51. 
Li/Xiu additionally discloses: further comprising: applying the optical flow based coding process to the video unit based on whether an illuminance change of the video unit occurs (see citations and arguments with respect to claim 51 above, describing that BIO/the optical flow based coding process is applied to the video unit based on whether there is an illumination change in the region of the block). The reasons for combining the cited prior art with respect to claim 51 also apply to claim 55.

With respect to claim 57, Li discloses the invention substantially as claimed. As detailed above, Li in view of Xiu discloses all the elements of independent claim 51. Li/Xiu additionally discloses: wherein a value is subtracted in calculation of a gradient in the process of the optical flow based coding method (see Xiu ¶¶114, 144, describing that in bi-directional optical flow, it was known to, in the process of BIO/BDOF/PROF, calculate a gradient by subtracting a value). The reasons for combining the cited prior art with respect to claim 51 also apply to claim 57.

With respect to claim 59, Li discloses the invention substantially as claimed. As detailed above, Li in view of Xiu discloses all the elements of independent claim 51. Li/Xiu additionally discloses: wherein a determination of an illuminance change of the video unit is performed in a first level, and a determination of how to and/or whether to apply the optical flow based coding method is performed in a second level (see citations and arguments with respect to claim 51 above, including Li Figs. 7-8, items 182, 184, 212, 214, describing that determining illuminance change is performed at a first level, e.g., at steps 182, 212, and the optical flow based method is performed in a second level, e.g., at steps 184, 214). The reasons for combining the cited prior art with respect to claim 51 also apply to claim 59.

With respect to claim 60, Li discloses the invention substantially as claimed. 
As detailed above, Li in view of Xiu discloses all the elements of independent claim 51. Li/Xiu additionally discloses: wherein how to apply the optical flow based coding method is determined based on at least one of: whether a coding tool is applied to the video unit, or how the coding tool is applied to the video unit, or wherein whether the optical flow based coding method is applied to the video unit is determined based on at least one of: whether a coding tool is applied to the video unit, or how the coding tool is applied to the video unit (see citations and arguments with respect to claim 51 above, including Li ¶¶102, 123, 139, describing that the optical flow based coding method BIO applied to the video unit may be applied based on whether illumination compensation is applied/whether the IC flag is true, i.e., whether a coding tool is applied to the video unit). The reasons for combining the cited prior art with respect to claim 51 also apply to claim 60.

With respect to claim 61, Li discloses the invention substantially as claimed. As detailed above, Li in view of Xiu discloses all the elements of dependent claim 60. Li/Xiu additionally discloses: wherein the coding tool comprises at least one of: a location illumination compensation method, a bi-prediction with coding unit level weight method, or an affine compensation method, or wherein if the coding tool is applied to the video unit, the optical flow based coding method is applied to the video unit (see citations and arguments with respect to claims 51 and 60 above, describing that the coding tool, i.e., illumination compensation, comprises illumination compensation which determines whether the current block occurs in a region of illumination change, i.e., a location illumination compensation method). The reasons for combining the cited prior art with respect to claim 51 also apply to claim 61.

With respect to claim 62, Li discloses the invention substantially as claimed. 
As detailed above, Li in view of Xiu discloses all the elements of independent claim 51. Li/Xiu additionally discloses: wherein the optical flow based coding method comprises at least one of: a bi-directional optical flow method in which an optical flow is used to refine a bi-prediction signal of a coding block, a prediction refinement with optical flow for affine mode in which the optical flow is used to refine an affine motion compensated prediction, or a coding method in which the optical flow is used to generate or refine a prediction/reconstruction signal of a coding block (see citations and arguments with respect to claim 51 above, including Li Fig. 4, ¶21, describing that the optical flow based coding method is BIO/bi-directional optical flow in which optical flow refines a bi-prediction signal of a block). The reasons for combining the cited prior art with respect to claim 51 also apply to claim 62.

With respect to claim 63, Li discloses the invention substantially as claimed. As detailed above, Li in view of Xiu discloses all the elements of dependent claim 62. Li/Xiu additionally discloses: wherein the optical flow based coding method is a bi-directional optical flow (BDOF), or wherein the optical flow based coding method is a prediction refinement with optical flow (PROF) (see citations and arguments with respect to claim 51 above, describing that the optical flow based coding method is bi-directional optical flow). The reasons for combining the cited prior art with respect to claim 51 also apply to claim 63.

With respect to claim 68, Li discloses the invention substantially as claimed. As detailed above, Li in view of Xiu discloses all the elements of independent claim 51. Li/Xiu additionally discloses: wherein the conversion includes encoding the video unit into the bitstream; or decoding the video unit from the bitstream (see citations and arguments with respect to claim 51 above, describing that the conversion may be encoding or decoding). 
The reasons for combining the cited prior art with respect to claim 51 also apply to claim 68.

With respect to claim 69, Li discloses the invention substantially as claimed. As detailed above, Li in view of Xiu discloses all the elements of independent claim 51. Li/Xiu additionally discloses: An apparatus for processing video data (see Li Fig. 1, items 20, 30, ¶¶5, 6, 51, 74, describing encoders/decoders, i.e., apparatuses, for video processing) comprising: a processor (see Li ¶¶52, 75, 54, 153-156, describing that the functions described may be embodied in software stored in a computer-readable storage medium and executed by a processor); and a non-transitory memory with instructions thereon (see citations and arguments with respect to element above, describing that the functions described may be embodied in software/instructions stored in a non-transitory memory), wherein the instructions, upon execution by the processor, cause the processor to perform acts comprising: determining, during a conversion between a video unit and a bitstream of the video, information on applying an optical flow based coding method to the video unit based on illuminance information of the video unit, wherein the illuminance information comprises an illuminance change of the video unit and the illuminance change is included in a process of the optical flow based coding method, and wherein during the process of the optical flow based coding method, a first value is subtracted from a first sample or pixel in a first prediction block of the video, a second value is subtracted from a second sample or pixel in a second prediction block of the video, and a difference of the first sample or pixel and the second sample or pixel is determined (see citations and arguments with respect to corresponding element of claim 51 above); and performing the conversion based on the information (see citations and arguments with respect to corresponding element of claim 51 above). 
With respect to claim 70, claim 70 recites the elements of claim 51 in computer-readable medium form rather than method form. Li discloses that its method may be embodied in a non-transitory computer-readable storage medium storing instructions that cause a processor to perform the method (see ¶¶52, 75, 54, 153-156). Accordingly, the disclosure cited with respect to claim 51 also applies to claim 70.

With respect to claim 71, Li discloses the invention substantially as claimed. As detailed above, Li in view of Xiu discloses all the elements of independent claim 51. Li/Xiu additionally discloses: further comprising: storing the bitstream in a non-transitory computer-readable recording medium (see Li Fig. 1, item 16, ¶¶47-48, 53-55, describing a computer-readable medium 16 which may receive and store encoded video data and which may be non-transitory). The reasons for combining the cited prior art with respect to claim 51 also apply to claim 71.

Claim Rejections - 35 USC § 103

Claims 64-67 are rejected under 35 U.S.C. 103 as being unpatentable over Li in view of Xiu and further in view of U.S. Patent Publication No. 2023/0396780 (“Wang”), which corresponds to a priority application filed February 2021.

With respect to claim 64, Li discloses the invention substantially as claimed. As described above, Li in view of Xiu teaches all the elements of independent claim 51. Li/Xiu teaches determining whether an illuminance change occurs, but does not explicitly detail how such a calculation is made, i.e., it does not explicitly disclose wherein if a change of sample or pixel values between two video units is larger than a first threshold value, an illuminance change occurs. 
However, in the same field of endeavor, Wang discloses that it was known to calculate illumination change by determining if a change of sample or pixel values between coding units is larger than a threshold, i.e.: wherein if a change of sample or pixel values between two video units is larger than a first threshold value, an illuminance change occurs (see ¶¶228-230, 242, 247, describing that illumination change triggering an illumination compensation flag may be measured by determining when the luma difference between a unit in a reference frame and a unit in a current frame, i.e., between two video units, is greater than a preset change threshold). At the time of filing, one of ordinary skill would have been familiar with determining illumination change warranting illumination compensation processing and how to calculate such change and have understood that, as evidenced by Wang, one such method of calculation would include the determination of change using the SAD between the luma difference in a reference frame and a current frame and comparing it to a preset threshold. Accordingly, to one of ordinary skill in the art at the time of filing, doing so in the illumination compensation change analysis of Li/Xiu would have represented nothing more than the combination of prior art elements according to predictable results and/or the simple substitution of one known element for another to obtain predictable results. Therefore, it would have been obvious to one having ordinary skill in the art at the time of filing to include a mechanism for determining illumination change using the SAD between the luma difference in a reference frame and a current frame and comparing it to a preset threshold in the illumination compensation change analysis of Li/Xiu as taught by Wang.

With respect to claim 65, Li discloses the invention substantially as claimed. As described above, Li in view of Xiu and Wang discloses all the elements of dependent claim 64. 
Li/Xiu/Wang additionally discloses: wherein the change is calculated by: d=abs(P1-P2), wherein P1 and P2 represent two samples or pixels in the two video units, respectively, and abs represents an absolute value operation, or wherein the first threshold value is predefined, or wherein the first threshold value is determined dynamically, or wherein the first threshold value is indicated in the bitstream (see citations and arguments with respect to claim 64 above, showing and describing that the change may be calculated by the SAD, i.e., absolute value of the difference, between the luma of blocks of the current picture and the reference picture; such citations also describe that the first threshold value may be preset/predefined). The reasons for combining the cited prior art with respect to claim 64 also apply to claim 65.

With respect to claim 66, Li discloses the invention substantially as claimed. As described above, Li in view of Xiu discloses all the elements of independent claim 51 and Li in view of Xiu and Wang discloses all the elements of dependent claim 64, the combination of which is incorporated herein. Li/Xiu/Wang additionally discloses: wherein if a change of sample or pixel values in the video unit between two video units is larger than a second threshold value, an illuminance change occurs, or wherein if a change of mean values of sample or pixel values in the video unit between two video units is larger than the second threshold value, the illuminance change occurs (see citations and arguments with respect to claim 64 above, describing that illuminance change may occur when the change of sample/pixel values between a block of the current frame and a block of the reference frame is larger than a preset, i.e., second, threshold value). The reasons for combining the cited prior art with respect to claim 64 also apply to claim 66.

With respect to claim 67, Li discloses the invention substantially as claimed. 
As described above, Li in view of Xiu and Wang discloses all the elements of dependent claim 66. Li/Xiu/Wang additionally discloses: wherein the second threshold value is predefined, or wherein the second threshold value is determined dynamically, or wherein the second threshold value is indicated in the bitstream (see citations and arguments with respect to claims 64 and 66 above, describing that the second threshold value may be preset). The reasons for combining the cited prior art with respect to claim 64 also apply to claim 67.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to LINDSAY JANE KILE UHL whose telephone number is (571)270-0337. The examiner can normally be reached 8:30 AM-5:00 PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, William Vaughn, can be reached at (571)272-3922. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). 
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/LINDSAY J UHL/
Primary Examiner, Art Unit 2481
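The illuminance-change test discussed for claims 64-66 (Wang's SAD-versus-threshold comparison, and the per-sample d = abs(P1 - P2) form of claim 65) can be sketched as follows. This is an illustrative sketch only: the block sizes, sample values, and the threshold constant are placeholders, not values from any cited reference.

```python
import numpy as np

def illuminance_change(block_cur, block_ref, threshold=512):
    """Deem an illuminance change to occur when the sum of absolute
    luma differences (SAD) between a current-frame block and a
    reference-frame block exceeds a preset threshold.
    The threshold of 512 is an arbitrary placeholder value."""
    sad = int(np.abs(block_cur.astype(int) - block_ref.astype(int)).sum())
    return sad > threshold

def sample_change(p1, p2):
    """Per-sample form from claim 65: d = abs(P1 - P2)."""
    return abs(int(p1) - int(p2))

cur = np.full((4, 4), 200)  # brighter current block
ref = np.full((4, 4), 100)  # darker reference block
print(illuminance_change(cur, ref))  # prints True: SAD = 1600 > 512
```

Under this reading, a decoder-side or encoder-side check like this would gate whether the optical-flow-based method is applied, which is the interplay the rejection maps onto Li's region-of-illumination-change determination.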

Prosecution Timeline

Dec 05, 2023 — Application Filed
Dec 05, 2023 — Response after Non-Final Action
Mar 14, 2025 — Non-Final Rejection (§103)
Jun 20, 2025 — Response Filed
Sep 12, 2025 — Final Rejection (§103)
Nov 17, 2025 — Response after Non-Final Action
Dec 16, 2025 — Request for Continued Examination
Dec 20, 2025 — Response after Non-Final Action
Jan 23, 2026 — Non-Final Rejection (§103) (current)

Precedent Cases

Applications granted by this examiner in similar technology

Patent 12604000 — SYSTEMS AND METHODS FOR PARTITION-BASED PREDICTION MODE REORDERING (granted Apr 14, 2026; 2y 5m to grant)
Patent 12604030 — METHOD AND APPARATUS FOR PROCESSING VIDEO SIGNAL (granted Apr 14, 2026; 2y 5m to grant)
Patent 12598329 — SYNTAX DESIGN METHOD AND APPARATUS FOR PERFORMING CODING BY USING SYNTAX (granted Apr 07, 2026; 2y 5m to grant)
Patent 12593032 — METHOD AND DEVICE FOR PROCESSING VIDEO SIGNAL BY USING INTER PREDICTION (granted Mar 31, 2026; 2y 5m to grant)
Patent 12587636 — GEOMETRIC PARTITION MODE WITH MOTION VECTOR REFINEMENT (granted Mar 24, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 80%
With Interview: 89% (+8.7%)
Median Time to Grant: 2y 4m
PTA Risk: High
Based on 404 resolved cases by this examiner. Grant probability derived from career allow rate.
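One plausible reading of how these projections fit together, sketched below. This is an assumption about the tool's arithmetic, not a documented formula: it treats the career allow rate (324/404) as the base grant probability and the interview lift as an additive percentage-point adjustment.

```python
# Hypothetical reconstruction of the projection arithmetic shown above.
# Assumption: the interview lift is added in percentage points to the
# career-derived base grant probability.
granted, resolved = 324, 404
base_grant_prob = granted / resolved   # career allow rate
interview_lift = 0.087                 # lift among resolved cases with interview

with_interview = base_grant_prob + interview_lift

print(f"base: {base_grant_prob:.1%}")        # prints base: 80.2%
print(f"with interview: {with_interview:.0%}")  # prints with interview: 89%
```

The rounded values (80% and 89%) match the figures reported in the card, which is consistent with, though not proof of, this additive reading.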
