Prosecution Insights
Last updated: April 19, 2026
Application No. 18/741,268

METHOD AND DEVICE FOR VIDEO CODING USING ADAPTIVE MULTIPLE REFERENCE LINES

Final Rejection — §103
Filed: Jun 12, 2024
Examiner: HABIB, IRFAN
Art Unit: 2485
Tech Center: 2400 — Computer Networks
Assignee: Research & Business Foundation Sungkyunkwan University
OA Round: 2 (Final)
Grant Probability: 88% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 2m
With Interview: 96%

Examiner Intelligence

Career Allow Rate: 88% (637 granted / 721 resolved; +30.3% vs TC avg) — above average
Interview Lift: +7.8% (moderate lift, based on resolved cases with interview)
Avg Prosecution: 2y 2m typical timeline; 36 applications currently pending
Total Applications: 757 across all art units (career history)

Statute-Specific Performance

§101: 3.5% (-36.5% vs TC avg)
§102: 4.4% (-35.6% vs TC avg)
§103: 70.0% (+30.0% vs TC avg)
§112: 3.6% (-36.4% vs TC avg)
Tech Center average comparisons are estimates • Based on career data from 721 resolved cases

Office Action

§103
DETAILED ACTION

1. This office action is in response to U.S. Patent Application No. 18/741,268 filed on 10/29/2025 with effective filing date 6/12/2024. Claims 1-15 are pending.

Claim Rejections - 35 USC § 103

2. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

3. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

4. Claim(s) 1-15 are rejected under 35 U.S.C. 103 as being unpatentable over Lee et al. US 2021/0243429 A1 in view of Lee US 2021/0105460, hereinafter Lee2.

Per claims 1, 13 & 15, Lee et al. discloses a method of intra-predicting a current block by a video decoder, including decoding intra-prediction mode information of the current block from a bitstream (para: 127, e.g. information for specifying any one of a plurality of prestored matrices may be signaled in a bitstream. The decoder may determine a matrix for performing intra-prediction on a current block). Lee et al. fails to explicitly disclose the remaining claim limitations.

Lee2, however, in the same field of endeavor teaches determining the multiple reference lines for the current block, and generating multiple prediction blocks of the current block using the multiple reference lines based on the intra-prediction mode information (para: 211 & 214, e.g. an intra-prediction of a current block may be performed by at least one of a plurality of reference lines. A method of performing intra-prediction using a plurality of reference lines), and generating a final prediction block of the current block by performing a weighted combination of the multiple prediction blocks of the current block, wherein the multiple prediction blocks of the current block are generated using at least one different reference line (para: 245 & 255, e.g. a reference sample average value of a reference line may be calculated by assigning different weights to reference samples, depending on a shape of a current block and a position of a reference sample. For example, if the current block has a square shape, the reference sample average value may be calculated by assigning the same weight to top reference samples and left reference sample). Therefore, in view of the disclosures of Lee2, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine Lee et al. and Lee2 in order to determine an intra prediction mode of a current block. Multiple reference lines of the current block among multiple reference line candidates are determined available for the current block. Prediction samples of the current block are generated based on the determined intra prediction mode and the reference lines. The corrected prediction samples are generated by correcting the prediction samples.

Per claims 2 & 14, Lee2 further teaches the method of claim 1, further comprising decoding from the bitstream an adaptive multiple reference line (MRL) flag that indicates whether to apply an adaptive MRL technique according to a combination of the multiple reference lines (para: 212 & 214, e.g. whether or not performing intra-prediction using an extended reference line may be determined based on information signaled through a bitstream. Here, the information may be a 1-bit flag, but is not limited thereto. Information on whether performing intra-prediction using an extended reference line may be signaled in units of a coding tree unit, an encoding unit or a prediction unit).

Per claim 3, Lee et al. further teaches the method of claim 2, wherein decoding the adaptive MRL flag includes: when the adaptive MRL flag is not present, inferring the adaptive MRL flag to be false (para: 35 & 65, e.g. the deblocking filter may remove block distortion that occurs due to boundaries between blocks in a reconstructed picture. In order to determine whether or not to perform deblocking, whether or not to apply a deblocking filter to a current block may be determined on the basis of pixels included in several rows and columns included in a block).

Per claim 4, Lee et al. further teaches the method of claim 1, wherein determining the multiple reference lines includes: using multiple predefined reference lines (para: 93, e.g. the information may be a 1-bit flag representing whether a prediction encoding mode is an intra mode or an inter mode).

Per claim 5, Lee et al. further teaches the method of claim 1, wherein determining the multiple reference lines includes: selecting a width or a height of the current block according to the intra-prediction mode information; and using multiple predefined reference lines based on the selected width or the selected height (para: 266, e.g. when a current block is a non-square block where a width is greater than a height, an average value may be calculated by using top reference samples. To the contrary, when a current block is a non-square block where a width is smaller than a height, an average value may be calculated by using left reference samples).

Per claim 6, Lee2 further teaches the method of claim 1, wherein determining the multiple reference lines includes: decoding, from the bitstream, a number of the multiple reference lines; and decoding, from the bitstream, indices of the multiple reference lines corresponding to the number of the multiple reference lines (para: 265, e.g. the first reference sample and the second reference sample may be positioned neighboring each other, but it is not limited thereto. A prediction sample of a current block may be generated in consideration of a weighted sum of the first reference sample and the second reference sample, or may be generated based on an average value, a minimum value or a maximum value of the first reference sample and the second reference sample).

Per claim 7, Lee2 further teaches the method of claim 1, wherein determining the multiple reference lines includes: decoding, from the bitstream, indices of the multiple reference lines corresponding to a preset number (para: 212, 255 & 259, e.g. whether or not performing intra-prediction using an extended reference line may be determined based on information signaled through a bitstream. Here, the information may be a 1-bit flag, but is not limited thereto. Information on whether performing intra-prediction using an extended reference line may be signaled in units of a coding tree unit).

Per claim 8, Lee et al. further teaches the method of claim 1, wherein generating the final prediction block of the current block comprises: determining weights corresponding to a number of the predictors; and generating the final prediction block of the current block by weighted-combining the multiple prediction blocks using the weights (para: 274).

Per claim 9, Lee et al. further teaches the method of claim 8, wherein determining the weights includes: using predefined weights (para: 120).

Per claim 10, Lee et al. further teaches the method of claim 8, wherein determining the weights includes: using predefined weights, wherein a larger weight is set for a reference line that is closer to the current block among the multiple reference lines (para: 120, e.g. motion information may include at least one of a motion vector, a reference picture index, a prediction direction, and a bidirectional weighting factor index. A motion vector represents a movement direction of an object and a magnitude).

Per claim 11, Lee et al. further teaches the method of claim 8, wherein determining the weights includes: when indices of the multiple reference lines are decoded from the bitstream, decoding from the bitstream the weights corresponding to the indices of the multiple reference lines (para: 111, e.g. when the selected intra-prediction mode is not included in [DirS, DirE] (e.g., DC or planar mode), there is no need to signal the reference line index. Hence, the reference index is inferred to be equal to zero, where zero indicates a primary reference line immediately adjacent to the current block. In such a case, the intra-prediction mode subset 930 may not be employed as all directional references employ alternative reference lines in such a case. In summary, the reference line index is signaled in such an example whenever the intra-prediction mode is not DC or planar mode).

Per claim 12, Lee et al. further teaches the method of claim 8, wherein determining the weights includes: when predefined multiple reference lines are used, decoding from the bitstream the weights in an order in which the predefined multiple reference lines are adjacent to the current block (para: 111, e.g. when the selected intra-prediction mode is not included in [DirS, DirE] (e.g., DC or planar mode), there is no need to signal the reference line index. Hence, the reference index is inferred to be equal to zero, where zero indicates a primary reference line immediately adjacent to the current block. In such a case, the intra-prediction mode subset 930 may not be employed as all directional references employ alternative reference lines in such a case. In summary, the reference line index is signaled in such an example whenever the intra-prediction mode is not DC or planar mode).

Response to Arguments

5. Applicant's arguments filed 10/29/2025 have been fully considered but they are not persuasive.

6. Applicant's arguments with respect to claim(s) 1-15 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Conclusion

7. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Hsu US 2018/0199054 A1, e.g. a multi-hypotheses motion prediction mode for video coding is provided. Multi-hypotheses motion prediction conveys prediction for motion compensation based on a selection of multiple predictions for motion compensation (hypotheses), which are respectively obtained using motion predictors or MVP selected from a list of candidate motion predictors. Lee et al. US 12,108,035 B2, e.g. an image encoding/decoding method and apparatus for performing intra prediction using a plurality of reference sample lines are provided. An image decoding method may comprise configuring a plurality of reference sample lines, reconstructing an intra prediction mode of a current block, and performing intra prediction for the current block based on the intra prediction mode and the plurality of reference sample lines.

8. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

9. Any inquiry concerning this communication or earlier communications from the examiner should be directed to IRFAN HABIB whose telephone number is (571) 270-7325. The examiner can normally be reached Mon-Thu 9AM-7PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jay Patel, can be reached at (571) 272-2988. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Irfan Habib/
Examiner, Art Unit 2485
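The weighted combination at issue in the claim 8-12 rejections (a final prediction block formed by weighting per-reference-line predictors, with a larger weight for a reference line closer to the current block) can be sketched as follows. This is an illustrative simplification, not the applicant's claimed method or either reference's actual disclosure; the default distance-based weights and the toy predictor blocks are assumptions for demonstration only.

```python
def weighted_mrl_prediction(pred_blocks, weights=None):
    """Combine per-reference-line prediction blocks (2-D lists) into one.

    pred_blocks is ordered from the reference line nearest the current
    block to the farthest. By default, nearer lines receive larger
    weights -- an illustrative assumption, not a codec specification.
    """
    n = len(pred_blocks)
    if weights is None:
        weights = list(range(n, 0, -1))  # e.g. [2, 1] for two lines
    total = sum(weights)
    norm = [w / total for w in weights]  # normalize so weights sum to 1
    rows, cols = len(pred_blocks[0]), len(pred_blocks[0][0])
    # Per-sample weighted sum across the per-line predictors
    return [
        [sum(norm[k] * pred_blocks[k][i][j] for k in range(n))
         for j in range(cols)]
        for i in range(rows)
    ]

# Toy 2x2 predictors from two hypothetical reference lines
near = [[100, 100], [100, 100]]  # predictor from the adjacent line
far = [[160, 160], [160, 160]]   # predictor from a farther line
final = weighted_mrl_prediction([near, far])  # weights 2/3 and 1/3, so ~120
```

Weights decoded from the bitstream (claims 11-12) would simply replace the default ordering-based weights via the `weights` argument.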

Prosecution Timeline

Jun 12, 2024 — Application Filed
Jul 26, 2025 — Non-Final Rejection (§103)
Oct 29, 2025 — Response Filed
Feb 13, 2026 — Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593047
METHOD AND APPARATUS FOR IMAGE ENCODING AND DECODING USING TEMPORAL MOTION INFORMATION
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12569313
HANDS-FREE CONTROLLER FOR SURGICAL MICROSCOPE
Granted Mar 10, 2026 (2y 5m to grant)
Patent 12568241
IMPROVEMENT OF BI-PREDICTION WITH CU LEVEL WEIGHT (BCW)
Granted Mar 03, 2026 (2y 5m to grant)
Patent 12568198
3D Display Method AND 3D Display Device
Granted Mar 03, 2026 (2y 5m to grant)
Patent 12563216
METHODS AND DEVICES FOR ENHANCING BLOCK ADAPTIVE WEIGHTED PREDICTION WITH BLOCK VECTOR
Granted Feb 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 88%
With Interview: 96% (+7.8%)
Median Time to Grant: 2y 2m
PTA Risk: Moderate
Based on 721 resolved cases by this examiner. Grant probability derived from career allow rate.
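The headline figures follow directly from the examiner's career counts (637 granted of 721 resolved, +7.8 percentage-point interview lift). A minimal sketch of the arithmetic, assuming the with-interview figure is simply the base allow rate plus the lift, capped at 100%:

```python
# Career counts from the examiner profile above
granted = 637
resolved = 721
interview_lift_pp = 7.8  # percentage points

allow_rate = granted / resolved * 100  # career allow rate, ~88.3%
# Assumption: interview-adjusted probability = base rate + lift, capped
with_interview = min(allow_rate + interview_lift_pp, 100.0)

print(f"Grant probability: {allow_rate:.0f}%")    # 88%
print(f"With interview: {with_interview:.0f}%")   # 96%
```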
