Prosecution Insights
Last updated: April 19, 2026
Application No. 18/879,038

REFERENCE AREA FOR INTRA PREDICTION

Non-Final OA: §101, §102, §103
Filed
Dec 26, 2024
Examiner
NIRJHAR, NASIM NAZRUL
Art Unit
2896
Tech Center
2800 — Semiconductors & Electrical Systems
Assignee
Nokia Technologies Oy
OA Round
1 (Non-Final)
74%
Grant Probability
Favorable
1-2
OA Rounds
2y 6m
To Grant
93%
With Interview

Examiner Intelligence

Grants 74% — above average
74%
Career Allow Rate
379 granted / 512 resolved
+6.0% vs TC avg
Strong +19% interview lift
+18.7%
Interview Lift
resolved cases with interview
Typical timeline
2y 6m
Avg Prosecution
37 currently pending
Career history
549
Total Applications
across all art units

Statute-Specific Performance

§101
3.8%
-36.2% vs TC avg
§103
75.4%
+35.4% vs TC avg
§102
3.4%
-36.6% vs TC avg
§112
7.1%
-32.9% vs TC avg
Black line = Tech Center average estimate • Based on career data from 512 resolved cases
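For readers checking the chart, the implied Tech Center baselines follow directly from the displayed rates and deltas. The snippet below is an illustrative back-calculation only: the rate and delta values are copied from the panel above, and the baseline itself is an estimate, as the chart caption notes.

```python
# Statute-specific rejection rates for this examiner (from the chart above)
examiner_rate = {"101": 3.8, "102": 3.4, "103": 75.4, "112": 7.1}

# Displayed deltas versus the Tech Center average
delta_vs_tc = {"101": -36.2, "102": -36.6, "103": +35.4, "112": -32.9}

# The implied TC baseline for each statute is simply rate minus delta
tc_baseline = {s: round(examiner_rate[s] - delta_vs_tc[s], 1)
               for s in examiner_rate}
```

All four baselines work out to 40.0%, which suggests the chart plots a single Tech Center average estimate across statutes.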

Office Action

§101 §102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA. This communication is responsive to the correspondence filed on 12/26/2024. Claims 33-52 are presented for examination.

IDS Considerations

The information disclosure statement (IDS) submitted on 5/21/2025 is being considered by the examiner, as the submission is in compliance with the provisions of 37 CFR 1.97.

Claim Rejections - 35 U.S.C. § 101

35 U.S.C. 101 reads as follows: "Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title."

Claims 49-52 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to non-statutory subject matter. Claims 49-52 are drawn to a "computer readable medium" comprising stored data. The Specification discusses a "non-transitory computer readable medium"; however, the Specification is silent regarding the meaning of the term "computer readable medium". Thus, applying the broadest reasonable interpretation in light of the Specification, and taking into account the meaning of the words in their ordinary usage as they would be understood by one of ordinary skill in the art (MPEP § 2111.01), the claim as a whole covers a transitory signal and, as such, does not fall within the definition of a process, machine, manufacture, or composition of matter (MPEP § 2106.01). Variations of the term "storage" computer readable medium are not necessarily considered to limit a media claim to non-transitory embodiments, because many disclosures conflate storage media and signals. Therefore, claims 49-52 are directed towards non-statutory subject matter (see MPEP § 2106, Seventh Edition, Revision dated February 2000, at pages 2100-10 and 2100-11).
Other dependent claims are also rejected because of the deficiencies of their respective parent claims.

Examiner's comment: A claim drawn to a computer readable medium that covers both transitory and non-transitory embodiments may be amended to narrow the claim to cover only statutory embodiments, and thereby avoid a rejection under 35 U.S.C. § 101, by adding the limitation "non-transitory" to the claim term (Kappos memo dated January 26, 2010, available at http://www.uspto.gov/patents/law/notices/101_crm_20100127.pdf).

Claim Rejections - 35 U.S.C. § 102

The following is a quotation of 35 U.S.C. 102(a)(1)/(a)(2), which forms the basis for the anticipation rejections set forth in this Office action:

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 33-34, 36-37, 39-43, 45-46, 48-50 and 52 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Li (U.S. Pub. No. 2020/0359016 A1).

Examiner's note: Encoding and decoding are performed by the same algorithm operating in opposite directions.

Regarding claims 33, 42, and 49: Li teaches an apparatus comprising: at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the apparatus at least to: (Li [0053]: With reference to FIG. 1, the computer system (100) includes one or more processing units (110, 115) and memory (120, 125). The processing units (110, 115) execute computer-executable instructions.
A processing unit can be a general-purpose central processing unit ("CPU"), a processor in an application-specific integrated circuit ("ASIC"), or any other type of processor. In a multi-processing system, multiple processing units execute computer-executable instructions to increase processing power. For example, FIG. 1 shows a CPU (110) as well as a graphics processing unit or co-processing unit (115). The tangible memory (120, 125) may be volatile memory (e.g., registers, cache, RAM), non-volatile memory (e.g., ROM, EEPROM, flash memory, etc.), or some combination of the two, accessible by the processing unit(s). The memory (120, 125) stores software (180) implementing one or more innovations for intra-picture prediction with non-adjacent reference lines of sample values available, in the form of computer-executable instructions suitable for execution by the processing unit(s).)

select a coding unit of an image; (Li [0075]: For syntax according to the H.265/HEVC standard, the video encoder (340) splits the content of a picture (or slice or tile) into coding tree units. A coding tree unit ("CTU") includes luma sample values organized as a luma coding tree block ("CTB") and corresponding chroma sample values organized as two chroma CTBs. The size of a CTU (and its CTBs) is selected by the video encoder. A luma CTB can contain, for example, 64×64, 32×32, or 16×16 luma sample values. A CTU includes one or more coding units. A coding unit ("CU") has a luma coding block ("CB") and two corresponding chroma CBs.)

select a first area within a coded area of the image, wherein the first area comprises, at least, a first part and a second part, (Li, Fig. 32, shows the first part as 3230 and the second part as 3220. [0306]: For the possibly in-loop-filtered reference sample values (3230), in-loop filtering can be completed without waiting for reconstructed sample values of the current block (3210). In some example implementations, to improve the effectiveness of intra-picture prediction of the current block (3210), reference sample values are accessed for the intra-picture prediction of the current block (3210) after in-loop filtering is performed on those reference sample values if the in-loop filtering can be performed without using reconstructed sample values of the current block (3210). This condition is satisfied for non-adjacent reference lines farther away from the current block (3210), outside the range of in-loop filtering of the edge between the adjacent reference line and current block (3210). [0307]: In FIG. 32, the reference sample values also include reference sample values (3220) that cannot be in-loop filtered prior to the intra-picture prediction for the current block (3210). For the right-most reference columns, closest to the current block (3210), in-loop filtering depends on reconstructed sample values of the current block (3210). Similarly, for the bottom-most reference rows, which are not shown in FIG. 32, in-loop filtering depends on reconstructed sample values of the current block (3210).)

wherein the first part is at least partially different from the second part; (Li, Fig. 32; [0306]-[0307], quoted above.)

perform in-loop filtering of, at least, the first part of the first area; (Li, Fig. 32; [0306]-[0307], quoted above.)

perform intra prediction for the selected coding unit based, at least partially, on the first area, comprising the in-loop filtered first part; (Li, Fig. 32; [0306]-[0307], quoted above.)

and output an intra-prediction block for the selected coding unit based on the performed intra prediction. (Li [0310]: FIG. 33 shows a generalized technique (3300) for encoding that includes intra-picture prediction that uses in-loop-filtered sample values of a non-adjacent reference line. An encoder such as the video encoder (340) of FIG. 3, another video encoder, or an image encoder can perform the technique (3300). [0311]: The encoder receives (3310) a picture, encodes (3320) the picture to produce encoded data, and outputs (3330) the encoded data as part of a bitstream. As part of the encoding (3320), the encoder performs intra-picture prediction for a current block of sample values in the picture. A non-adjacent reference line of sample values is available for the intra-picture prediction. When it performs intra-picture prediction for the current block, the encoder selects the non-adjacent reference line of sample values for use in the intra-picture prediction for the current block. At least some of the sample values of the selected non-adjacent reference line have been modified by in-loop filtering prior to use in the intra-picture prediction for the current block. For the in-loop filtering, none of the modified sample values of the selected reference line is dependent on any of the sample values of the current block.)

Regarding claims 34, 43, and 50: Li teaches the apparatus of claim 33, wherein the second part is not in-loop filtered, and wherein the intra prediction for the selected coding unit is performed further based on the second part that is not in-loop filtered. (Li [0307], quoted above.)

Regarding claims 36, 45, and 52: Li teaches the apparatus of claim 33, wherein to perform the in-loop filtering, the apparatus is further caused to perform at least one of: a deblocking procedure; a sample adaptive offset procedure; an adaptive loop filter procedure; a cross-component sample adaptive offset procedure; a bilateral filter procedure; an alternative band classifier for the adaptive loop filter procedure; a cross-component adaptive loop filter procedure; or a neural network-based filter procedure. (Li [0099]: The video encoder (340) can also perform in-loop filtering (e.g., deblock filtering and/or SAO filtering) during intra-picture coding, to filter reference sample values of non-adjacent reference lines prior to subsequent intra-picture prediction for a current block, as described below. Other filtering (such as de-ringing filtering or adaptive loop filtering ("ALF"); not shown) can alternatively or additionally be applied. Tile boundaries can be selectively filtered or not filtered at all, depending on settings of the video encoder (340), and the video encoder (340) may provide syntax elements within the coded bitstream to indicate whether or not such filtering was applied.)

Regarding claims 37 and 46: Li teaches the apparatus of claim 33, wherein the image comprises a current image, and wherein the current image comprises the coded area (Li teaches that part of the block is coded and part is uncoded. [0274]: When performing intra-picture prediction, reference sample values may be unavailable because they lie outside a picture boundary or slice boundary. Or, reference sample values may be unavailable because they are part of a block that has not yet been encoded/decoded/reconstructed.) and at least one uncoded area.
(Li [0302]: For intra-picture prediction of a current block, the current block has not yet been encoded and reconstructed, so the sample values of the current block are not available for use by the deblocking filter or SAO filter. [0280]: FIG. 27 shows a first example (2700) of mode-dependent padding to replace unavailable sample values for intra-picture prediction of sample values of a current block (2710). In FIG. 27, the sample value at position A is unavailable. The reference sample values are at integer-sample offsets in the picture. The unavailable sample value [the uncoded area] at position A is replaced with a reconstructed reference sample value at integer-sample offsets. [0304]: In-loop filtering of reference sample values for intra-picture prediction can be used in combination with one or more other innovations described herein. For example, in-loop-filtered reference sample values in a non-adjacent reference frame can be used in intra-picture prediction with residue compensation or weighted prediction, or can be further filtered or used in mode-dependent padding. [0306], FIG. 32: The possibly in-loop-filtered reference sample values (3230) have no dependencies on the current block (3210) or any other block that has not yet been decoded [the uncoded area].)

Regarding claims 39 and 48: Li teaches the apparatus of claim 33, wherein the first area of the coded area of the image comprises at least one of: a reference area of the image for intra block copy, wherein the intra block copy comprises, and wherein the apparatus is further caused to determine a first prediction block in the reference area for the selected coding unit; (Li [0084]: For intra block copy mode, the intra-picture prediction estimator (440) determines how to predict sample values of a block of the current picture (331) using an offset (sometimes called a block vector) that indicates a previously encoded/decoded portion of the current picture (331). Intra block copy mode can be implemented as a special case of inter-picture prediction in which the reference picture is the current picture (331), and only previously encoded/decoded sample values of the current picture (331) can be used for prediction. [0085]: According to the intra prediction data (442), the intra-picture predictor (445) spatially predicts sample values of a block of the current picture (331) from previously reconstructed sample values of the current picture (331), producing intra-picture predicted sample values for the block. In doing so, the intra-picture predictor (445) can use one or more of the features of intra-picture prediction described below, e.g., intra-picture prediction with multiple candidate reference lines available, weighted prediction, residue compensation, mode-dependent padding to replace unavailable sample values, filtering of reference sample values and/or predicted sample values. Or, the intra-picture predictor (445) predicts sample values of the block using intra block copy prediction, using an offset (block vector) for the block. [0116]: The intra-picture predictor (645) can use one or more of the features of intra-picture prediction described below, e.g., intra-picture prediction with multiple candidate reference lines available, weighted prediction, residue compensation, mode-dependent padding to replace unavailable sample values, filtering of reference sample values and/or predicted sample values. Or, for intra block copy mode, the intra-picture predictor (645) predicts the sample values of a current block using previously reconstructed sample values of a reference block, which is indicated by an offset (block vector) for the current block.) or a search area of the image for intra template matching prediction, wherein the intra template matching prediction, and wherein the apparatus is further caused to determine a template within the search area of the selected coding unit, and determine a second prediction block, in the search area, for the selected coding unit based, at least partially, on a defined template. (Part of an "or" condition; no rejection is required.)

Regarding claim 40: Li teaches the apparatus of claim 33, wherein the apparatus comprises a decoder of the image. (Li [0021]: FIG. 5 is a diagram of an example decoder system in conjunction with which some described embodiments can be implemented.)

Regarding claim 41: Li teaches the apparatus of claim 33, wherein the apparatus comprises an encoder of the image. (Li [0020]: FIGS. 4a and 4b are diagrams illustrating an example video encoder in conjunction with which some described embodiments can be implemented.)

Claim Rejections - 35 U.S.C. § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: "A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made."

Claims 35, 38, 44, 47 and 51 are rejected under 35 U.S.C. 103 as being unpatentable over Li (U.S. Pub. No. 2020/0359016 A1) in view of Budagavi (U.S. Pub. No. 2017/0195670 A1).

Regarding claims 35, 44, and 51:
Li teaches the apparatus of claim 33, wherein the intra prediction for the selected coding unit is performed based on the first area and, at least partially, on at least one of: padding for pixels in an uncoded area of the image; or pre-determined coding values for the uncoded area of the image. (Li [0302], [0274], [0280], [0304], and [0306], quoted above in the rejection of claims 37 and 46.)

Li does not explicitly teach wherein the apparatus is further caused to perform in-loop filtering of the second part, comprising the in-loop filtered first part and the in-loop filtered second part.

However, Budagavi teaches wherein the apparatus is further caused to perform in-loop filtering of the second part (Budagavi, FIG. 3A, shows different parts of an LCU using different shading. [0031]: The pixels in region 306 are filtered when the immediate bottom neighboring LCU (x, y+1) is processed, and the application of ALF filtering in the other "shaded" regions 300, 302, 304 differs. [0011]: In one aspect, an apparatus configured for applying an adaptive loop filter to reconstructed pixel values of a reconstructed largest coding unit (LCU) of a reconstructed picture is provided, wherein the adaptive loop filter is a symmetric two-dimensional (2D) finite impulse response (FIR) filter.), comprising the in-loop filtered first part and the in-loop filtered second part. (Budagavi [0090]: Various in-loop filters may be applied to the reconstructed picture data to improve the quality of the reference picture data used for encoding/decoding of subsequent pictures. The in-loop filters may include a deblocking filter component 930, a sample adaptive offset filter (SAO) component 932, and an adaptive loop filter (ALF) component 934. The in-loop filters 930, 932, 934 are applied to each reconstructed LCU in the picture and the final filtered reference picture data is provided to the storage component 918.)

It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to modify Li by further incorporating Budagavi in video/camera technology. One would be motivated to do so to incorporate performing in-loop filtering of the second part, comprising the in-loop filtered first part and the in-loop filtered second part. This functionality would improve efficiency with predictable results.

Regarding claims 38 and 47: Li teaches the apparatus of claim 33. Li does not explicitly teach wherein the coded area comprises a plurality of pixels, and wherein the image comprises part of a video. However, Budagavi teaches wherein the coded area comprises a plurality of pixels, and wherein the image comprises part of a video. (Budagavi [0027]: In HEVC, a largest coding unit (LCU) is the base unit used for block-based coding. A picture is divided into non-overlapping LCUs. That is, an LCU plays a similar role in coding as the macroblock of H.264/AVC, but it may be larger, e.g., 32×32, 64×64, etc. An LCU may be partitioned into coding units (CU). A CU is a block of pixels within an LCU, and the CUs within an LCU may be of different sizes. The partitioning is a recursive quadtree partitioning. The quadtree is split according to various criteria until a leaf is reached, which is referred to as the coding node or coding unit. The maximum hierarchical depth of the quadtree is determined by the size of the smallest CU (SCU) permitted. The coding node is the root node of two trees, a prediction tree and a transform tree. A prediction tree specifies the position and size of prediction units (PU) for a coding unit. A transform tree specifies the position and size of transform units (TU) for a coding unit. A transform unit may not be larger than a coding unit, and the size of a transform unit may be, for example, 4×4, 8×8, 16×16, and 32×32. The sizes of the transform units and prediction units for a CU are determined by the video encoder during prediction based on minimization of rate/distortion costs.)
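The Budagavi [0027] passage cited against claims 38 and 47 describes HEVC's recursive quadtree partitioning of an LCU into CUs. The sketch below is illustrative only, not code from either reference: the `should_split` callback is a hypothetical stand-in for the encoder's rate/distortion decision, which the citation mentions but does not specify.

```python
def split_quadtree(x, y, size, scu_size, should_split):
    """Recursively partition a block into coding units (CUs).

    An LCU (e.g. 64x64) is split into four quadrants until a leaf CU
    is reached or the smallest-CU (SCU) size is hit, mirroring the
    recursive quadtree partitioning Budagavi [0027] describes.
    """
    if size <= scu_size or not should_split(x, y, size):
        return [(x, y, size)]  # leaf coding unit: (top-left x, y, side)
    half = size // 2
    cus = []
    for dy in (0, half):
        for dx in (0, half):
            cus += split_quadtree(x + dx, y + dy, half, scu_size, should_split)
    return cus

# Example: split a 64x64 LCU one level deep, yielding four 32x32 CUs
cus = split_quadtree(0, 0, 64, 8, lambda x, y, size: size > 32)
```

In a real encoder the split decision would minimize a rate/distortion cost rather than a fixed size threshold; the recursion structure is the same.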
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to NASIM N NIRJHAR, whose telephone number is (571) 272-3792. The examiner can normally be reached Monday - Friday, 8 am to 5 pm ET.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, William F Kraig, can be reached at (571) 272-8660. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/NASIM N NIRJHAR/
Primary Examiner, Art Unit 2896
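The distinction the §102 rejection leans on (Li [0306]-[0307]) is that a reference line can be in-loop filtered ahead of intra prediction only when the filtering does not read reconstructed samples of the current block. A minimal sketch of that availability test, where `filter_reach` is an assumed illustrative parameter (how many samples the deblocking/SAO filter reads across a block edge), not a value taken from either reference:

```python
def can_prefilter_reference_line(line_offset: int, filter_reach: int) -> bool:
    """Return True if a reference line may be in-loop filtered before
    intra prediction of the current block.

    Illustrative sketch of Li [0306]-[0307]: non-adjacent lines farther
    from the current block than the filter's reach (region 3230 in
    Fig. 32) have no dependency on the block and can be filtered first;
    closer lines (region 3220) cannot.
    """
    return line_offset > filter_reach

# The adjacent reference line (offset 1) cannot be pre-filtered;
# a non-adjacent line beyond the assumed filter reach can.
adjacent_ok = can_prefilter_reference_line(1, 3)
non_adjacent_ok = can_prefilter_reference_line(4, 3)
```

This is the dependency condition Li's [0310]-[0311] encoding technique enforces: none of the in-loop-filtered samples of the selected reference line may depend on samples of the current block.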

Prosecution Timeline

Dec 26, 2024
Application Filed
Nov 30, 2025
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12598324
DEPTH DIFFERENCES IN PLACE OF MOTION VECTORS
2y 5m to grant Granted Apr 07, 2026
Patent 12593131
VELOCITY MATCHING IMAGING OF A TARGET ELEMENT
2y 5m to grant Granted Mar 31, 2026
Patent 12593074
SYSTEMS AND METHODS OF BUFFERING IMAGE DATA BETWEEN A PIXEL PROCESSOR AND AN ENTROPY CODER
2y 5m to grant Granted Mar 31, 2026
Patent 12587662
METHOD, APPARATUS AND STORAGE MEDIUM FOR IMAGE ENCODING/DECODING
2y 5m to grant Granted Mar 24, 2026
Patent 12587628
DISPLAY DEVICE AND METHOD OF DRIVING THE SAME
2y 5m to grant Granted Mar 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
74%
Grant Probability
93%
With Interview (+18.7%)
2y 6m
Median Time to Grant
Low
PTA Risk
Based on 512 resolved cases by this examiner. Grant probability derived from career allow rate.
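The projection figures are consistent with simple arithmetic on the career numbers shown in the examiner panel. The vendor's actual model is not disclosed; the snippet below only reproduces the displayed values:

```python
granted, resolved = 379, 512        # career totals from the examiner panel
interview_lift = 0.187              # displayed +18.7% interview lift

allow_rate = granted / resolved     # ~0.740, the career allow rate

grant_probability = round(allow_rate * 100)                  # 74%
with_interview = round((allow_rate + interview_lift) * 100)  # 93%
```

That is, 379/512 yields the 74% grant probability, and adding the 18.7-point interview lift yields the 93% with-interview figure.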
