Prosecution Insights
Last updated: April 19, 2026
Application No. 18/684,369

METHOD AND APPARATUS FOR FEATURE ENCODING/DECODING ON BASIS OF CORRELATION BETWEEN CHANNELS, AND RECORDING MEDIUM HAVING BITSTREAM STORED THEREIN

Final Rejection — §102, §103
Filed
Feb 16, 2024
Examiner
ZHOU, ZHIHAN
Art Unit
2482
Tech Center
2400 — Computer Networks
Assignee
LG Electronics Inc.
OA Round
2 (Final)
Grant Probability: 79% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 3m
With Interview: 81%

Examiner Intelligence

Grants 79% — above average
Career Allow Rate: 79% (784 granted / 987 resolved; +21.4% vs TC avg)
Interview Lift: +1.3% — minimal, across resolved cases with interview
Typical Timeline: 2y 3m avg prosecution; 28 currently pending
Career History: 1,015 total applications across all art units

Statute-Specific Performance

§101: 5.4% (-34.6% vs TC avg)
§103: 54.8% (+14.8% vs TC avg)
§102: 18.5% (-21.5% vs TC avg)
§112: 2.0% (-38.0% vs TC avg)
Tech Center averages are estimates • Based on career data from 987 resolved cases
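The per-statute deltas above are simple differences against a single Tech Center baseline. Working backward from any row (e.g. 54.8% - 14.8% = 40.0%) yields the same baseline for every statute. A minimal sketch of that arithmetic, assuming the inferred 40.0% baseline (it is not stated anywhere on this page):

```python
# Sketch of the per-statute deltas shown above. The 40.0% Tech Center
# baseline is inferred from the stated deltas (e.g. 54.8% - 14.8% = 40.0%);
# it is an assumption, not a figure given on the page.
TC_AVG_ESTIMATE = 40.0  # percent

# Examiner's per-statute figures, as listed above
examiner_rates = {"101": 5.4, "103": 54.8, "102": 18.5, "112": 2.0}

def delta_vs_tc(rate: float, tc_avg: float = TC_AVG_ESTIMATE) -> float:
    """Signed difference from the TC baseline, in percentage points."""
    return round(rate - tc_avg, 1)

deltas = {statute: delta_vs_tc(rate) for statute, rate in examiner_rates.items()}
```

Each computed delta matches the figure shown for its statute (for instance, §101 at 5.4% gives -34.6).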

Office Action

Rejections: §102, §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

This office action is in response to an amendment filed on 01/08/2026.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 7, 10, 12, 14, and 15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Lee (US 2018/0365794).

As to claim 1, Lee teaches a feature decoding method performed by a feature decoding apparatus, the feature decoding method comprising: obtaining, from a bitstream, first information on whether to reorder a plurality of channels included in a current feature map; obtaining second information on a reordering unit of the plurality of channels from the bitstream, based on the first information specifying that the plurality of channels is reordered (paras. 66, 74-76, and 83-93); determining the reordering unit of the plurality of channels based on the second information, the reordering unit being determined to be one of a picture unit, a feature group unit, or a channel unit; obtaining third information on a reconstruction order of the plurality of channels from the bitstream, based on the reordering unit of the plurality of channels (paras. 66, 74-76, and 83-93; also see FIGs. 6-8 and their corresponding paragraphs); and reconstructing the plurality of channels based on the reconstruction order of the plurality of channels determined based on the third information (paras. 66, 74-76, and 83-93).
As to claim 10, Lee teaches a feature encoding method performed by a feature encoding apparatus, the feature encoding method comprising: determining whether to reorder a plurality of channels included in a current feature map; determining a reordering unit of the plurality of channels based on it being determined that the plurality of channels is reordered, the reordering unit being determined to be one of a picture unit, a feature group unit, or a channel unit (paras. 66, 74-76, and 83-93; also see FIGs. 6-8 and their corresponding paragraphs); reordering the plurality of channels based on the determined reordering unit; and encoding the plurality of reordered channels and information related to the reordering (paras. 66, 74-76, and 83-93), wherein the information related to the reordering comprises first information on whether to reorder the plurality of channels, second information on the determined reordering unit, and third information on the order before the reordering of the plurality of channels (paras. 66, 74-76, and 83-93).

As to claim 15, Lee teaches a method of transmitting a bitstream comprising: determining whether to reorder a plurality of channels included in a current feature map; determining a reordering unit of the plurality of channels based on it being determined that the plurality of channels is reordered, the reordering unit being determined to be one of a picture unit, a feature group unit, or a channel unit (paras. 12, 66, 74-76, 83-93, and 109-111; also see FIGs. 6-8 and their corresponding paragraphs); reordering the plurality of channels based on the determined reordering unit; encoding the plurality of reordered channels and information related to the reordering to generate the bitstream; and transmitting the bitstream, wherein the information related to the reordering comprises first information on whether to reorder the plurality of channels, second information on the determined reordering unit, and third information on the order before the reordering of the plurality of channels (paras. 12, 66, 74-76, 83-93, and 109-111).

As to claims 7 and 14, Lee further teaches wherein at least one second block included in the plurality of channels is reconstructed and encoded based on inter prediction, and the inter prediction is performed based on prediction information of a reconstructed reference block included in a channel different from the second block (paras. 87, 90, and 92).

As to claim 12, Lee further teaches wherein the third information is encoded in a parameter set at a level determined based on the reordering unit (paras. 66, 74-76, and 83-93).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 3, 5, 6, and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Lee in view of Cho (US 2018/0350110).
As to claim 3, Lee does not teach wherein the third information is obtained, based on the reordering unit of the plurality of channels being determined to be the picture unit, from a first parameter set at a picture level included in the bitstream. However, Lee does teach wherein the third information is obtained from a parameter set at a picture level included in the bitstream, the reordering unit of the plurality of channels being determined in a coding unit (paras. 66, 74-76, and 83-93; also see FIGs. 6-8 and their corresponding paragraphs).

In addition, Cho teaches wherein the receiver 460 may extract the encoded image data from the payload of the bitstream, and may further extract information on a coding unit, prediction mode information on each coding unit, size information on the transform unit for transformation, and the like from a header or a parameter set raw byte (RB) sequence payload (RBSP). The information on the coding unit may include at least one of the type and size of the coding unit. The coding unit determiner 470 according to an embodiment determines a coding unit for decoding the multi-channel feature map images based on the information on the coding unit obtained by the image decoding device 450. As in the image encoding device 400, the coding unit determiner 470 of the image decoding device 450 may determine the coding unit to have a pixel depth corresponding to the number of channels in a channel direction in sets of feature map images (paras. 105-106).

The sets of re-ordered feature map images have a higher degree of correlation between feature map images adjacent to each other in a channel direction. Therefore, the feature map images adjacent to each other in a channel direction may have similar directions of inter-pixel prediction. The image coding device 400 performs prediction by setting sub-blocks having similar prediction directions in the coding unit 850 as a basic predictor at the time of prediction of the coding unit 850. Encoding is performed in the sub-blocks having similar prediction directions in the same prediction mode. The sub-blocks are divided in a depth direction of a channel from the coding unit 850, and are referred to as the sub-coding units 860 (para. 136).

In the CNN computational operation of size N×N×K as described above in FIG. 5, the memory 270 may simultaneously read/write pixel values of multi-channel feature map images for each line or each row for image processing and pipeline processing of the CNN computational operation. To this end, the image encoding device 400 may encode the multi-channel feature map images 1010, 1020, 1030, and 1040 in a predetermined order based on a coding unit, and may generate encoded data of a horizontal section having a width direction and a channel direction among sets of the multi-channel feature map images 1010, 1020, 1030, and 1040 and store the encoded data in a specific partition of the bitstream. For example, in a case of a CNN computational operation for an N×N×K region, an N×K plane may be independently compressed and stored in a specific partition of the bitstream. When K is smaller than a predetermined set value, each row in feature map images may be determined as a coding unit. That is, the coding unit may be determined not in a channel direction of sets of multi-channel feature map images but in a row direction of each feature map image.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Lee’s system with Cho’s system to show wherein the third information is obtained, based on the reordering unit of the plurality of channels being determined to be the picture unit, from a first parameter set at a picture level included in the bitstream in order to provide more efficient encoding and improve compression performance.

As to claim 5, Lee does not teach wherein the third information is obtained, based on the reordering unit of the plurality of channels being determined to be the channel unit, from a third parameter set at a feature level included in the bitstream. However, Lee does teach wherein the third information is obtained from a parameter set at a level included in the bitstream, the reordering unit of the plurality of channels being determined to be a channel unit in a coding unit (paras. 66, 74-76, and 83-93; also see FIGs. 6-8 and their corresponding paragraphs).

In addition, Cho teaches wherein the receiver 460 may extract the encoded image data from the payload of the bitstream, and may further extract information on a coding unit, prediction mode information on each coding unit, size information on the transform unit for transformation, and the like from a header or a parameter set raw byte (RB) sequence payload (RBSP). The information on the coding unit may include at least one of the type and size of the coding unit. The coding unit determiner 470 according to an embodiment determines a coding unit for decoding the multi-channel feature map images based on the information on the coding unit obtained by the image decoding device 450. As in the image encoding device 400, the coding unit determiner 470 of the image decoding device 450 may determine the coding unit to have a pixel depth corresponding to the number of channels in a channel direction in sets of feature map images (paras. 105-106).
The sets of re-ordered feature map images have a higher degree of correlation between feature map images adjacent to each other in a channel direction. Therefore, the feature map images adjacent to each other in a channel direction may have similar directions of inter-pixel prediction. The image coding device 400 performs prediction by setting sub-blocks having similar prediction directions in the coding unit 850 as a basic predictor at the time of prediction of the coding unit 850. Encoding is performed in the sub-blocks having similar prediction directions in the same prediction mode. The sub-blocks are divided in a depth direction of a channel from the coding unit 850, and are referred to as the sub-coding units 860 (para. 136).

In the CNN computational operation of size N×N×K as described above in FIG. 5, the memory 270 may simultaneously read/write pixel values of multi-channel feature map images for each line or each row for image processing and pipeline processing of the CNN computational operation. To this end, the image encoding device 400 may encode the multi-channel feature map images 1010, 1020, 1030, and 1040 in a predetermined order based on a coding unit, and may generate encoded data of a horizontal section having a width direction and a channel direction among sets of the multi-channel feature map images 1010, 1020, 1030, and 1040 and store the encoded data in a specific partition of the bitstream. For example, in a case of a CNN computational operation for an N×N×K region, an N×K plane may be independently compressed and stored in a specific partition of the bitstream. When K is smaller than a predetermined set value, each row in feature map images may be determined as a coding unit. That is, the coding unit may be determined not in a channel direction of sets of multi-channel feature map images but in a row direction of each feature map image. Here, encoded data of a horizontal section having a width direction and a channel direction among the sets of the multi-channel feature map images 1010, 1020, 1030, and 1040 may be stored in a specific partition of the bitstream (paras. 149-150).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Lee’s system with Cho’s system to show wherein the third information is obtained, based on the reordering unit of the plurality of channels being determined to be the channel unit, from a third parameter set at a feature level included in the bitstream in order to provide more efficient encoding and improve compression performance.

As to claims 6 and 13, Lee does not teach wherein at least one first block included in the plurality of channels is reconstructed and encoded based on intra prediction, and the intra prediction is performed based on reconstructed reference samples included in a channel different from the first block. However, Lee does teach wherein at least one second block included in the plurality of channels is reconstructed and encoded based on prediction, and the prediction is performed based on prediction information of a reconstructed reference block included in a channel different from the second block (paras. 87, 90, and 92). In addition, Cho teaches intra-predictors performing intra-prediction on an intra-mode coding unit and inter-predictors performing inter-prediction on an inter-mode coding unit as well as inter-channel prediction (paras. 95, 107, 128, 132, and 157-170).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Lee’s system with Cho’s system to show wherein at least one first block included in the plurality of channels is reconstructed and encoded based on intra prediction, and the intra prediction is performed based on reconstructed reference samples included in a channel different from the first block in order to provide more efficient encoding and improve compression performance.

Allowable Subject Matter

Claims 4, 8, and 9 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Response to Arguments

Applicant's arguments filed 01/08/2026 have been fully considered but they are not persuasive. Examiner maintains that Lee discloses obtaining second information on a reordering unit of the plurality of channels from the bitstream, based on the first information specifying that the plurality of channels is reordered (paras. 66, 74-76, and 83-93); determining the reordering unit of the plurality of channels based on the second information, the reordering unit being determined to be one of a picture unit, a feature group unit, or a channel unit; and obtaining third information on a reconstruction order of the plurality of channels from the bitstream, based on the reordering unit of the plurality of channels (paras. 66, 74-76, and 83-93; also see FIGs. 6-8 and their corresponding paragraphs). In the above citations, it is clear that reordering operations are being performed on a channel level, which would result in the reordering unit being a channel unit.
In response to applicant's argument that the references fail to show certain features of the invention, it is noted that the features upon which applicant relies (i.e., determining a reordering unit adaptively among different levels such as a picture unit, a channel group unit, or a channel unit) are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993).

Specifically, regarding the Lee reference, Examiner points out that the operations performed on the reordering unit being done at a channel level satisfies the limitation of “the reordering unit being determined to be one of a picture unit, a feature group unit, or a channel unit”. There is no requirement in the claim limitation that the reordering unit being determined to be a channel (or some other) unit is done adaptively, just as long as the reordering unit happens to be determined to be a channel (or some other) unit. In view of the above reasons, Examiner maintains all rejections.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ZHIHAN ZHOU, whose telephone number is (571) 270-7284. The examiner can normally be reached Mondays-Fridays, 8:30am-5pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christopher Kelley, can be reached at 571-272-7331. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ZHIHAN ZHOU/
Primary Examiner, Art Unit 2482
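The claim 1 signaling flow the rejection walks through (first information: whether to reorder; second information: the reordering unit; third information: the reconstruction order) can be sketched as a decoder-side routine. This is an illustrative sketch of the claim language only, not code from the application or the cited references; the element names (reorder_flag, reorder_unit, reconstruction_order) and the dict-based bitstream are invented for illustration.

```python
# Hypothetical sketch of the claim-1 decoding flow. All names below are
# invented for illustration; the application defines no such syntax elements.

# The claimed reordering units: picture, feature group, or channel.
PICTURE_UNIT, FEATURE_GROUP_UNIT, CHANNEL_UNIT = 0, 1, 2

def decode_feature_map(bitstream: dict) -> list:
    channels = bitstream["channels"]
    # First information: whether the channels were reordered at all.
    if not bitstream.get("reorder_flag", False):
        return channels
    # Second information: the reordering unit, one of the three levels.
    unit = bitstream["reorder_unit"]
    assert unit in (PICTURE_UNIT, FEATURE_GROUP_UNIT, CHANNEL_UNIT)
    # Third information: the reconstruction order, signaled per the unit.
    order = bitstream["reconstruction_order"]
    # Reconstruct the channels in the signaled order.
    return [channels[i] for i in order]
```

For example, with three channels and a signaled order [2, 0, 1], the decoder emits the channels in that order; when the flag is absent or false, the channels pass through unchanged. The dispute in the Response to Arguments is about whether the unit must be chosen adaptively, not about this basic parse-then-reorder flow.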

Prosecution Timeline

Feb 16, 2024
Application Filed
Oct 06, 2025
Non-Final Rejection — §102, §103
Jan 08, 2026
Response Filed
Feb 24, 2026
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602830
METHOD FOR CALIBRATING CAMERAS OF A MULTICHANNEL MEDICAL VISUALIZATION SYSTEM AND MEDICAL VISUALIZATION SYSTEM
2y 5m to grant — granted Apr 14, 2026
Patent 12604043
Sample Adaptive Offset (SAO) Parameter Signaling
2y 5m to grant — granted Apr 14, 2026
Patent 12597167
SYSTEM AND METHODS FOR DETERMINING CAMERA PARAMETERS AND USES THEREOF
2y 5m to grant — granted Apr 07, 2026
Patent 12593055
SELECTIVE JUST-IN-TIME TRANSCODING
2y 5m to grant — granted Mar 31, 2026
Patent 12593039
CROSS-COMPONENT SAMPLE OFFSET (CCSO) WITH ADAPTIVE MULTI-TAP-FILTER CLASSIFIERS
2y 5m to grant — granted Mar 31, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 79%
With Interview: 81% (+1.3%)
Median Time to Grant: 2y 3m
PTA Risk: Moderate
Based on 987 resolved cases by this examiner. Grant probability derived from career allow rate.
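The projection figures above follow directly from the examiner's career counts. A minimal sketch of that arithmetic (the raw counts and the +1.3-point interview lift are taken from this page; rounding to whole percentages for display is an assumption):

```python
# Sketch of the projection arithmetic: grant probability is the career
# allow rate (784 granted of 987 resolved), and the interview figure adds
# the +1.3 percentage-point lift reported above.
GRANTED, RESOLVED = 784, 987
INTERVIEW_LIFT_PTS = 1.3  # percentage points, from the Examiner Intelligence panel

allow_rate = 100 * GRANTED / RESOLVED             # ~79.4, displayed as 79%
with_interview = allow_rate + INTERVIEW_LIFT_PTS  # ~80.7, displayed as 81%
```

Note that the displayed "+1.3%" reconciles the 79% and 81% headline figures only before rounding: 79.4 + 1.3 = 80.7, which rounds to 81.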
