DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 03/02/2026 has been entered.
Response to Arguments
Applicant’s arguments with respect to claims 1-3 and 5 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-3 and 5 are rejected under 35 U.S.C. 103 as being unpatentable over NA et al. (US 20170310976 A1) in view of Hannuksela (US 20190082184 A1) and LIN et al. (US 20170353737 A1).
Regarding claims 3 and 5 (Currently Amended): NA discloses a method of encoding an image with an encoding apparatus ([0001] video encoding and decoding methods; figure 11), comprising:
determining whether to perform type-based partitioning for a current block included in the image ([0205] it is determined whether to split each of the coding units; [0224] whether encoding is performed on coding units of a lower depth instead of a current depth), the current block being partitioned into a plurality of sub-partitions when the type-based partitioning is performed for the current block ([0213] Examples of a partition type; [0218] the coding unit determiner 120 not only determines a coded depth having a least encoding error, but also determines a partition type in a prediction unit), the determination being encoded into encoding data for the current block ([0224] The information about the coded depth may be defined by using split information according to depths, which indicates whether encoding is performed on coding units of a lower depth instead of a current depth; [0298]; [0319]);
generating a prediction block of the current block included in the image, the prediction block being generated for each of the plurality of sub-partitions when the type-based partitioning is performed for the current block ([0214] The encoding is independently performed on one prediction unit in a coding unit; figure 11, [0264] an intra predictor 420 performs intra prediction on coding units in an intra mode according to prediction units, from among a current frame 405, and an inter predictor 415 performs inter prediction on coding units in an inter mode by using a current image 405 and a reference image obtained from a reconstructed picture buffer 410 according to prediction units);
obtaining a residual block based on the prediction block, the residual block being obtained for each of the plurality of sub-partitions when the type-based partitioning is performed for the current block ([0265] Residue data is generated by removing prediction data regarding coding units of each mode that is output from the intra predictor 420 or the inter predictor 415 from data regarding encoded coding units of the current image 405);
encoding residual coefficients of the residual block ([0222] The encoded image data may be obtained by encoding residual data of an image; [0265] Residue data is output as a quantized transformation coefficient according to transformation units through a transformer 425 and a quantizer 430);
including syntax elements for the residual coefficients into encoding data for the current block ([0218] Encoding information according to coding units; [0229] The encoding information according to the deeper coding units may include the information about the prediction mode and about the size of the partitions. The encoding information according to the prediction units may include information about an estimated direction of an inter mode, about a reference image index of the inter mode, about a motion vector, about a chroma component of an intra mode, and about an interpolation method of the intra mode); and
encoding post processing information into the encoding data (figure 11, [0265] The reconstructed data in the space domain is generated as reconstructed images through a de-blocking unit 455 and an SAO performer 460 and the reconstructed images are stored in the reconstructed picture buffer 410. The reconstructed images stored in the reconstructed picture buffer 410 may be used as reference images for inter prediction of another image; figure 12, [0269] An entropy decoder 515 parses encoded image data to be decoded and encoding information required for decoding from a bitstream 505; [0271] (inherently the de-blocking and SAO information is parsed from the bitstream)),
wherein the residual coefficients are encoded by performing both transformation and quantization ([0265] Residue data is output as a quantized transformation coefficient according to transformation units through a transformer 425 and a quantizer 430),
wherein the post processing information is used for performing a post processing on a reconstructed image, the reconstructed image being an image obtained based on the residual block of the current block (figure 11, [0265] The reconstructed data in the space domain is generated as reconstructed images through a de-blocking unit 455 and an SAO performer 460 and the reconstructed images are stored in the reconstructed picture buffer 410. The reconstructed images stored in the reconstructed picture buffer 410 may be used as reference images for inter prediction of another image; figure 12, [0269] An entropy decoder 515 parses encoded image data to be decoded and encoding information required for decoding from a bitstream 505; [0271] (inherently the de-blocking and SAO information is parsed from the bitstream)).
However, NA does not explicitly disclose
wherein the post processing comprises padding at least one region to the reconstructed image,
wherein the padding is performed by
determining at least one equation which uses a width of the image or a height of the image,
determining a sample inside the reconstructed image based on the equation, and
using the sample inside the reconstructed image for padding the region.
Hannuksela discloses
wherein the post processing comprises padding at least one region to the reconstructed image (abstract, filling an area outside the effective picture area to produce a padded reference picture, wherein the filled area forms a boundary extension),
wherein the padding is performed by
determining at least one equation which uses a width of the image or a height of the image ([0225] by copying the signal of the effective picture area representing 360-degree panoramic content from the opposite side of the effective picture (inherently, the picture width and height are considered in copying)),
determining a sample inside the reconstructed image based on the equation ([0225] by copying the signal of the effective picture area representing 360-degree panoramic content from the opposite side of the effective picture (inherently, the picture width and height are considered in copying)), and
using the sample inside the reconstructed image for padding the region ([0223] equivalently areas outside the picture boundary in the reconstructed pictures may be padded with border sample values; [0229] the decoder decodes the pictures to obtain reconstructed frame(s) (block 82) and perform frame padding to fill in areas outside an effective picture area to produce a padded reference picture (inherently areas outside an effective picture area may be obtained by considering the width and height of the effective picture area); [0225] by copying the signal of the effective picture area representing 360-degree panoramic content from the opposite side of the effective picture (inherently, the picture width and height are considered in copying)).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of NA and Hannuksela by applying a padding process, in order to code the image more efficiently.
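As a hedged illustration of the padding steps at issue (the function name and the modulo equations are assumptions for this sketch, not Hannuksela's implementation), determining a sample inside the reconstructed picture by equations that use the picture width and height, and using that sample to pad a region outside the picture, can look like:

```python
def pad_wraparound(recon, pad):
    """Wrap-around padding sketch for a reconstructed picture.

    recon: 2-D list of samples (height rows x width columns).
    pad: border size in samples on each side.
    """
    height = len(recon)
    width = len(recon[0])
    padded = []
    for y in range(-pad, height + pad):
        row = []
        for x in range(-pad, width + pad):
            # Equations using the picture width/height select a sample
            # inside the reconstructed picture (opposite-side copy).
            src_y = y % height
            src_x = x % width
            row.append(recon[src_y][src_x])
        padded.append(row)
    return padded
```

In Python, `%` with a positive divisor always yields a non-negative result, so negative coordinates outside the picture wrap to the opposite side, matching the "copy from the opposite side of the effective picture" behavior cited above.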
However, NA in view of Hannuksela does not explicitly disclose
wherein the padding is performed by
partitioning the reconstructed image into a plurality of partitioning units; and
performing the padding for each partitioning unit.
LIN discloses
wherein the padding is performed by
partitioning the reconstructed image into a plurality of partitioning units (figure 30); and
performing the padding for each partitioning unit (figure 30, assigning padding modes for each padding region in a layout scheme).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of NA, Hannuksela, and LIN by also applying the padding for each cube face, in order to code the image more efficiently.
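To illustrate the partition-wise padding mapped to LIN's figure 30 (the function name, the column-wise partitioning, and the constant-value border are hypothetical simplifications, not LIN's layout scheme), partitioning a reconstructed picture into units and padding each unit independently can be sketched as:

```python
def pad_per_unit(recon, unit_w, pad_value):
    """Partition a picture into column units and pad each unit independently.

    recon: 2-D list of samples (rows x columns); unit_w: unit width in samples;
    pad_value: constant sample value used for the padded border.
    """
    width = len(recon[0])
    units = []
    for x0 in range(0, width, unit_w):
        # One partitioning unit: a vertical slice of the picture.
        unit = [row[x0:x0 + unit_w] for row in recon]
        # Pad each partitioning unit independently (here: a constant-value
        # column added on the left and right of the unit).
        padded = [[pad_value] + r + [pad_value] for r in unit]
        units.append(padded)
    return units
```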
Regarding claim 1 (Currently Amended): the same analysis as stated for claim 3 applies (claim 1 recites the corresponding decoding method).
Regarding claim 2 (Original): Hannuksela discloses the method of claim 1, wherein the post processing information is an SEI message ([0146] A non-VCL NAL unit may be a supplemental enhancement information (SEI) NAL unit. Parameter sets may be needed for the reconstruction of decoded pictures; [0147] a sequence parameter set RBSP includes parameters that can be referred to by one or more picture parameter set RBSPs or one or more SEI NAL units containing a buffering period SEI message).
The same analysis as stated for claim 3 applies.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to XIAOLAN XU whose telephone number is (571) 270-7580. The examiner can normally be reached Monday through Friday, 9 am to 5 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, SATH V. PERUNGAVOOR can be reached at (571) 272-7455. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/XIAOLAN XU/ Primary Examiner, Art Unit 2488