DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
2. The information disclosure statement (IDS) was submitted on 11/11/2024. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Status
3. Claims 1-20 are currently pending.
Double Patenting
4. Claim 1 of the instant application is patentably indistinct from claim 1 of issued Patent No. US 12,113,998 (herein "conflicting patent") pursuant to 37 CFR 1.78(f) or pre-AIA 37 CFR 1.78(b).
The nonstatutory obviousness double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees.
When two or more applications filed by the same applicant contain patentably indistinct claims, elimination of such claims from all but one application may be required in the absence of good and sufficient reason for their retention during pendency in more than one application. Applicant is required to either cancel the patentably indistinct claims from all but one application or maintain a clear line of demarcation between the applications. See MPEP § 822.
A nonstatutory double patenting rejection is appropriate where the claims at issue are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on a nonstatutory double patenting ground provided the reference application or patent either is shown to be commonly owned with this application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The USPTO Internet Web site contains terminal disclaimer forms which may be used. Please visit http://www.uspto.gov/forms/. The filing date of the application will determine what form should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to http://www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.
Although the claims at issue are not identically recited, they are not patentably distinct from each other, as explained below:
The “conflicting patent” recites inter alia at claim 1; “an indication of decoder-side-prediction of an intra prediction mode, wherein a first value of the indication indicates that the intra prediction mode is to be determined by the decoder and a second value of the indication indicates that the intra prediction mode is to be determined from the bitstream; and
a reconstructed residual block;
based on the indication having the first value, generating, by the decoder, a plurality of reconstructed blocks for a plurality of intra prediction modes, wherein each reconstructed block of the plurality of reconstructed blocks is generated based on: “
One of ordinary skill in the art would have found it obvious to identify the same coding process in instant claim 1, where the indication is represented by a flag or a directional index signaled in the bitstream indicating an intra-prediction processing of the current block, to be further determined at the decoder for multiple predictors derived from multiple intra prediction modes according to a visual quality metric: "an indication that an intra prediction mode is to be determined at the decoder to decode the block; based on the indication:
generating a plurality of reconstructed blocks for a plurality of intra prediction modes, wherein each of the plurality of reconstructed blocks is generated based on a prediction block generated using a respective intra prediction mode of the plurality of intra prediction modes; and", thus finding a direct correlation of the term "indication" to a signaled syntax element in the form of a flag or index indicating the intra-prediction mode selected from multiple most probable modes (MPM).
Instant Application vs. Conflicting Patent - claim analysis
Specifically, all the components of method claim 1 of the instant application are defined by, and encompass, the limitations claimed in the conflicting patent.
Claim 1 of the "conflicting patent" is, in effect, a "species" of the "generic" invention of instant application claim 1.
It has been held by the Court that the generic invention is "anticipated" by the "species". See In re Goodman, 29 USPQ2d 2010 (Fed. Cir. 1993).
Since instant application claim 1 is anticipated by claim 1 of the conflicting patent, it is deemed patentably indistinct from claim 1 of the conflicting patent, as detailed above.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
5. Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Cheng-Tsai Ho et al. (hereinafter "Ho") (US 2014/0254659) and Ioannis Andreopoulos et al. (hereinafter "Andreopoulos") (US 2021/0211739) in view of Zineb Agyo et al. (hereinafter "Agyo") (US 2017/0264914).
The applied references do not have a common inventor with the instant application.
Re Claim 1. Ho discloses, a method (a visual quality-based coding, Abstract) comprising:
receiving, by a decoder and from a bitstream for a block (receiving a bitstream including source images IMGIN, at decoder, Par.[0046]),
an indication that an intra prediction mode is to be determined (at module 123 of the decoding/prediction loop, Par.[0022-0023, 0028, 0033], Fig.1; selector 123 for intra-prediction mode, in Fig.1) at the decoder to decode the block (an intra-prediction mode being determined at module 123 of the decoder prediction loop, in Fig.1 [media_image1.png, greyscale], Par.[0029]);
based on the indication (based on indication in the bitstream BS carrying encoded frame information corresponding to the source frame IMGIN, Par.[0021]):
generating a plurality of reconstructed blocks for a plurality of intra prediction modes (generating the reconstructed intra-predicted blocks, at the output of the summer 118, as the dotted line for intra prediction connection tract in Fig.1), wherein each of the plurality of reconstructed blocks is generated based on a prediction block generated using a respective intra prediction mode of the plurality of intra prediction modes (generating intra predicted pixel data at module 123 in Fig.1, Par.[0023]); and
selecting, based on a visual quality of each of the plurality of reconstructed blocks, a prediction mode, from the plurality of intra prediction modes, as the intra prediction mode of the block (selecting the best coding mode for the coding unit or prediction unit at decoder loop, based on the visual quality evaluation unit 104, by calculating a plurality of distinct quality metrics of the data in the coding loop of circuit 102, Par.[0025-0026] for each candidate inter or intra mode to decide which candidate should be selected, Par.[0029, 0032] at intra prediction module 123, Par.[0033-0036]); and
decoding the block based on the intra prediction mode (decoding according to the coding parameter(s) set based on the evaluated visual quality included in the received bitstream BS, of the source image IMGIN, Par.[0004, 0020, 0046]).
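For clarity of the record, the decoder-side selection loop recited in claim 1 and mapped above can be illustrated by the following sketch. This is illustrative only and is not the applicant's or the cited references' implementation; the function names and the variance-based quality metric are hypothetical placeholders standing in for the claimed visual quality evaluation.

```python
import numpy as np

def reconstruct(pred_block: np.ndarray, residual: np.ndarray) -> np.ndarray:
    """A reconstructed block is the prediction block plus the decoded residual."""
    return np.clip(pred_block + residual, 0, 255)

def select_intra_mode(candidate_preds: dict, residual: np.ndarray, quality_metric) -> str:
    """Generate a reconstruction per candidate intra mode and select the mode
    whose reconstruction scores highest on the visual quality metric."""
    best_mode, best_score = None, -np.inf
    for mode, pred in candidate_preds.items():
        recon = reconstruct(pred, residual)
        score = quality_metric(recon)
        if score > best_score:
            best_mode, best_score = mode, score
    return best_mode
```

In this sketch the metric is any callable scoring a reconstruction in isolation; a smoothness score such as negative variance is used below purely for demonstration.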
In analogous art regarding visual quality reconstruction of video blocks, Andreopoulos teaches video information fidelity post-processing at the decoder side, as claimed, for the block based on a visual quality of each of the reconstructed blocks (by applying visual quality improvement measures, Par.[0008], [0038-0039]).
Furthermore, to emphasize mode selection and enablement at the decoder side for processing, Agyo is identified as specifically teaching
the indication of the decoder-side prediction being enabled (the signaled intra-prediction mode (107) is encoded and transmitted to the decoder by (105), Fig.1, Par.[0047, 0049, 0055, 0070], as information to the decoder for intra-mode prediction, Par.[0055, 0068]); and
Based on the various visual quality processing at the decoder side, and on known neural-network learning techniques related to visual quality coding, one of ordinary skill in the art would have found it obvious before the effective filing date of the invention to further extend the visual quality method of Ho, applied to both intra and inter prediction modes and based on reconstructed video blocks relying on visual quality metrics, with, among others, the visual quality learning models identified in Andreopoulos, e.g., training a system on scores represented by a blind image quality assessment of reference values to ensure efficient coding (intra prediction, Par.[0003]; improving visual quality, Par.[0006]), where Agyo, at Par.[0070], obviates the signaled intra-prediction mode already implicit in Ho by expressly teaching that the information is received at the decoder, hence rendering the combination predictable.
Re Claim 2. Ho, Andreopoulos and Agyo disclose, the method of claim 1, Ho teaches about, wherein the decoded block comprises a reconstructed block of the plurality of reconstructed blocks generated using the prediction mode (the reconstructed image frame at Par.[0020, 0023] with enhanced visual quality, Par.[0027, 0029, 0033, 0035] and claim 4, etc.).
Re Claim 3. Ho, Andreopoulos and Agyo disclose, the method of claim 1, further comprising:
Ho teaches about,
receiving transform coefficients from the bitstream (receiving the transform coefficients from bitstream unit 113, Fig.1, Par.[0022-0023, 0028, 0035]); and
decoding the transform coefficients to generate a reconstructed residual block (decoding transform coefficients at 113-114-116-117 to generate the residual at adder unit 118, Fig.1 Par.[0022]),
wherein each reconstructed block of the plurality of reconstructed blocks is generated further based on the reconstructed residual block (generating the block reconstruction at modules 119-120, to be stored at frame buffer 121, Fig.1, Par.[0022-0023]).
Re Claim 4. Ho, Andreopoulos and Agyo disclose, the method of claim 3,
Ho teaches about, wherein the reconstructed block, generated for each respective intra prediction mode of the plurality of intra prediction modes, is generated by summing the prediction block generated for the respective intra prediction mode and the reconstructed residual block (reconstruction is based on the intra-predicted block(s) at module 123 output summed to the residual 118 per Fig.1 Par.[0022]).
Re Claim 5. Ho, Andreopoulos and Agyo disclose, the method of claim 1,
Ho teaches about, wherein the prediction mode for the block is selected further based on a visual quality, among the visual qualities of the reconstructed blocks, with a highest visual quality (the best video coding intra mode for block prediction is selected based on visual quality metrics and/or the best pixel-based distortion, Par.[0028-0031]).
Re Claim 6. Ho, Andreopoulos and Agyo disclose, the method of claim 1,
Ho teaches about, wherein the visual quality of each of the reconstructed blocks is determined without using the block as a reference (the visual quality of the reconstructed blocks is determined without using the block as reference, taken from the residual through the decoding loop at modules 123-104-116-117-118 fed back to 104, Fig.1).
Similarly, Andreopoulos teaches the visual quality being achieved, without using the block as a reference (a non-reference based score for quality, producing a blind image quality assessment of x^, Par.[0073], Figs.2a, 2c and 3).
Re Claim 7. Ho, Andreopoulos and Agyo disclose, the method of claim 1,
Andreopoulos teaches that, wherein the visual quality of each of the reconstructed blocks is determined based on a visual parameter measurement index (a structural similarity index metric for determining a cost function, Par.[0032, 0046, 0095]).
Re Claim 8. Ho, Andreopoulos and Agyo disclose, the method of claim 1,
Andreopoulos teaches, wherein the visual quality of each of the reconstructed blocks is determined based on deep learning for Blind Image Quality Assessment (BIQA, Par.[0008], [0073]).
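As an illustration of the "no-reference" property underlying such blind assessment (scoring a reconstructed block without access to the original), a crude gradient-energy proxy is sketched below. An actual BIQA system, such as the learned models in Andreopoulos, would use a trained deep network; this placeholder is illustrative only.

```python
import numpy as np

def no_reference_quality(block: np.ndarray) -> float:
    """Toy no-reference score: mean absolute gradient energy of the block,
    a crude stand-in for a learned blind image quality model. The score is
    computed from the reconstruction alone, with no reference image."""
    gx = np.diff(block, axis=1)  # horizontal differences
    gy = np.diff(block, axis=0)  # vertical differences
    return float(np.mean(np.abs(gx)) + np.mean(np.abs(gy)))
```

The only point of the sketch is that the function's signature takes a single block, mirroring the claimed determination "without using the block as a reference."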
Re Claim 9. This claim represents the decoding device comprising one or more processors (Ho: unit 100 in Fig.1) and memory (instruction storage memory not shown but considered implicitly present or inherent to any signal processing unit, 100 in Fig.1) or (Andreopoulos: Par.[0074], [0082], Fig.1) storing instructions that, when executed by the one or more processors, cause the decoder to implement each and every processing limitation of the method claim 1, hence it is rejected on the same evidentiary premise, mutatis mutandis.
Re Claim 10. This claim represents the decoding device comprising one or more processors (Ho: unit 100 in Fig.1) and memory (instruction storage memory not shown but considered implicitly present or inherent to any signal processing unit, 100 in Fig.1) or (Andreopoulos: Par.[0074], [0082], Fig.1) storing instructions that, when executed by the one or more processors, cause the decoder to implement each and every processing limitation of the method claim 3, hence it is rejected on the same evidentiary premise, mutatis mutandis.
Re Claim 11. This claim represents the decoding device comprising one or more processors (Ho: unit 100 in Fig.1) and memory (instruction storage memory not shown but considered implicitly present or inherent to any signal processing unit, 100 in Fig.1) or (Andreopoulos: Par.[0074], [0082], Fig.1) storing instructions that, when executed by the one or more processors, cause the decoder to implement each and every processing limitation of the method claim 4, hence it is rejected on the same evidentiary premise, mutatis mutandis.
Re Claim 12. This claim represents the decoding device comprising one or more processors (Ho: unit 100 in Fig.1) and memory (instruction storage memory not shown but considered implicitly present or inherent to any signal processing unit, 100 in Fig.1) or (Andreopoulos: Par.[0074], [0082], Fig.1) storing instructions that, when executed by the one or more processors, cause the decoder to implement each and every processing limitation of the method claim 2, hence it is rejected on the same evidentiary premise, mutatis mutandis.
Re Claim 13. This claim represents the decoding device comprising one or more processors (Ho: unit 100 in Fig.1) and memory (instruction storage memory not shown but considered implicitly present or inherent to any signal processing unit, 100 in Fig.1) or (Andreopoulos: Par.[0074], [0082], Fig.1) storing instructions that, when executed by the one or more processors, cause the decoder to implement each and every processing limitation of the method claim 5, hence it is rejected on the same evidentiary premise, mutatis mutandis.
Re Claim 14. This claim represents the decoding device comprising one or more processors (Ho: unit 100 in Fig.1) and memory (instruction storage memory not shown but considered implicitly present or inherent to any signal processing unit, 100 in Fig.1) or (Andreopoulos: Par.[0074], [0082], Fig.1) storing instructions that, when executed by the one or more processors, cause the decoder to implement each and every processing limitation of the method claim 6, hence it is rejected on the same evidentiary premise, mutatis mutandis.
Re Claim 15. This claim represents the decoding device comprising one or more processors (Ho: unit 100 in Fig.1) and memory (instruction storage memory not shown but considered implicitly present or inherent to any signal processing unit, 100 in Fig.1) or (Andreopoulos: Par.[0074], [0082], Fig.1) storing instructions that, when executed by the one or more processors, cause the decoder to implement each and every processing limitation of the method claim 7, hence it is rejected on the same evidentiary premise, mutatis mutandis.
Re Claim 16. This claim represents the decoding device comprising one or more processors (Ho: unit 100 in Fig.1) and memory (instruction storage memory not shown but considered implicitly present or inherent to any signal processing unit, 100 in Fig.1) or (Andreopoulos: Par.[0074], [0082], Fig.1) storing instructions that, when executed by the one or more processors, cause the decoder to implement each and every processing limitation of the method claim 8, hence it is rejected on the same evidentiary premise, mutatis mutandis.
Re Claim 17. This claim represents the encoding method (Ho: the encoding module 115 of the coding unit 102, generating the bitstream BS, per Fig.1) performing each and every limiting step of the prediction loop of the decoding method of claim 1, hence it is rejected on the same evidentiary premise, mutatis mutandis.
Re Claim 18. This claim represents the encoding method (Ho: the encoding module 115 of the coding unit 102, generating the bitstream BS, per Fig.1) performing each and every limiting step of the prediction loop of the decoding method of claim 6, hence it is rejected on the same evidentiary premise, mutatis mutandis.
Re Claim 19. Ho, Andreopoulos and Agyo disclose, the method of claim 17,
Andreopoulos teaches, wherein the signaling comprises signaling, in the bit stream, an indication that the first prediction mode is to be determined at a decoder based on the first prediction mode being the same as the second prediction mode (both reference and non-reference modes are intra-prediction modes, Par.[0073]).
Re Claim 20. Ho, Andreopoulos and Agyo disclose, the method of claim 17,
Agyo teaches, wherein the signaling comprises signaling, in the bit stream, a syntax element indicating the first prediction mode based on the first prediction mode not being the same as the second prediction mode (intra prediction based on different signaled indices, Par.[0068]).
Conclusion
6. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure, as listed below.
US 2014/0254680; US 2022/0345715; US 2022/0377324.
See PTO-892 form. Applicant is required under 37 C.F.R. 1.111(c) to consider these references when responding to this action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DRAMOS KALAPODAS whose telephone number is (571)272-4622. The examiner can normally be reached on Monday-Friday 8am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, David Czekaj can be reached on 571-272-7327. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DRAMOS KALAPODAS/Primary Examiner, Art Unit 2487