Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Notice to Applicants
This action is in response to the communication filed on 09/01/2023.
Claims 1-20 are currently pending.
Information Disclosure Statement
The information disclosure statement (IDS) filed on 09/01/2023 has been fully considered.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claim 20 is rejected under 35 U.S.C. 101 because the claim is directed to “a computer-readable storage medium,” which can be interpreted as a signal per se rather than a hardware embodiment, whereas a machine claim must be directed to a system, apparatus, or arrangement. Paragraphs [0445-0446] of the specification provide examples of non-transitory embodiments of a computer-readable storage medium, but fail to define the term to exclude transitory signal embodiments. It is advised that the applicant amend the phrase “a computer-readable storage medium” to read “a non-transitory computer-readable storage medium” in order to overcome the rejection under 35 U.S.C. 101.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-20 are rejected under 35 U.S.C. § 102(a)(1) as being anticipated by US 2022/0092827 A1 to DO et al. (hereinafter “DO”).
As per claim 1, DO discloses an encoding method (a computing system and corresponding method for selective image feature encoding/decoding; abstract; fig 1; paragraphs [0063-0067]), comprising: extracting a feature map from an input image (adapted to extract a feature map from an input image by using a first feature extraction unit 130 and second feature extraction unit 140; fig 1; paragraphs [0063-0067]); acquiring feature map information from the feature map (the system is adapted to acquire feature information from the feature map as a residual feature map and a first image bitstream; fig 1; paragraphs [0067-0068]); and generating a bitstream by performing encoding on the feature map information (step 270, the feature-information-encoding unit 170 performs encoding on the preprocessed residual feature information generating a residual feature map which is considered the residual feature map bitstream and is a second bit stream; paragraphs [0093-0097]).
As per claim 2, DO discloses the encoding method of claim 1, wherein the feature map information includes at least one of a basis vector or a transform coefficient, or a combination thereof (feature preprocessing unit 160 outputs a transform coefficient related to a transform basis vector; paragraph [0221]).
As per claim 3, DO discloses the encoding method of claim 1, wherein: a basis vector of the feature map is derived using the feature map, and a transform coefficient is derived by performing a transform on the feature map based on the basis vector (the basis vector is derived by feature information preprocessing unit 160 to output a transform basis vector as a result of a domain transformation performed on information/data indicating a picture; paragraph [0221]).
As per claim 4, DO discloses the encoding method of claim 1, wherein a fixed basis vector of the feature map is derived using the feature map information, and a transform that uses the fixed basis vector is performed on the feature map (transformed residual feature information includes the transform coefficient and the transform basis vector, which is used to perform quantization on the transform basis vector, thereby generating a quantized transform basis vector; paragraphs [0221-0224]).
As per claim 5, DO discloses the encoding method of claim 4, wherein a common transform coefficient is generated by performing the transform that uses the fixed basis vector on the feature map (using adder 1250 and the residual feature map bitstream, which was transformed using the transformation coefficient, to generate the basis vector to use on the bitstream to generate reconstructed feature information using the feature map information of the reconstructed image; the feature maps are adapted to track image components, and the image components are arranged as respective components; for example, at least two of the components may be combined into one component, or one component may be further divided into multiple components; fig 1; paragraphs [0059], [0326-0329]).
As per claim 6, DO discloses the encoding method of claim 4, wherein: a joined feature map is generated by joining multiple reconfigured feature maps, and the joined feature map is used to derive the fixed basis vector (using adder 1250 and the residual feature map bitstream which was transformed using the transformation coefficient to generate the basis vector to use on the bitstream to generate reconstructed feature information using the feature map information of the reconstructed image; fig 12; paragraphs [0326-0329]).
As per claim 7, DO discloses the encoding method of claim 1, wherein at least one of quantization or packing, or a combination thereof is performed on the feature map information (the computing system performs preprocessing steps which include quantization; paragraph [0091]).
As per claim 8, DO discloses the encoding method of claim 7, wherein at least one of the quantization or the packing, or a combination thereof is skipped depending on a type of the feature map information (the various information processing steps may be skipped as desired by the user wherein the quantization step may be skipped; paragraphs [0343-0344], [0380]).
As per claim 9, DO discloses the encoding method of claim 1, wherein the feature map includes multiple feature maps having different resolutions (the image post processing unit 1260 is adapted to include resolution up-sampling unit 1510 and an inverse color format transformation unit 1520 and is adapted to provide a generated up sampled reconstructed image of a different resolution wherein up sampling would cause a higher resolution to occur; paragraphs [0400-0402]).
As per claim 10, DO discloses a computer-readable storage medium for storing a bitstream generated by the encoding method of claim 1 (the computing system further comprises a memory 1730 adapted to store data, information, programs, and instructions related to the image processing method described; fig 17; paragraph [0425]).
As per claim 11, DO discloses a decoding method (a computing system and corresponding method for selective image feature encoding/decoding; abstract; figs 12-13; paragraphs [0074], [0152]), comprising: acquiring feature map information from a bitstream (a residual feature map is generated from an input residual feature map bitstream; figs 12-13; paragraphs [0275], [0285]); and acquiring a feature map from the feature map information (a reconstructed feature map is generated from the bitstream information relating to the residual feature map; figs 12-13; paragraphs [0287], [0310], [0443]).
As per claim 12, DO discloses the decoding method of claim 11, wherein the feature map information includes at least one of a basis vector or a transform coefficient, or a combination thereof (feature preprocessing unit 160 outputs a transform coefficient related to a transform basis vector; paragraph [0221]).
As per claim 13, DO discloses the decoding method of claim 12, wherein the feature map is reconstructed by performing an inverse transform that uses at least one of the basis vector or the transform coefficient, or a combination thereof (the basis vector is derived by feature information preprocessing unit 160 to output a transform basis vector as a result of a domain transformation performed on information/data indicating a picture; paragraph [0221]).
As per claim 14, DO discloses the decoding method of claim 11, wherein the feature map is reconstructed by performing an inverse transform that uses at least one of a fixed basis vector or a fixed transform coefficient of the feature map, or a combination thereof (transformed residual feature information includes the transform coefficient and the transform basis vector, which is used to perform quantization on the transform basis vector, thereby generating a quantized transform basis vector; paragraphs [0221-0224]).
As per claim 15, DO discloses the decoding method of claim 14, wherein the bitstream includes at least one of the fixed basis vector or the fixed transform coefficient, or a combination thereof (using adder 1250 and the residual feature map bitstream, which was transformed using the transformation coefficient, to generate the basis vector to use on the bitstream to generate reconstructed feature information using the feature map information of the reconstructed image; fig 12; paragraphs [0326-0329]).
As per claim 16, DO discloses the decoding method of claim 11, wherein at least one of inverse packing or inverse quantization or a combination thereof is performed on the feature map information (the computing system performs preprocessing steps which include inverse quantization; paragraphs [0091], [0334]).
As per claim 17, DO discloses the decoding method of claim 16, wherein at least one of the inverse quantization or the inverse packing, or a combination thereof is skipped depending on a type of the feature map information (the various information processing steps may be skipped as desired by the user, wherein the quantization step may be skipped; paragraphs [0343-0344], [0380]).
As per claim 18, DO discloses the decoding method of claim 11, wherein the feature map includes multiple feature maps having different resolutions (the image post processing unit 1260 is adapted to include resolution up-sampling unit 1510 and an inverse color format transformation unit 1520 and is adapted to provide a generated up sampled reconstructed image of a different resolution wherein up sampling would cause a higher resolution to occur; paragraphs [0400-0402]).
As per claim 19, DO discloses the decoding method of claim 11, further comprising: deriving a result of a deep-learning network by executing a machine-vision task using the feature map (the computing system is adapted to utilize deep learning processing unit 1270 in order to perform a computer vision task; fig 12; paragraphs [0010], [0274], [0464]).
As per claim 20, DO discloses a computer-readable storage medium for storing a bitstream for image decoding (computing system and corresponding method for selective image feature encoding/decoding; the computing system further comprises a memory 1730 adapted to store data, information, programs, and instructions related to the image processing method described; abstract; figs 12-13 and 17; paragraphs [0063-0067], [0425]), wherein: the bitstream includes feature map information (the system is adapted to acquire feature information from the feature map as a residual feature map and a first image bitstream; fig 1; paragraphs [0067-0068]), and a feature map is acquired from the feature map information (step 270, the feature-information-encoding unit 170 performs encoding on the preprocessed residual feature information, generating a residual feature map which is considered the residual feature map bitstream and is a second bitstream; paragraphs [0093-0097]).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. These references include the following:
US 2006/0291556 A1
US 2021/0279912 A1
US 2022/0108490 A1
WO 2013/044393 A1
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DEVIN JACOB DHOOGE whose telephone number is (571) 270-0999. The examiner can normally be reached 7:30-5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Andrew Bee, can be reached on (571) 270-5183. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Devin Dhooge/
USPTO Patent Examiner
Art Unit 2677
/ANDREW W BEE/Supervisory Patent Examiner, Art Unit 2677