DETAILED ACTION
Response to Arguments
Applicant's arguments filed with respect to claims 1-8 have been fully considered but are moot in view of the new ground(s) of rejection, which are necessitated by the claim amendments.
NOTE: Regarding § 101, the claim integrates the abstract idea (a mathematical concept) into a practical application because the claim is not merely “analyzing data” in the abstract; it applies the calculations to a concrete image-processing flow of determining image similarities.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1, 7, and 8 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by TANG et al. (Pub. No. US 2021/0319236).
Regarding claim 1, TANG teaches an image processing apparatus comprising: at least one memory configured to store one or more instructions; and at least one processor configured to execute the one or more instructions to: perform, on a first image (input image) and on a second image (target image), extraction processing of extracting a feature value (keypoint descriptor) [Para. 38 “for each pixel of the image 302 processed by the keypoint model 300, the keypoint model 300 also outputs a keypoint descriptor 310”; “The descriptor of each keypoint in the input image 302 may be obtained by sampling an appropriate location in a dense descriptor map.”; and “the associated descriptor in a target frame may be obtained by sampling the appropriate location in the target descriptor map based on the warped keypoint position”] and estimation processing of estimating a cluster (regions/semantic label) to which each pixel belongs, for each of the first image and the second image [Para. 46 “the semantically aware keypoint matching model 400 segments each input image 302 (e.g., each frame of a sequence of frames) into multiple regions.”; Para. 47 “the semantically aware keypoint matching model 400 tracks one or more regions of the input image 302 (e.g., source image) to one or more regions of a target image, resulting in a matched region in the target image”; and Para. 41 “a pixel of the input image 302 may be associated with a keypoint descriptor 310 that is augmented with semantic information, such as semantic features and/or a semantic label”]; and compute a similarity degree (Euclidean distance) for the feature value (keypoint descriptor) between pixels (keypoints) of the first image (input image) estimated to belong to a specific cluster (semantic label) and pixels (keypoints) of the second image (target image) estimated to belong to a corresponding same cluster, and thereby compute a similarity degree between the first and second images [Para. 48 “the keypoint matching considers both a similarity of keypoint descriptors 310 associated with keypoints of the input image 302 and the target image 404, as well as a similarity of semantic information associated with keypoints of the input image 302 and the target image 404. In some such implementations, the keypoint matching may be biased toward keypoints associated with the same semantic label”; Para. 49; and Para. 43].
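For clarity of the record, the computation mapped above may be illustrated by the following short sketch (examiner's illustration only; the function and variable names are hypothetical and appear in neither the claims nor TANG). It shows a similarity degree computed as a Euclidean distance between keypoint descriptors, restricted to pixels of the two images estimated to belong to the same cluster:

```python
import numpy as np

def image_similarity(desc1, labels1, desc2, labels2, cluster):
    """Compare keypoint descriptors only between pixels of the two
    images estimated to belong to the same cluster (semantic label)."""
    # descriptors of pixels assigned to the chosen cluster in each image
    d1 = desc1[labels1 == cluster]
    d2 = desc2[labels2 == cluster]
    # pairwise Euclidean distances between the two descriptor sets
    dists = np.linalg.norm(d1[:, None, :] - d2[None, :, :], axis=-1)
    # a smaller mean descriptor distance implies more similar images
    return -float(dists.mean())
```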
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3, 7, and 8 are rejected under 35 U.S.C. 103 as being unpatentable over KHALILIAN-GOURTANI (Pub. No. US 2023/0005243, hereinafter “KHAL”) in view of MAKI et al. (Pub. No. US 2016/0148393).
Regarding claim 1, KHAL teaches an image processing apparatus comprising: at least one memory configured to store one or more instructions [Para. 14 and 42]; and at least one processor configured to execute the one or more instructions [Para. 42] to: perform, on an image, extraction processing of extracting a feature/color/location value [Para. 43, Fig. 9 and related description; it is clear that color information is extracted], and estimation processing of estimating a cluster to which each pixel belongs [Para. 43, 53, Fig. 9 and related description]; and compute a similarity degree (more/most similar) for the feature value between pixels estimated to belong to a same cluster [Para. 53, 102, Fig. 4 steps 414, 416, 420; Fig. 11 and related description], and thereby compute a similarity degree between two images (two non-adjoint pixel clusters) [Para. 102, 107; it is clear that a collection/cluster of pixels is an image].
However, KHAL does not explicitly teach extracting features from a first and a second image, does not specify clusters and pixels of the second image estimated to belong to a corresponding same cluster, and does not compute a similarity between the first and second images.
MAKI teaches extracting feature values (intensity value) [Para. 46]; estimation processing of estimating a cluster (subregions) to which each pixel belongs, for each of the first image (first image patch) and the second image (second image patch) [Para. 49 “the second image patch is segmented into a plurality of subregions. The second image patch is segmented by defining regions according to the intensity of the pixels of the first image patch. On the first image patch each subregion is defined as the set of pixels having intensities within a range of values. The subregions on the second image patch are defined as the sets of pixels of the second image patch which have locations corresponding to pixels within a given subregion on the first image patch”]; and compute a similarity degree (similarity measure) for the feature value (intensity value) between pixels (elements) of the first image (first image patch) estimated to belong to a specific cluster (subregion) and pixels/elements of the second image estimated to belong to a corresponding same cluster, and thereby compute a similarity degree/measure between the first and second images [Para. 21-23, and 26].
It would have been obvious to one of ordinary skill in the art before the effective filing date to modify KHAL to teach the claimed feature as taught by MAKI, because the modification improves how similarity between two image patches is calculated by segmenting pixels into intensity-based subregions and computing a similarity measure that aggregates the per-subregion variance between corresponding pixels' intensity values.
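The intensity-subregion computation attributed to MAKI above may be sketched as follows (examiner's illustration only; names, the binning scheme, and parameter choices are hypothetical and are not drawn from MAKI):

```python
import numpy as np

def patch_similarity(patch1, patch2, n_bins=4):
    """Segment the first patch into intensity-based subregions, reuse the
    same pixel locations as subregions of the second patch, and aggregate
    the per-subregion variance of the second patch's intensities."""
    # subregions of the first patch: sets of pixels whose intensities
    # fall within the same range of values (bin)
    edges = np.linspace(patch1.min(), patch1.max(), n_bins + 1)
    labels = np.clip(np.digitize(patch1, edges) - 1, 0, n_bins - 1)
    total = 0.0
    for k in range(n_bins):
        # the same pixel locations define subregion k in both patches
        mask = labels == k
        if mask.any():
            total += float(patch2[mask].var())
    # a lower aggregated variance indicates more similar patches
    return total
```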
Regarding claim 2, KHAL teaches wherein a plurality of the clusters are classified into a reference cluster (one of the first non-adjoint pixel clusters) and a non-reference cluster (a second of the first non-adjoint pixel clusters), and the processor is further configured to execute the one or more instructions to use a pixel estimated to belong to the reference cluster without using a pixel estimated to belong to the non-reference cluster, and thereby compute a similarity degree for the feature value (more/most similar) [Figs. 4, 9, and 11 and related description. The term “reference” has no patentable weight because it is just the name of the cluster. Unless the claims explicitly define/use the “similarity degree for the feature value”, the reference's teaching of “most/more similar” reads on the limitation].
Regarding claim 3, KHAL teaches wherein a plurality of the clusters are classified into the reference cluster and the non-reference cluster based on a user input (user-indicated/guided/specified) [Abstract, Para. 5, 14, 16, and 37].
Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over KHALILIAN-GOURTANI (Pub. No. US 2023/0005243, hereinafter “KHAL”) in view of MAKI et al. (Pub. No. US 2016/0148393), further in view of PIOTTO et al. (Pub. No. US 2016/0335523).
Regarding claim 4, KHAL in view of MAKI does not explicitly teach the claim limitation. However, PIOTTO teaches wherein the processor is further configured to execute the one or more instructions to decide a pixel being a keypoint, compute a similarity degree for the feature value between the pixels decided as the keypoints, and thereby compute a similarity degree between two images [Para. 5 and 18].
It would have been obvious to one of ordinary skill in the art before the effective filing date to modify KHAL in view of MAKI to teach the claimed feature as taught by PIOTTO, because the modification provides an efficient method for detecting incorrect associations between key points of a first image and key points of a second image.
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over KHALILIAN-GOURTANI (Pub. No. US 2023/0005243, hereinafter “KHAL”) in view of MAKI et al. (Pub. No. US 2016/0148393), further in view of Otake (Pub. No. US 2017/0061626).
Regarding claim 5, KHAL in view of MAKI does not explicitly teach the claim limitation.
However, Otake teaches wherein the processor is further configured to execute the one or more instructions to compute a similarity degree between a processing target image and each of a plurality of reference images associated with position information, and output, as the position information related to the processing target image, the position information associated with the reference image for which the similarity degree to the processing target image is equal to or larger than a threshold value [Para. 67, Figs. 8 and 12 and related description].
It would have been obvious to one of ordinary skill in the art before the effective filing date to modify KHAL in view of MAKI to teach the claimed feature as taught by Otake, because the modification provides a method of performing pattern matching processing with the use of a reference image, as well as an image processing apparatus, a robot apparatus, a program, and a recording medium.
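The thresholded position-output step attributed to Otake above may be illustrated with a minimal sketch (examiner's illustration only; all names are hypothetical, and the scalar “images” merely stand in for whatever similarity computation is actually used):

```python
def positions_of_matches(similarity, target, references, threshold):
    """references: list of (reference_image, position_info) pairs.
    Output the position information of every reference image whose
    similarity degree to the target is equal to or larger than the
    threshold value."""
    return [pos for ref, pos in references if similarity(target, ref) >= threshold]
```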
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over KHALILIAN-GOURTANI (Pub. No. US 2023/0005243, hereinafter “KHAL”) in view of MAKI et al. (Pub. No. US 2016/0148393), further in view of GALLAGHER-GRUBER et al. (Pub. No. US 2021/0090238, hereinafter “GAL”).
Regarding claim 6, KHAL in view of MAKI does not explicitly teach the claim limitation.
However, GAL teaches wherein the processor is further configured to execute the one or more instructions to perform the extraction processing and the estimation processing, based on an estimation model learned based on a loss function being generated based on a loss function concerning inter-pixel correlation and a loss function concerning repeatability of a feature value [Para. 183].
It would have been obvious to one of ordinary skill in the art before the effective filing date to modify KHAL in view of MAKI to teach the claimed feature as taught by GAL, because the modification provides a system that improves air quality.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SOLOMON G BEZUAYEHU whose telephone number is (571)270-7452. The examiner can normally be reached on Monday-Friday 10 AM-8 PM.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Oneal Mistry can be reached on 313-446-4912. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 888-786-0101 (IN USA OR CANADA) or 571-272-4000.
/SOLOMON G BEZUAYEHU/
Primary Examiner, Art Unit 2666