DETAILED ACTION
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103(a) which forms the basis for all obviousness rejections set forth in this Office action:
(a) A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102 of this title, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. Patentability shall not be negatived by the manner in which the invention was made.
Claims 1-17 are rejected under 35 U.S.C. 103(a) as being unpatentable over Nakabayashi et al. (US 2011/0052069 A1) in view of Takata et al. (US 2014/0089000 A1).
For claim 1, Nakabayashi et al. teaches an image processing apparatus comprising: a first image group reception device that receives an input of a first image group [image pickup/record device with storage unit of the images, 0034-0036: Nakabayashi]; and an image extraction device operating among a plurality of reference image groups each having information about an extracted part of images [extraction unit to take images and search against a group of key images, 0018: Nakabayashi]. Nakabayashi et al. does not teach an image extraction device that, among a plurality of reference image groups each having information about an extracted part of images, extracts, from the first image group, an image similar to the part of the images that have been extracted in the past from a second image group similar to the first image group.
Takata et al. teaches an image extraction device that, among a plurality of reference image groups each having information about an extracted part of images, extracts, from the first image group, an image similar to the part of the images that have been extracted in the past from a second image group similar to the first image group [image extraction and compare image attributes to stored secondary image case data for similarity from previously stored case data information of secondary image group, 0013: Takata].
Nakabayashi et al. (US 2011/0052069 A1) and Takata et al. (US 2014/0089000 A1) are analogous art because they are from the same field of image similarity searching.
At the time of the invention, it would have been obvious to a person of ordinary skill in the art to modify the image search functionality described by Nakabayashi et al. with the secondary-image-group similarity matching taught by Takata et al.
The motivation for doing so would be for the “condition that images and keywords are associated with each other” [0004: Takata].
Therefore, it would have been obvious to combine Nakabayashi et al. (US 2011/0052069 A1) with Takata et al. (US 2014/0089000 A1) to match similarity between two different image groups.
For claim 2, Nakabayashi et al. and Takata et al. teach:
An image processing apparatus comprising: a first image group input device that receives an input of a first image group [image pickup/record device with storage unit of the images, 0034-0036: Nakabayashi]; and an extraction device that extracts, from the first image group, a plurality of images for which similarity with a plurality of consultation images that have been extracted in the past from a consultation image group is greater than or equal to a threshold value [extraction unit to take images and search against a group of key images, 0018: Nakabayashi; similarity check with threshold for match value with key images, 0092: Nakabayashi; secondary image group to compare for similarity, 0013: Takata; image interpretation report for consultation for similarity matching with frequency appearance calculation, 0174: Takata].
For claim 3, Nakabayashi et al. and Takata et al. teach:
The image processing apparatus according to claim 2, wherein a plurality of the consultation image groups are present, the image processing apparatus further comprises a consultation image group detection device that detects, from the plurality of consultation image groups, the consultation image group for which similarity with the first image group is greater than or equal to a threshold value, and the extraction device extracts, from the first image group, a plurality of images for which similarity with the plurality of consultation images extracted from the consultation image group detected by the consultation image group detection device is greater than or equal to a threshold value [similarity detection and comparison with key image, 0050-0051: Nakabayashi; similarity check with threshold for match value with key images, 0092: Nakabayashi].
For claim 4, Nakabayashi et al. and Takata et al. teach:
The image processing apparatus according to claim 2, wherein each consultation image of the plurality of consultation images extracted from the consultation image group is pasted in an image pasting region of a template, and the image processing apparatus further comprises an image pasting device that pastes, in the image pasting region of the template, an image for which similarity with the consultation image pasted in the image pasting region of the template is greater than or equal to a threshold value among the plurality of images extracted in the extraction device [reproduction of image in an area, 0055-0056; similarity detection and comparison with key image, 0050-0051: Nakabayashi].
For claim 5, Nakabayashi et al. and Takata et al. teach:
The image processing apparatus according to claim 4, wherein the image pasting device pastes, in an image pasting region corresponding to the image pasting region in which the consultation image is pasted, the image for which the similarity with the consultation image pasted in the image pasting region of the template is greater than or equal to the threshold value among the plurality of images extracted in the extraction device [calculation of similarity, 0050; reproduction of image in an area, 0055-0056: Nakabayashi].
For claim 6, Nakabayashi et al. and Takata et al. teach:
The image processing apparatus according to claim 4, wherein the image pasting device pastes, in an image pasting region corresponding to the image pasting region in which the consultation image is pasted, an image for which the similarity with the consultation image pasted in the image pasting region of the template is greater than or equal to the threshold value and that is captured at a timing at which an image corresponding to the consultation image is expected to be captured [calculation of similarity with read image, 0050; reproduction of image in an area, 0055-0056: Nakabayashi].
For claim 7, Nakabayashi et al. and Takata et al. teach:
The image processing apparatus according to claim 4, further comprising: a template designation device that designates one template from a plurality of templates, wherein each consultation image of the plurality of consultation images extracted from the consultation image group is pasted in the image pasting region of the template designated by the template designation device [user setup of settings for similarity key image comparison, 0092: Nakabayashi].
For claim 8, Nakabayashi et al. and Takata et al. teach:
The image processing apparatus according to claim 2, wherein an order of the plurality of consultation images is determined, and the image processing apparatus further comprises a first similarity adjustment device that increases the similarity with the consultation image for an image captured in an order corresponding to the order of each consultation image of the plurality of consultation images or an image captured at a timing at which an image corresponding to the consultation image is expected to be captured among images included in the first image group [ordering of images with determined magnitude of similarity, 0123: Nakabayashi].
For claim 9, Nakabayashi et al. and Takata et al. teach:
The image processing apparatus according to claim 2, further comprising: a second similarity adjustment device that increases the similarity with the consultation image for an image having information similar to information about a face included in the consultation image among images included in the first image group [search of key image based on face, 0064: Nakabayashi].
For claim 10, Nakabayashi et al. and Takata et al. teach:
The image processing apparatus according to claim 2, further comprising: a third similarity adjustment device that increases the similarity with the consultation image for an image including a person for which the number of appearances in the image is greater than or equal to a threshold value among images included in the first image group [person search rank increasing based on person’s level of appearance on image, 0204: Nakabayashi].
For claim 11, Nakabayashi et al. and Takata et al. teach:
The image processing apparatus according to claim 2, further comprising: a person designation device that designates a desired person among persons appearing in images included in the first image group; and a fourth similarity adjustment device that increases the similarity with the consultation image for an image including the person designated by the person designation device [a target person designation for similarity, 0205: Nakabayashi].
For claim 12, Nakabayashi et al. and Takata et al. teach:
The image processing apparatus according to claim 11, further comprising: a face image detection device that detects a face image from the images included in the first image group; and a face image display control device that controls a display device to display the face image detected by the face image detection device on a display screen, wherein the person designation device designates the desired person by designating the face image displayed on the display screen [person detection from image through face detection, 0042-0043: Nakabayashi].
For claim 13, Nakabayashi et al. and Takata et al. teach:
The image processing apparatus according to claim 2, further comprising: an image product creation device that creates an image product using the images extracted by the extraction device [image reproduction for display, 0048 and 0055: Nakabayashi].
Claim 14 is a method of the apparatus taught by claim 1. Nakabayashi et al. and Takata et al. teach the limitations of claim 1 for the reasons stated above.
Claim 15 is a method of the apparatus taught by claim 2. Nakabayashi et al. and Takata et al. teach the limitations of claim 2 for the reasons stated above.
Claim 16 is a medium of the apparatus taught by claim 1. Nakabayashi et al. and Takata et al. teach the limitations of claim 1 for the reasons stated above.
Claim 17 is a medium of the apparatus taught by claim 2. Nakabayashi et al. and Takata et al. teach the limitations of claim 2 for the reasons stated above.
Response to Arguments
Applicant's arguments and amendments filed October 14, 2025 have been fully considered, and a new reference has been applied to address the newly added limitations. The rejection is set forth in detail above in the 35 U.S.C. 103 rejection.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AJITH M JACOB whose telephone number is (571)270-1763. The examiner can normally be reached on Monday-Friday: Flexible Hours.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Apu Mofiz can be reached on 571-272-4080. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
12/27/2025
/AJITH JACOB/Primary Examiner, Art Unit 2161