DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Status
Claims 1 and 3-23 are pending. Claim 2 has been cancelled. No claim has been added. Claims 1, 10, and 22-23 have been amended.
Specification
The objection to the title is withdrawn in view of Applicant’s amendment to the title.
Compact Prosecution
With respect to Claim Interpretation, the Examiner has provided notes regarding “[BRI on the record]” throughout the Office Action so that the record is clear about the scope of the claimed invention and about the basis for the Examiner’s analyses. A clear record of the claim interpretation can expedite examination by allowing it to focus on Applicant’s inventive concept and its comparison with the relevant prior art.
If there are disagreements, Applicant may present an alternative interpretation based on MPEP 2111. The Examiner will adopt Applicant’s interpretation on the record, if Applicant’s interpretation is reasonable and/or arguments are persuasive.
Applicant may amend claims relying on the Examiner’s claim interpretation provided on the record.
Claim Objections
The objections to Claims 1 and 22-23 due to minor informalities are withdrawn in view of Applicant’s amendment to Claims 1 and 22-23.
The objection to Claim 10 due to minor informalities is withdrawn in view of Applicant’s amendment to Claim 10.
Response to Arguments
Applicant states:
[image: media_image1.png (quoted passage from Applicant’s Remarks)]
Remarks, pp. 9-10.
The Examiner disagrees because Applicant’s argument appears to rest on a misconception of the combination.
Applicant states, “then high-quality but only weakly relevant image would be promoted – the ones that best match the query – could be excluded or pushed down.” However, the combination with Khem et al. (US 9456148 B1) does not require excluding relevance from consideration.
(a) The plurality of first image data could be selected based on both relevance and quality, e.g., using thresholds or another combined selection/ranking criterion.
(b) It is also possible that a user could be allowed to select the selection/ranking criterion used as the claimed first criterion. The outcome would then reflect the user’s preference, and there is no need to second-guess a user’s expressed preference.
(c) Finally, there is no requirement that after the combination of the Examiner’s references, the combined teaching must produce an optimal outcome.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3-9, 11-18, 20, and 22-23 are rejected under 35 U.S.C. 103 as being unpatentable over Sayko et al. (WO 2014184785 A2) in view of Nakabayashi et al. (US 20110052069 A1) and Khem et al. (US 9456148 B1).
Regarding Claim 1, Sayko teaches A display control apparatus (Sayko fig. 1
[image: media_image2.png]
) comprising: a processor (Sayko fig. 1; “Figure 1 is a schematic illustration of a computer system being an implementation of the present technology.” Sayko ¶ 75. A computer has a processor.); and a memory (Sayko ¶ 69),
wherein the processor is configured to acquire a plurality of image data (Sayko fig. 14
[image: media_image3.png]
Sayko teaches acquiring images from queries/searches, stating “Method for providing image information to a client device comprising: Receiving a search query from the client device. Effecting a search in respect of the search query. Sending to the client device search results including a group of images and another image, the group of images being selected from a plurality of groups of images stored on a server prior to having received the search query.” Sayko Abstract.),
perform control of displaying, in a first region of a display (Sayko fig. 14:
[image: media_image4.png]
), a first image data group (image 170 or a visual representation 872, e.g.,
[image: media_image5.png]
) including a plurality of first image data (the group has “24” images in the example) selected, based on a first criterion (relevance criterion, stating, “The more relevant an individual image 170 and or a visual representation 872 is, the closer to the top of the column 856 it is.” Sayko ¶ 135.), from the plurality of acquired image data (images acquired after queries/searches, stating “Sending to the client device search results including a group of images and another image.” Sayko Abstract.) (
[BRI on the record]
With respect to “first image data group,” the Examiner is reading the limitation to mean a representative image that represents a first group of images. The interpretation is based on the specification. The claimed feature is supported by fig. 7 and Spec. ¶ 106.
Fig. 7
[image: media_image6.png]
“The processor 43 may display, in the first region 44A, the representative image data included in the representative image data group 50B in an order based on the first criterion. Specifically, the processor 43 displays, in the first region 44A, the representative image data included in the representative image data group 50B in descending order of the score.” Spec. ¶ 106.
Here, 44A is the first region; the “first image data group” corresponds to “g1” (group 1), “g2” (group 2), “g3” (group 3), or “g4” (group 4), each of which could be represented by a representative image.
With respect to “perform control of displaying . . . a first image data group including a plurality of first image data selected from the plurality of acquired image data based on a first criterion,” clarification of the limitation is required, and the limitation has been objected to. For the purposes of art rejection, the Examiner is reading the limitation to mean: “perform control of displaying . . . a first image data group based on a first criterion, wherein the first image data group includes a plurality of first image data selected from the plurality of acquired image data.”
[Mapping Analysis]
“In the SERP 850, the second visual representation 780 is expanded further as a third visual representation 880 and the individual images 170 and the visual representation 172 are reorganized to be displayed in a single column 856 to the left of the third visual representation 880. . . . To permit the display in the column 856, the three thumbnails that are displayed horizontally next to each other in the visual representations 172 are rearranged to be vertically next to each other in visual representations 872.” Sayko ¶ 135.
“When the user 14 selects an image 170 or a visual representation 872, the third visual representation 880 is modified accordingly.” Sayko ¶ 136. “. . . following the search query, the most relevant image is displayed as the large image 882, and if this image belongs to a group of images, the thumbnails 884 and the elements of the third visual representation 880 associated with groups of images are also displayed.” Sayko ¶ 136. Here, when image 170 or a visual representation 872 (first image data group) is selected, 882 (selected image data) is displayed, as are thumbnail images (884) of the selected group (second image data group).
Sayko provides an explanation about
[image: media_image5.png]
, stating “For this reason, the visual representation 172 also has a box 178 containing a number corresponding to the number of images present in the group of images.” Sayko ¶ 120.),
perform control of displaying, in a second region of the display (Sayko fig. 14:
[image: media_image7.png]
), selected image data (including Sayko fig. 14 882) that is at least one image data selected from the first image data group (
“When the user 14 selects an image 170 or a visual representation 872, the third visual representation 880 is modified accordingly.” Sayko ¶ 136. Fig. 14 882 is an image placed within 880 to be modified.),
and perform control of displaying, in a third region of the display (Sayko fig. 14:
[image: media_image8.png]
), a second image data group (thumbnail images (884) of a selected group currently displayed) including second image data selected from the plurality of image data (results of image query/search (see Abstract)) based on a second criterion (a criterion of belonging to the same image group) based on specific image data that is at least one of the selected image data (Sayko fig. 14 882) (Sayko teaches a second criterion of belonging to the same image group, stating “The user 14 can also view as the larger image 882 another image of the group of images by selecting one of the thumbnails 884. The thumbnail 884 corresponding to the larger image 882 has a colored border 854 around it.” Sayko ¶ 135. Similarly, “In such an embodiment, following the search query, the most relevant image is displayed as the large image 882, and if this image belongs to a group of images, the thumbnails 884 and the elements of the third visual representation 880 associated with groups of images are also displayed.” Sayko ¶ 136.)
With respect to “first image data group,” the Examiner is reading the limitation to mean a representative image that represents a first group of images. The interpretation is based on the specification (fig. 7 and Spec. ¶ 106).
However, if “first image data group” is interpreted as a first image group, Sayko does not explicitly teach it.
Nakabayashi teaches a first image data group as a first image group of stacked images (Nakabayashi fig. 10B
[image: media_image9.png]
, which shows a group of images that works with Sayko’s interface. After the combination of references, the resulting interface would be similar to the following:
[image: media_image10.png]
).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Nakabayashi’s image group representation with Sayko. One of ordinary skill in the art would be motivated to allow a user of a graphical user interface to intuitively understand that the icon represents an image group.
Sayko in view of Nakabayashi does not explicitly disclose wherein the first criterion is a criterion relating to an image quality.
Khem teaches wherein the first criterion is a criterion relating to an image quality (
“. . . the images will be shown and/or selected for display in the order they were captured, while in other embodiments the images can be sorted based on image quality parameters or other such factors.” Khem col. 7 lines 16-22.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Khem’s displaying based on image quality with Sayko in view of Nakabayashi. One of ordinary skill in the art would be motivated to show better quality images to a viewer. The viewer may find such images more visually pleasing or useful.
Claims 22 and 23 are substantially similar to Claim 1. The rejection analyses for Claim 1 are also applied to Claims 22 and 23. In addition, Claim 22 recites, “A display control method . . .” (Sayko fig. 4), and Claim 23 recites, “A non-transitory computer readable medium storing a display control program causing a processor to execute steps . . .” (Sayko fig. 1; “Figure 1 is a schematic illustration of a computer system being an implementation of the present technology.” Sayko ¶¶ 69, 75.).
Regarding Claim 3, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 1,
wherein the second criterion is a criterion relating to a degree of similarity with the specific image data (
The mapped criterion is a sameness criterion: “The user 14 can also view as the larger image 882 another image of the group of images by selecting one of the thumbnails 884. The thumbnail 884 corresponding to the larger image 882 has a colored border 854 around it.” Sayko ¶ 135.
This mapping is consistent with the specification, which states “For example, the second criterion may be set based on the degree of similarity (sameness) of the subject in addition to the degree of similarity of the composition or the degree of similarity of the image quality described above.” Spec. ¶ 50.
Alternatively, Nakabayashi teaches similarity searches, stating, “… the user interface unit displays the search result in the order of similarities to the key images.” Nakabayashi Abstract. “However, if a full and a side face key image is used in search and a side face image having high similarity to the side face key image is included in the search result, the side face image are displayed with a high rank. If a full face image included in the search result has a high similarity to the full face key image, the full face image is also displayed with a high rank.” Nakabayashi ¶ 180.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Nakabayashi’s similarity search with Sayko. One of ordinary skill in the art would be motivated to save time to find relevant/needed images. “Moreover, a great deal of time is required to search a necessary image from a large amount of recorded images.” Nakabayashi ¶ 6.
Regarding Claim 4, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 1, wherein the processor is configured to
acquire the plurality of image data captured by a plurality of imaging apparatuses including a first imaging apparatus and a second imaging apparatus (Nakabayashi fig. 4:
[image: media_image11.png]
, showing multiple imaging apparatuses based on the camera IDs selected.
Further, Nakabayashi explains, “The image search execution unit 230 is a part that uses the image search data storage unit 220 based on input search condition, and executes the search process of the image recording apparatus 202 that records images picked up by the image pick-up apparatuses 201-1 to 201-n such as a surveillance camera.” Nakabayashi ¶ 49.
Nakabayashi fig. 1:
[image: media_image12.png]
), and
the second criterion is a criterion for determining whether or not image data is captured by the second imaging apparatus different from the first imaging apparatus, that captures the specific image data, in a predetermined period based on an imaging timing of the specific image data (
After the combination of Sayko and Nakabayashi, we could have the following:
[image: media_image13.png]
Sayko states, “The user 14 can also view as the larger image 882 another image of the group of images by selecting one of the thumbnails 884. The thumbnail 884 corresponding to the larger image 882 has a colored border 854 around it.” Sayko ¶ 135.
If 884 are images captured by the second imaging apparatus, 886 will be captured by the second imaging apparatus, different from the first imaging apparatus.
In addition, Nakabayashi fig. 4:
[image: media_image11.png]
discloses designating the predetermined period to search images. “The time D103 is data which expresses the time when the image frame is picked up or recorded in Greenwich Mean Time (GMT) or in terms of frame number.” Nakabayashi ¶ 78. “The search range input area 410 is a display region for inputting the search time range and the camera ID which identifies the image pick-up apparatuses 201-1 to 201-n executing the image search.” Nakabayashi ¶ 88.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Nakabayashi’s surveillance image searching with Sayko. One of ordinary skill in the art would be motivated to quickly find relevant/needed images, e.g., based on a time window and/or camera perspective. “Moreover, a great deal of time is required to search a necessary image from a large amount of recorded images.” Nakabayashi ¶ 6.
Regarding Claim 5, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 1,
wherein the second criterion is a criterion for determining whether or not image data shows a same subject as a subject included in the specific image data (
Nakabayashi states, “… the user interface unit displays the search result in the order of similarities to the key images.” Nakabayashi Abstract. “However, if a full and a side face key image is used in search and a side face image having high similarity to the side face key image is included in the search result, the side face image are displayed with a high rank. If a full face image included in the search result has a high similarity to the full face key image, the full face image is also displayed with a high rank.” Nakabayashi ¶ 180. “First, for the person detection, the image feature extraction unit 210 determines whether or not a face exists in the image by using a known face detection technique and, upon detecting the presence of the face, calculates a coordinate of its region.” Nakabayashi ¶ 43.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Nakabayashi’s face detection with Sayko. One of ordinary skill in the art would be motivated to quickly find surveillance images related to a person. “Further, for detection of the object feature by using the image recognition technique, the image feature extraction unit 210 detects, for example, a clothes feature. For detection of the clothes feature, the image feature extraction unit 210 detects and use, e.g., the color distribution or frequency characteristics of clothes stated above as the clothes feature, with respect to the clothes region.” Nakabayashi ¶ 45.
Regarding Claim 6, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 1,
wherein the processor is configured to acquire the plurality of image data captured by a plurality of imaging apparatuses including a first imaging apparatus and a second imaging apparatus (Nakabayashi fig. 4:
[image: media_image11.png]
, showing multiple imaging apparatuses based on camera IDs.
Nakabayashi explains, “The image search execution unit 230 is a part that uses the image search data storage unit 220 based on input search condition, and executes the search process of the image recording apparatus 202 that records images picked up by the image pick-up apparatuses 201-1 to 201-n such as a surveillance camera.” Nakabayashi ¶ 49; see fig. 1.), and
the second criterion is a criterion for determining whether or not image data is captured by the second imaging apparatus different from the first imaging apparatus that captures the specific image data (
After the combination of Sayko and Nakabayashi, we could have the following:
[image: media_image13.png]
Sayko states, “The user 14 can also view as the larger image 882 another image of the group of images by selecting one of the thumbnails 884. The thumbnail 884 corresponding to the larger image 882 has a colored border 854 around it.” Sayko ¶ 135.
If 884 are images captured by the second imaging apparatus, 886 will be captured by the second imaging apparatus, different from the first imaging apparatus.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Nakabayashi’s surveillance image searching with Sayko. One of ordinary skill in the art would be motivated to quickly find relevant/needed images. “Moreover, a great deal of time is required to search a necessary image from a large amount of recorded images.” Nakabayashi ¶ 6.
Regarding Claim 7, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 1,
wherein the processor is configured to receive an operation on the specific image data to change the second criterion based on the received operation (
Nakabayashi teaches similarity criterion based on addition or removal of key image(s) to the specific image data, stating “wherein the user interface unit allows multiple key images to be inputted as the single search condition, the image search execution unit executes the search by using key images included in the single search condition, and the user interface unit displays the search result in the order of similarities to the key images.” Nakabayashi Abstract.
Nakabayashi teaches modifying similarity criterion threshold based on the specific image data:
[image: media_image14.png]
).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Nakabayashi’s surveillance image searching with Sayko. One of ordinary skill in the art would be motivated to quickly find relevant/needed images by modifying search criterion. “Moreover, a great deal of time is required to search a necessary image from a large amount of recorded images.” Nakabayashi ¶ 6.
Regarding Claim 8, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 1, wherein the processor is configured to receive a request for addition of information for the image data displayed in any one of the first region, the second region, or the third region, and change the first criterion or the second criterion based on the request for addition (
Nakabayashi teaches similarity criterion, second criterion, based on addition of key image(s) to the specific image data in a region, stating “wherein the user interface unit allows multiple key images to be inputted as the single search condition, the image search execution unit executes the search by using key images included in the single search condition, and the user interface unit displays the search result in the order of similarities to the key images.” Nakabayashi Abstract.
Nakabayashi teaches modifying similarity criterion threshold, second criterion, by requesting for addition of configuration information for image data:
[image: media_image14.png]
).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Nakabayashi’s surveillance image searching with Sayko. One of ordinary skill in the art would be motivated to quickly find relevant/needed images by modifying search criterion. “Moreover, a great deal of time is required to search a necessary image from a large amount of recorded images.” Nakabayashi ¶ 6.
Regarding Claim 9, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 1, wherein at least one of the first criterion or the second criterion is changeable (
Sayko teaches changing first criterion, stating “It is also contemplated that the thumbnails could be arranged in an order other than by rank. For example, the thumbnails 174 could be arranged chronologically based on a time and date of creation of the images corresponding to the thumbnails 174 in order to give at a glance an idea of a chronology of events shown on the thumbnails 174. The thumbnails 174 could also be arranged in some other logical sequence in the visual representation 172.” Sayko ¶ 120.
Nakabayashi teaches similarity criterion, second criterion, based on addition of key image(s) to the specific image data in a region, stating “wherein the user interface unit allows multiple key images to be inputted as the single search condition, the image search execution unit executes the search by using key images included in the single search condition, and the user interface unit displays the search result in the order of similarities to the key images.” Nakabayashi Abstract.
Nakabayashi teaches modifying similarity criterion threshold, second criterion:
[image: media_image14.png]
).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Nakabayashi’s surveillance image searching with Sayko. One of ordinary skill in the art would be motivated to quickly find relevant/needed images by modifying search criterion. “Moreover, a great deal of time is required to search a necessary image from a large amount of recorded images.” Nakabayashi ¶ 6.
Regarding Claim 11, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 1,
wherein the processor is configured to continuously acquire new image data to update at least one of the first image data group or the second image data group (
Nakabayashi figs 7A-C:
[image: media_image15.png]
Nakabayashi teaches adding a label “newly added” to indicate an image element has been added as shown in figs. 7A-C, stating “When the user selects an image group desired to be displayed and pushes a `group display` button 550 on the key image display area 442 by using the pointing device or the like, the user interface unit 240 switches the display area to the key image display area 443 of FIG. 7C and displays the selected image group.” Nakabayashi ¶ 124.
The addition or editing of the group could be done continuously, as the user wishes.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Nakabayashi’s surveillance image searching with Sayko. One of ordinary skill in the art would be motivated to provide updated/new data for a user, so that the user would have better information.
Regarding Claim 12, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 11, wherein the processor is configured to perform, in a case where the new image data is included in the first image data group or the second image data group, control of displaying the new image data in a mode distinguishable from other image data (
[image: media_image16.png]
, showing “newly added” for new image data, which is distinguishable from other image data.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Nakabayashi’s surveillance image searching with Sayko. One of ordinary skill in the art would be motivated to provide updated/new data for a user and make it easier for the user to find them, so that the user would have better information.
Regarding Claim 13, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 1, wherein the processor is configured to
display, in the first region, the image data included in the first image data group in an order based on the first criterion (Sayko states, “The more relevant an individual image 170 and or a visual representation 872 is, the closer to the top of the column 856 it is.” Sayko ¶ 135.).
Regarding Claim 14, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 1, wherein the processor is configured to further perform control of
displaying, on the display, an acquisition image for acquiring the specific image data (
[image: media_image3.png]
The acquisition image is mapped to the search bars shown in Sayko fig. 14.
“In such an embodiment, following the search query, the most relevant image is displayed as the large image 882, and if this image belongs to a group of images, the thumbnails 884 and the elements of the third visual representation 880 associated with groups of images are also displayed.” Sayko ¶ 136.).
Regarding Claim 15, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 1,
wherein the first region extends along a first direction (Sayko fig. 14:
[image: media_image4.png]
), and the third region (Sayko fig. 14:
[image: media_image8.png]
) extends along a second direction intersecting the first direction (
[image: media_image3.png]
, which shows that the first and second directions are perpendicular to each other and therefore intersect. Note that the claim requires the directions, not the regions, to intersect.).
Regarding Claim 16, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 15, wherein a display size of the selected image data displayed in the second region (Sayko fig. 14:
[image: media_image7.png]
) is larger than a display size of each image data displayed in the first region (Sayko fig. 14:
[image: media_image4.png]
) and a display size of each image data displayed in the third region (Sayko fig. 14:
[image: media_image8.png]
).
Regarding Claim 17, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 15, wherein one side of the selected image data (Sayko fig. 14 882) displayed in the second region matches the second direction (horizontal/longitudinal direction as shown for Sayko fig. 14 884).
Regarding Claim 18, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 15, wherein a longitudinal direction of a display region of the display (Sayko fig. 14) matches the second direction (horizontal/longitudinal direction as shown for Sayko fig. 14 884).
Regarding Claim 20, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 1,
wherein the processor is configured to further perform control of displaying, in the second region, image data selected from the second image data group (“When the user 14 selects an image 170 or a visual representation 872, the third visual representation 880 is modified accordingly.” Sayko ¶ 136. “. . . following the search query, the most relevant image is displayed as the large image 882, and if this image belongs to a group of images, the thumbnails 884 and the elements of the third visual representation 880 associated with groups of images are also displayed.” Sayko ¶ 136.).
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Sayko in view of Nakabayashi and Khem as applied to Claim 9, in further view of Moulton et al. (US 20200184429 A1).
Regarding Claim 10, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 9.
Sayko in view of Nakabayashi and Khem does not explicitly disclose wherein the processor is configured to change at least one of the first criterion or the second criterion based on a number of acquired images included in the acquired image data (
[BRI on the record] With respect to “based on number of the acquired image data,” the meaning of the limitation requires clarification, and the claim has been objected to. For the purposes of art rejection, the Examiner is reading the limitation to mean “a number of acquired images.”).
Moulton teaches wherein the processor is configured to change at least one of the first criterion or the second criterion based on a number of acquired images included in the acquired image data (
Moulton teaches controlling the number of images by adjusting a threshold when there are too many images, stating “For example, if many check images (e.g., a number of check images that exceeds a change threshold) are being flagged for manual review that are ultimately processed from an account corresponding to the selected profile, the predetermined correlation threshold may be reduced . . . .” Moulton ¶ 59.
For Claim 9, this claim’s parent, the Examiner has provided the following analysis: Nakabayashi teaches modifying the similarity criterion threshold, the second criterion:
[media_image14.png: Nakabayashi excerpt reproduced in the record, greyscale]
After Sayko in view of Nakabayashi and Khem is combined with Moulton, when there are too many images to review, the threshold as shown in Nakabayashi fig. 9 could be adjusted.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Moulton’s controlling the number of images to be reviewed with Sayko in view of Nakabayashi and Khem. One of ordinary skill in the art would be motivated to control the number of images to view. When there are too many images to review, it could be overwhelming. When there are no images that satisfy a requirement, the requirement may be adjusted to review the images.
Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Sayko in view of Nakabayashi and Khem as applied to Claim 1, in further view of Maeda (US 20130268893 A1).
Regarding Claim 19, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 1.
Sayko in view of Nakabayashi and Khem does not explicitly disclose wherein the processor is configured to change the image data displayed in the second region to a display form distinguishable from image data that is not displayed in the second region.
Maeda teaches wherein the processor is configured to change the image data displayed in the second region to a display form distinguishable from image data that is not displayed in the second region (
[BRI on the record] With respect to “display form,” the Examiner is reading the limitation to mean display appearance. The specification states:
[0101] It is preferable that the processor 43 changes the image data displayed in the second region 44B to a display form distinguishable from the image data that is not displayed in the second region 44B. For example, the image data displayed in the second region 44B need only be grayed out and displayed, or a viewed mark need only be superimposed and displayed. Accordingly, the user can easily recognize the checked image data and the unchecked image data, and the efficiency of the work of selecting the image data can be enhanced.
Spec. ¶ 101.
[Mapping Analysis]
Maeda teaches changing display appearance after a region is selected, stating “In the first to fourth embodiments, a selected area is highlighted, and images in an area except for the selected area are displayed at low brightness.” Maeda ¶ 83.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Maeda’s highlighting of a selected region with Sayko in view of Nakabayashi and Khem. One of ordinary skill in the art would be motivated to bring attention to a selected region and/or to better view an image of the user’s interest/attention.
Claim 21 is rejected under 35 U.S.C. 103 as being unpatentable over Sayko in view of Nakabayashi and Khem as applied to Claim 1, in further view of Niga (CN 110475059 A).
Regarding Claim 21, Sayko in view of Nakabayashi and Khem teaches The display control apparatus according to claim 1,
wherein the processor is configured to acquire the plurality of image data from an imaging apparatus (“The present invention relates to an image search apparatus for searching an image by using image features; and, more particularly, to an image search apparatus for searching and displaying an image, which matches search conditions, from images stored in an image pick-up apparatus such as a surveillance camera.” Nakabayashi ¶ 1.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Nakabayashi’s surveillance image searching with Sayko. One of ordinary skill in the art would be motivated to quickly find relevant/needed images. “Moreover, a great deal of time is required to search a necessary image from a large amount of recorded images.” Nakabayashi ¶ 6.
Sayko in view of Nakabayashi and Khem does not explicitly disclose further perform control of changing a setting of the imaging apparatus.
Niga teaches further perform control of changing a setting of the imaging apparatus (“On the other hand, when the user presses the distal adjustment button 406, may be directed to the camera view angle adjusting the focus to the far side. when the user operates any one of buttons 405 and 406, control device 110 through the network 120 for focusing of the setting command transmitted to the surveillance camera 100. when the user presses the ‘application’ button 407, the control apparatus 110 of the display part 222 from the GUI focus setting screen switching to the rocking angle setting screen.” Niga p. 6).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Niga’s method of controlling a camera with Sayko in view of Nakabayashi and Khem. One of ordinary skill in the art would be motivated to acquire better quality images of a scene. See Niga background.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Hampapur et al. (US 20080298693 A1): “As shown in FIG. 4, each image 100A-D includes a pair of check boxes, permitting the user to mark the image as either ‘OK’ or ‘Flag.’ In some cases, a user may be required to somehow annotate the images to confirm that the images were, in fact, reviewed.” ¶ 33. Related to “display form” of Claim 19.
Minyard et al. (US 6891920 B1): “Additionally, images may be marked as reviewed, as may be desired.” Col. 4 lines 29-58. Related to “display form” of Claim 19.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ZHENGXI LIU whose telephone number is (571)270-7509. The examiner can normally be reached M-F 9 AM - 5 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kee Tung can be reached at 571-272-7794. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ZHENGXI LIU/ Primary Examiner, Art Unit 2611