DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 03/29/2024 was filed before the mailing of a first Office action on the merits. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Objections
Claim 9 is objected to because of the following informality: claim 9 recites “in the carousel format,” for which there is a lack of antecedent basis. Appropriate correction is required. For purposes of the rejection below, claim 9 is assumed to depend from claim 8.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-3, 5-7, and 10-18 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Saito et al. (U.S. Patent Application Publication No. 2020/0126407, hereinafter “Saito”).
As per claim 1, Saito teaches a method comprising:
obtaining an image captured by a camera among a plurality of cameras installed at predetermined locations (Fig. 6, ¶ [61], obtaining captured images showing the situation of the intersection); and
displaying on a display unit a map (road map MP1) on which each of a plurality of image objects indicating a location of each of the plurality of cameras is superimposed at a position corresponding to each of the predetermined locations (see Fig. 21, ¶ [150], displaying icons for cameras CM1 to CM5 (see also Fig. 20) superimposed on the intersections on the map),
wherein the image captured by at least one of the plurality of cameras is displayed on the display unit together with the map (still at Fig. 21, and ¶ [150], “The live image screen MOV1 is a display screen of a captured image (that is, the current live image) captured by the camera CM2 specified by the vehicle search server 50 as a camera to be searched next”).
As per claim 2, Saito also teaches receiving a designation of a reference point of the map to be displayed on the display unit of a computer (¶ [149], e.g., designating an arrow icon indicating a passing direction), wherein the displayed map shows an area corresponding to the reference point (Fig. 21, showing the area corresponding to the captured image MOV1 recited above).
As per claim 3, as addressed above, Saito teaches wherein the displayed image is captured by a camera selected from among the plurality of cameras (captured image from camera CM2 among the plurality of cameras CM1-CM5 shown in Figs. 20-21), the selected camera being selected based on a distance from the reference point (¶ [150], “…within a predetermined distance including the intersection of the camera CM2 displayed on the live image screen MOV1”).
As per claim 5, Saito further teaches wherein the designation of the reference point is received in response to an input by a user (¶ [172], referring to Fig. 22).
As per claim 6, Saito further teaches receiving, by a selecting unit, a selection of a camera from among the plurality of cameras (Fig. 9, ¶ [94-95], selecting a camera for tracking a vehicle passing an intersection), wherein in displaying the image captured by the selected camera, the image has a different appearance from other images captured by cameras other than the selected camera (Fig. 8, ¶ [88], because each camera has a different view angle and is at a different location).
As per claim 7, as addressed above with reference to Fig. 21, Saito teaches wherein, in response to the selection of the camera, the map is shown based on a location of the selected camera as the reference point (in the example, camera CM2 is selected and displayed based on the reference point (the arrow icon)).
As per claim 10, Saito inherently teaches receiving a selection of an image object from among the plurality of image objects (as addressed above, selecting a camera icon such as CM2 in the example), wherein at least two images captured by at least two of the plurality of cameras are displayed on the display unit together with the map (¶ [49], at least two image portions from two different cameras are used), and in displaying the at least two images, an image captured by a camera corresponding to the selected image object has a different appearance from other images (also ¶ [49], referring to Fig. 5, two image portions from different directions and viewing angles).
As per claim 11, as addressed in claim 10, Saito also teaches receiving a designation of a position on the map (see claim 2), wherein at least two images captured by at least two of the plurality of cameras are displayed on the display unit together with the map (see claim 10), and in displaying the at least two images, an image captured by a camera located at a location nearest to a location corresponding to the designated position is displayed on the map (see Fig. 18, ¶ [138], i.e., camera CM18 is displayed as the camera at the location nearest to the incident captured by camera CM15).
As per claim 12, Saito further teaches wherein, in displaying the map, a UI object for displaying geographical information superimposed on the map is displayed, and in response to receiving an instruction via the UI object to display the geographical information, the displayed map is switched to the map on which the geographical information is superimposed (¶ [144-145], referring to Fig. 21, responsive to a selection of the road map MP1, the display is switched to the map as shown in Fig. 21).
As per claim 13, as shown in Figs. 16-19, Saito also teaches wherein in response to receiving the instruction to display the geographical information (as addressed above), the selecting unit shows a screen for selection of an image from among a plurality of images that were previously captured by a camera located at a location nearest to the reference point (i.e., screens showing the previous locations nearest to camera CM19, such as those of cameras CM18 and CM15, as shown in Figs. 16-19).
As per claim 14, as addressed in claims 12 and 13 above, Saito also teaches that, in response to receiving a selection of the image that was previously captured by the camera located at the location nearest to the reference point, the selected image is displayed on the display unit together with the map.
Claim 15, which is similar in scope to claim 1 as addressed above, is thus rejected under the same rationale.
Claim 16 is similar in scope to claim 1 as addressed above, with the exception of the processor, memory, and display, which are also taught by Saito as shown in Fig. 6. Claim 16 is thus rejected under the same rationale.
Claim 17 is similar in scope to claim 1 as addressed above, with the exception of communicating with a user terminal having a display device, which is also taught by Saito with reference to Fig. 6 and its disclosure. Claim 17 is thus rejected under the same rationale.
Claim 18, which is similar in scope to claims 16 and 17 as addressed above, is thus rejected under the same rationale.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Saito et al. (U.S. Patent Application Publication No. 2020/0126407, “Saito”) in view of Johnson et al. (U.S. Patent Application Publication No. 2019/0304190, “Johnson”).
As per claim 4, Saito does not expressly teach obtaining location information of the computer, wherein the received designation shows the obtained location information as a location of the reference point.
However, in a similar method of overlaying image objects onto a map as shown in Fig. 9, ¶ [198], Johnson teaches this feature, i.e., obtaining location information of the computer, wherein the received designation shows the obtained location information as a location of the reference point (¶ [180], i.e., by using the current location of device 200 as a reference for displaying additional information as shown in Figs. 6-9).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the method as taught by Johnson into the method as taught by Saito as addressed above, the advantage of which is to provide the user with more information related to the user's current location.
Claims 8 and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Saito et al. (U.S. Patent Application Publication No. 2020/0126407, “Saito”) in view of Higgins et al. (U.S. Patent Application Publication No. 2023/0064675, “Higgins”).
As per claim 8, Saito does not explicitly teach wherein the selecting unit shows, in a carousel format, images captured by the plurality of cameras installed at predetermined locations, and the selection is received via an input operation on the carousel format. However, as addressed above, Saito does teach selecting cameras at predetermined locations (intersections), with the cameras selected in a sequence as shown in Figs. 16-19.
Higgins, in a method very similar to that of Saito (see ¶ [5]), teaches this feature, i.e., selecting and displaying captured images in a carousel format (Fig. 5, ¶ [45], selecting cameras surrounding the target location and displaying the captured images (551-1 to 551-4) according to a predetermined parameter; see further Figs. 6 and 7).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to apply the method as taught by Higgins to the method as taught by Saito as addressed above, the advantage of which is to capture more images to create a 360-degree view at the target location.
As per claim 9, the combined teachings of Saito and Higgins also include wherein, in the carousel format, the images are arranged in an order of distance of the camera from the reference point (see Higgins, ¶ [51], displayed images are sorted based on distance from the user to the target point).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Hau H. Nguyen whose telephone number is: 571-272-7787. The examiner can normally be reached on MON-FRI from 8:30-5:30.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Tammy Goddard, can be reached on (571) 272-7773.
The fax number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
/HAU H NGUYEN/Primary Examiner, Art Unit 2611