DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application is being examined under the pre-AIA first to invent provisions.
Information Disclosure Statement
2. The information disclosure statement (IDS) submitted on 10/07/2024 is in compliance with the provisions of 37 CFR 1.97 and was considered by the examiner.
Claim Objections
3. Claims 1-21 are objected to because of the following informalities: The term "AR" is an acronym, which can mean different things and/or change in meaning over time; hence, it would be desirable to write out the actual words to which the acronym refers. Appropriate correction is required.
Double Patenting
4. The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
5. Claims 1, 4 and 18 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 2 and 9, respectively, of Inoue U.S. Patent No. 12,125,276.
Although the claims at issue are not identical, they are not patentably distinct from each other because both claim substantially the same features.
Note the following similarities between the application claims and patent claims.
Instant Application No. 18/891,166, Claim 1:
A portable information apparatus of communicating with an information retrieval server via a communication network,
the portable information apparatus comprising:
an imager via which an object is imaged;
a display on which imaged data of the object is displayed;
a locator via which position information about an imaging position of the object is acquired; and a circuitry comprising:
a controller that transmits the position information to the information retrieval server via the communication network such that the position information is used in the information retrieval server to search related information on the object, the controller receives the related information from the information retrieval server via the communication network, and
the controller displays the related information on the object on the display, wherein the related information on the object includes AR information, and
the controller has a mode where a composition image is displayed on the display in which at least a part of the AR information is overlapped with the imaged data of the object.
Inoue U.S. Patent No. 12,125,276, Claim 1:
A portable information apparatus for communicating with an information retrieval server via a communication network,
the portable information apparatus comprising:
an imager via which an object is imaged;
a display on which imaged data of the object is displayed;
a locator via which imaging position information of the object is acquired; and
circuitry comprising: a controller that acquires information on the object from the information retrieval server based on the imaged data of the object and the imaging position information and
that displays, on the display, the imaged data of the object and the information on the object; and a hold controller that outputs a control signal to the controller
to hold the imaged data of the object and the information on the object, wherein the hold controller outputs the control signal in response to at least one of: (i) acceleration of the portable information apparatus exceeding a predetermined threshold value; (ii) change in an image pattern in the displayed imaged data exceeding a predetermined threshold value; and (iii) change in elevation of the portable information apparatus exceeding a predetermined threshold value.
Instant Application No. 18/891,166, Claim 4:
The portable information apparatus according to claim 1, wherein the controller changes between a display mode in which the related information on the object is displayed on the display and a non-display mode.
Inoue U.S. Patent No. 12,125,276, Claim 2:
The portable information apparatus according to claim 1, wherein the controller controls the display to change between a display mode and a non-display mode of the imaged data of the object and the information on the object.
Instant Application No. 18/891,166, Claim 18:
A non-transitory computer-readable recording medium on which is recorded a program that causes a computer to perform:
acquire position information about an imaging position of an object that is imaged by an imaging apparatus of a portable information apparatus;
transmit the position information to an information retrieval server via a communication network such that the position information is used in the information retrieval server to search related information on the object;
receive the related information from the information retrieval server via the communication network; and
display the related information on the object on a display such that a composition image is displayed on the display in which at least a part of AR information is overlapped with the imaged data of the object, the related information on the object including the AR information.
Inoue U.S. Patent No. 12,125,276, Claim 9:
A non-transitory computer-readable recording medium on which is recorded a program that causes a computer to perform:
acquiring imaging position information of an object that is imaged by an imaging apparatus of a portable information apparatus;
acquiring information on the object from an information retrieval server via a communication network based on imaged data of the object and the imaging position information;
displaying, on a display, the imaged data of the object and the information on the object; and holding the imaged data of the object and the information on the object in response to at least one of: (i) acceleration of the portable information apparatus exceeding a predetermined threshold value; (ii) change in an image pattern in the displayed imaged data exceeding a predetermined threshold value; and (iii) change in elevation of the portable information apparatus exceeding a predetermined threshold value.
6. Claims 1 and 18 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 27 and 32, respectively, of Inoue U.S. Patent No. 9,420,251.
Although the claims at issue are not identical, they are not patentably distinct from each other because both claim substantially the same features.
Note the following similarities between the application claims and patent claims.
Instant Application No. 18/891,166, Claim 1:
A portable information apparatus of communicating with an information retrieval server via a communication network,
the portable information apparatus comprising:
an imager via which an object is imaged;
a display on which imaged data of the object is displayed;
a locator via which position information about an imaging position of the object is acquired; and a circuitry comprising:
a controller that transmits the position information to the information retrieval server via the communication network such that the position information is used in the information retrieval server to search related information on the object, the controller receives the related information from the information retrieval server via the communication network, and
the controller displays the related information on the object on the display, wherein the related information on the object includes AR information, and
the controller has a mode where a composition image is displayed on the display in which at least a part of the AR information is overlapped with the imaged data of the object.
Inoue U.S. Patent No. 9,420,251, Claim 27:
An information acquisition system which includes an imaging device and an information retrieval system configured to extract a structure situated in a range of a direction based on latitude and longitude information and an azimuthal angle transmitted from the imaging device, and to transmit information about the extracted structure to the imaging device,
the imaging device comprising: a body;
an imager;
a latitude and longitude detector that detects the latitude and longitude information of the body; an azimuthal angle detector which detects the azimuthal angle by which image data is imaged by the imager and
a hardware computer configured to:
acquire augmented reality (AR) information about the structure imaged by the imager based on the latitude and longitude information and the azimuthal angle,
the computer being operable in an AR display mode in which a real-time image acquired by the imager and the augmented reality information are displayed on a display of the imaging device; and
output a hold control signal upon detecting a predetermined user action while the computer is operating in the AR display mode, the hold control signal causing the computer to hold, on the display, the real-time image and the augmented reality information displayed in the AR display mode and to continue displaying the held real-time image data and the held augmented reality information on the display after completion of the predetermined user action, the AR information including at least one AR object, the computer being configured such that parameters of (a) contents of the real-time image displayed on the display, (b) the number of the at least one AR object displayed on the display, and (c) an arrangement of the at least one AR object with respect to the real-time image displayed on the display, immediately before the user action are respectively the same as those after completion of the user action, and
the information retrieval system comprising: a database which stores (i) a map data, in which a structure identification number of a structure corresponds to latitude and longitude information of the structure, and (ii) a structure table, in which the structure identification number corresponds to the augmented reality information of the structure shown by the structure identification number; and
an information retrieval server which (a) retrieves, from the map data, the structure identification number of the structure situated in the range of the direction based on the latitude and longitude information and the azimuthal angle transmitted from the imaging device, (b) reads the augmented reality information added to the structure shown by the structure identification number from the structure table, and (c) transmits the augmented reality information of the read structure to the imaging device.
Instant Application No. 18/891,166, Claim 18:
A non-transitory computer-readable recording medium on which is recorded a program that causes a computer to perform:
acquire position information about an imaging position of an object that is imaged by an imaging apparatus of a portable information apparatus;
transmit the position information to an information retrieval server via a communication network such that the position information is used in the information retrieval server to search related information on the object;
receive the related information from the information retrieval server via the communication network; and
display the related information on the object on a display such that a composition image is displayed on the display in which at least a part of AR information is overlapped with the imaged data of the object, the related information on the object including the AR information.
Inoue U.S. Patent No. 9,420,251, Claim 32:
A non-transitory computer-readable storage medium that stores a program that allows a computer to execute functions of an imaging device so that the computer executes the steps comprising:
inputting positional information of a position, which is detected by a positional information acquirer, of a subject, image data of which is imaged by an imager of the imaging device;
acquiring information on the subject based on the positional information;
AR-displaying a real-time image acquired by the imager and the information on the subject on a display; and outputting a hold control signal based on a user action during the AR-displaying, the hold control signal causing holding, on the display, of the real-time image and the information on the subject and continued displaying of the held real-time image data and the held information on the subject on the display after completion of the user action, the displayed information on the subject including at least one AR object, the holding being performed such that parameters of (a) contents of the image data of the subject displayed on the display, (b) the number of the at least one AR object displayed on the display, and (c) an arrangement of the at least one AR object with respect to the image data of the subject displayed on the display, immediately before the user action are respectively the same as those after completion of the user action.
Claim Rejections - 35 USC § 102
7. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
8. The following is a quotation of the appropriate paragraphs of pre-AIA 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(b) the invention was patented or described in a printed publication in this or a foreign country or in public use or on sale in this country, more than one year prior to the date of application for patent in the United States.
9. Claims 1, 5, 18 and 19 are rejected under pre-AIA 35 U.S.C. 102(b) as being anticipated by Lee et al. (US-PGPUB 2008/0147733).
Regarding claim 1, Lee discloses a portable information apparatus (Mobile Device 110; see fig. 1) of communicating with an information retrieval server (Servers 120, 130, 140; see figure 1) via a communication network (Cellular infrastructure 105; see fig. 1 and paragraph 0030), the portable information apparatus comprising:
an imager via which an object is imaged (Camera 112; see fig. 3 and paragraph 0038);
a display on which imaged data of the object is displayed (User interface 119; see paragraph 0038);
a locator via which position information about an imaging position of the object is acquired (Locator 114 that identifies the location; see fig. 3 and paragraph 0038); and
a circuitry comprising:
a controller (Processor 118; see fig. 3 and paragraph 0038) that transmits the position information to the information retrieval server via the communication network such that the position information is used in the information retrieval server to search related information on the object (At step 206, the mobile device 110 can send a packet of information 117 containing the street-level image 121, a location of the mobile device 110, a camera setting, and a compass heading to the image server 120; see fig. 2 and paragraph 0040),
the controller receives the related information from the information retrieval server via the communication network (At step 208, location specific information associated with the at least one object in the image can be retrieved in response to the recognizing. The image server, upon recognizing the objects in the image, can send back a packet of information related to the object; see fig. 2 and paragraph 0041), and
the controller displays the related information on the object on the display, wherein the related information on the object includes AR information, and the controller has a mode where a composition image is displayed on the display in which at least a part of the AR information is overlapped with the imaged data of the object (At step 210, the location specific information can be overlaid onto the image. Mobile device 110 can overlay the advertisements 137 onto the captured image. Items 137 can be located at positions in the image corresponding to the building or business associated with the advertisement; see figs. 2, 5 and paragraph 0042).
Regarding claim 5, Lee discloses everything claimed as applied above (see claim 1). In addition, Lee discloses the controller transmits the imaged data to the information retrieval server via the communication network such that the imaged data is used in the information retrieval server to search the related information on the object (Sending the image with the location of the mobile device to an image server that can recognize at least one object in the image. Camera settings can also be sent with the location for narrowing a search of the object in an image database. The image server can respond with location specific information associated with the at least one object given the location; see paragraphs 0029, 0041).
Regarding claim 18, Lee discloses a non-transitory computer-readable recording medium on which is recorded a program that causes a computer (Combination of hardware and software can be a mobile communications device with a computer program that, when being loaded and executed, can control the mobile communications device such that it carries out the methods; see paragraph 0060) to perform:
acquire position information about an imaging position of an object that is imaged by an imaging apparatus of a portable information apparatus (Locator 114 that identifies the location; see fig. 3 and paragraph 0038. The location identifies a coordinate of the device in relation to the image; see fig. 2 and paragraph 0039);
transmit the position information to an information retrieval server via a communication network such that the position information is used in the information retrieval server to search related information on the object (At step 206, the mobile device 110 can send a packet of information 117 containing the street-level image 121, a location of the mobile device 110, a camera setting, and a compass heading to the image server 120; see fig. 2 and paragraph 0040);
receive the related information from the information retrieval server via the communication network (At step 208, location specific information associated with the at least one object in the image can be retrieved in response to the recognizing. The image server, upon recognizing the objects in the image, can send back a packet of information related to the object; see fig. 2 and paragraph 0041); and
display the related information on the object on a display such that a composition image is displayed on the display in which at least a part of AR information is overlapped with the imaged data of the object, the related information on the object including the AR information (At step 210, the location specific information can be overlaid onto the image. Mobile device 110 can overlay the advertisements 137 onto the captured image. Items 137 can be located at positions in the image corresponding to the building or business associated with the advertisement; see figs. 2, 5 and paragraph 0042).
Regarding claim 19, Lee discloses everything claimed as applied above (see claim 18). In addition, Lee discloses the imaged data is transmitted to the information retrieval server via the communication network such that the imaged data is used in the information retrieval server to search the related information on the object (Sending the image with the location of the mobile device to an image server that can recognize at least one object in the image. Camera settings can also be sent with the location for narrowing a search of the object in an image database. The image server can respond with location specific information associated with the at least one object given the location; see paragraphs 0029, 0041).
10. The following is a quotation of the appropriate paragraphs of pre-AIA 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(e) the invention was described in (1) an application for patent, published under section 122(b), by another filed in the United States before the invention by the applicant for patent or (2) a patent granted on an application for patent by another filed in the United States before the invention by the applicant for patent, except that an international application filed under the treaty defined in section 351(a) shall have the effects for purposes of this subsection of an application filed in the United States only if the international application designated the United States and was published under Article 21(2) of such treaty in the English language.
11. Claims 1-11 and 14-21 are rejected under pre-AIA 35 U.S.C. 102(e) as being anticipated by Kankainen (US-PGPUB 2011/0141141).
Regarding claim 1, Kankainen discloses a portable information apparatus (UE 101; see figs. 1-2) of communicating with an information retrieval server (Content mapping platform 103 having elements 109a; 109b; see fig. 1 and paragraphs 0042, 0044) via a communication network (Communication Network 105; see fig. 1), the portable information apparatus comprising:
an imager (Image module 203; see fig. 2 and paragraph 0051) via which an object is imaged (A live image 601; see fig. 6A and paragraphs 0051, 0074);
a display on which imaged data of the object is displayed (see fig. 6A and paragraph 0074);
a locator via which position information about an imaging position of the object is acquired (This information in conjunction with the magnetometer information and location information is utilized in selecting available content items to present navigational information to the user; see paragraphs 0056, 0058, 0060, 0074-0075); and
a circuitry comprising:
a controller that transmits the position information to the information retrieval server via the communication network such that the position information is used in the information retrieval server to search related information on the object (Use geographic coordinate data to retrieve relevant content geo-tagged in the live image; see paragraphs 0021, 0042), the controller receives the related information from the information retrieval server via the communication network, and the controller displays the related information on the object on the display, wherein the related information on the object includes AR information, and the controller has a mode where a composition image is displayed on the display in which at least a part of the AR information is overlapped with the imaged data of the object (The application 107 displays a live view of the current location and augmented content 605 (e.g., "Farragut West Station") with the "A" tab on the live image; see paragraph 0074. Figure 6A shows augmented content 605 overlapped on the live image).
Regarding claim 2, Kankainen discloses everything claimed as applied above (see claim 1). In addition, Kankainen discloses the AR information includes a name of the object (Application 107 displays augmented content 605 ("Farragut West Station", which is a name of the captured object) with the "A" tab on the live image; see paragraph 0074), a description of the object (The augmented content data includes ratings as a description of the captured object; see paragraph 0028), and post information related to the object (The augmented content data includes hours and prices for the captured location as post information; see paragraphs 0028, 0022).
Regarding claim 3, Kankainen discloses everything claimed as applied above (see claim 2). In addition, Kankainen discloses the controller displays the AR information on the display as a tag (Application 107 displays the augmented content with the "A" tab on the live image; see paragraph 0074).
Regarding claim 4, Kankainen discloses everything claimed as applied above (see claim 1). In addition, Kankainen discloses the controller changes between a display mode in which the related information on the object is displayed on the display and a non-display mode (The mapping and augmented reality application 107 allows the user to switch between the live image (fig. 6B) and a combined image of augmented content 605 on the live image (see fig. 6A); see paragraphs 0074-0075).
Regarding claim 5, Kankainen discloses everything claimed as applied above (see claim 1). In addition, Kankainen discloses the controller transmits the imaged data to the information retrieval server via the communication network such that the imaged data is used in the information retrieval server to search the related information on the object (Control logic 201 interacts with a location module 205 to retrieve location data of the current location of the UE 101. The location data can include addresses, geographic coordinates (e.g., GPS coordinates) or other indicators (e.g., longitude and latitude information) that can be associated with the current location. Use geographic coordinate data to retrieve relevant content geo-tagged in the live image. Application 107 requests mapping and content data from the content mapping platform 103; see paragraphs 0021, 0042, 0052, 0050).
Regarding claim 6, Kankainen discloses everything claimed as applied above (see claim 5). In addition, Kankainen discloses the circuitry further comprises a hold controller that outputs a control signal to the controller to hold the imaged data of the object (A property of the augmented reality user interface is that the display follows the movement and pointing of the UE 101. When the user has found and is displaying a favorite viewpoint, the user may wish to "lock" or fix the display at a particular viewpoint without having to maintain the UE 107 in the same position; see paragraphs 0071, 0077).
Regarding claim 7, Kankainen discloses everything claimed as applied above (see claim 6). In addition, Kankainen discloses the hold controller outputs the control signal to the controller to hold the imaged data of the object and the related information on the object (The user can fix the viewpoint of the correlated prerecorded panoramic image so as to save a POI in the system 100. Application 107 can set the building as a POI with the POI button and the POI icon in the live image; see paragraphs 0071, 0077).
Regarding claim 8, Kankainen discloses everything claimed as applied above (see claim 5). In addition, Kankainen discloses the locator has an azimuth sensor, and the position information includes azimuth information (Magnetometer module 211 which determines horizontal orientation or directional heading of the UE 101; see paragraph 0056. An accelerometer module 213 which determines vertical orientation or an angle of elevation of the UE 101; see paragraphs 0056, 0074-0075).
Regarding claim 9, Kankainen discloses everything claimed as applied above (see claim 8). In addition, Kankainen discloses the locator has a GPS, and the position information includes latitude and longitude information (Control logic 201 interacts with a location module 205 to retrieve location data of the current location of the UE 101. The location data includes geographic coordinates (e.g., GPS coordinates) and other indicators (e.g., longitude and latitude information) that can be associated with the current location; see paragraphs 0052, 0053).
Regarding claim 10, Kankainen discloses everything claimed as applied above (see claim 9). In addition, Kankainen discloses the controller transmits angle of view information on the image of the object to the information retrieval server via the communication network (Magnetometer module 211 which determines horizontal orientation or directional heading of the UE 101; see paragraph 0056. An accelerometer module 213 which determines vertical orientation or an angle of elevation of the UE 101; see paragraphs 0056, 0074-0075), and the angle of view information is used in the information retrieval server to search the related information on the object (UE 101 retrieves content information and mapping information from a content mapping platform 103 via a communication network 105; see paragraph 0042 and figs. 6A-6B).
Regarding claim 11, Kankainen discloses everything claimed as applied above (see claim 5). In addition, Kankainen discloses the object includes a structure, and the structure includes a store and the related information includes commercial information and coupon information on the store (The surroundings of the Station contain points of interest (e.g., restaurants) tagged with augmented content (e.g., cuisines, ratings, prices, hours, etc.), and the user can see the augmented content in a live image view; see paragraph 0028).
Regarding claim 14, Kankainen discloses everything claimed as applied above (see claim 5). In addition, Kankainen discloses a memory in which the imaged data and the position information are stored (Correlating at least one live image with a prerecorded panoramic image, when a location of a device used to capture the at least one live image matches a location of a device used to capture the panoramic prerecorded image; see paragraph 0003), wherein the controller displays a plurality of images (Images 601 and 603; see fig. 6A) of the imaged data on the display so that an image of the imaged data related to the object is selected by a user (Application 107 displays a live view of the current location side-by-side with the correlated prerecorded panoramic image of the destination; see paragraph 0074), the controller transmits the position information on the selected image data to the information retrieval server via the communication network (The user can select a viewpoint and save it as POI in the system; see paragraphs 0071 and 0077).
Regarding claim 15, Kankainen discloses everything claimed as applied above (see claim 14). In addition, Kankainen discloses the controller has a mode where a plurality of thumbnail images of the respective imaged data is displayed on the display (The content information can include occupants/shops/facilities located in the POI such as thumbnail images; see paragraph 0040).
Regarding claim 16, Kankainen discloses everything claimed as applied above (see claim 14). In addition, Kankainen discloses the object includes a structure, and the controller acquires an imaged data of a map including the structure from the information retrieval server via the communication network and has a mode where a guidance route and a current position of a user are overlapped with the imaged data of the map and are displayed on the display (The application 107 shows in the map 500 a tab "A" 501 to the current location, a tab "B" 503 to the destination location, and dots for turning points. As the user continues walking along the route, the application 107 displays similar contents for the turning points located on the route. The application 107 further provides the user a navigation arrow 683 to order a switch between a live image view and a correlated prerecorded panoramic image of the current location and/or a POI; see paragraphs 0073-0079 and figs. 5, 6A-6H).
Regarding claim 17, Kankainen discloses everything claimed as applied above (see claim 14). In addition, Kankainen discloses the controller has a mode where the imaged data of the object and the AR information are uploaded to a website including a Social Networking Service (The content can be provided by the service platform 111, which includes one or more services 113a-113n (social networking service), the one or more content providers 115a-115m, or other content sources available or accessible over the communication network 105; see paragraph 0042. The POIs can be pre-set by users, service providers (social network), and/or device manufacturers, and the relevant content can be embedded/tagged by any one of a combination of these entities as well; see paragraph 0039).
Regarding claim 18, Kankainen discloses a non-transitory computer-readable recording medium on which is recorded a program that causes a computer (Techniques are performed by computer system 700 in response to processor 702 executing one or more sequences of one or more processor instructions contained in memory 704; see paragraphs 0096, 0004) to perform:
acquire position information about an imaging position of an object that is imaged by an imaging apparatus of a portable information apparatus (This information in conjunction with the magnetometer information and location information is utilized in selecting available content items to present navigational information to the user; see paragraphs 0056, 0058, 0060, 0074-0075);
transmit the position information to an information retrieval server via a communication network such that the position information is used in the information retrieval server to search related information on the object (Use geographic coordinate data to retrieve relevant content geo-tagged in the live image; see paragraphs 0021, 0042);
receive the related information from the information retrieval server via the communication network; and display the related information on the object on a display such that a composition image is displayed on the display in which at least a part of AR information is overlapped with the imaged data of the object, the related information on the object including the AR information (The application 107 displays a live view of the current location and augmented content 605 (e.g., "Farragut West Station") with the "A" tab on the live image; see paragraph 0074. Figure 6A shows augmented content 605 overlapped on the live image).
Regarding claim 19, Kankainen discloses everything claimed as applied above (see claim 18). In addition, Kankainen discloses the imaged data is transmitted to the information retrieval server via the communication network such that the imaged data is used in the information retrieval server to search the related information on the object (Control logic 201 interacts with a location module 205 to retrieve location data of the current location of the UE 101. The location data can include addresses, geographic coordinates (e.g., GPS coordinates) or other indicators (e.g., longitude and latitude information) that can be associated with the current location. Use geographic coordinate data to retrieve relevant content geo-tagged in the live image. Application 107 requests mapping and content data from the content mapping platform 103; see paragraphs 0021, 0042, 0052, 0050).
Regarding claim 20, Kankainen discloses everything claimed as applied above (see claim 19). In addition, Kankainen discloses the imaged data of the object and the related information on the object are held (A property of the augmented reality user interface is that the display follows the movement and pointing of the UE 101. When the user has found and is displaying a favorite viewpoint, the user may wish to "lock" or fix the display at a particular viewpoint without having to maintain the UE 107 in the same position; see paragraphs 0071, 0077).
Regarding claim 21, Kankainen discloses everything claimed as applied above (see claim 20). In addition, Kankainen discloses the imaged data and the position information are stored in a memory (Correlating at least one live image with a prerecorded panoramic image, when a location of a device used to capture the at least one live image matches a location of a device used to capture the panoramic prerecorded image; see paragraph 0003), a plurality of images of the imaged data on the display is displayed (Images 601 and 603; see fig. 6A) so that an image of the imaged data related to the object is selected by a user (Application 107 displays a live view of the current location side-by-side with the correlated prerecorded panoramic image of the destination; see paragraph 0074), and the position information on the selected image data is transmitted to the information retrieval server via the communication network (The user can select a viewpoint and save it as POI in the system; see paragraphs 0071 and 0077).
Claim Rejections - 35 USC § 103
12. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
13. The following is a quotation of pre-AIA 35 U.S.C. 103(a) which forms the basis for all obviousness rejections set forth in this Office action:
(a) A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. Patentability shall not be negated by the manner in which the invention was made.
14. Claims 12-13 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Kankainen in view of Lee et al. (US-PGPUB 2008/0147730).
Regarding claim 12, Kankainen discloses everything claimed as applied above (see claim 5). However, Kankainen does not expressly disclose an input portion via which information is input by a user, wherein the post information is input by the user via the input portion.
On the other hand, Lee discloses an input portion via which information is input by a user, wherein the post information is input by the user via the input portion (Location specific information can include notes or messages left by other individuals. Upon receiving the advertisements, a user can provide comments regarding the advertisement. The user can upload the comments to the mobility manager 115 which can then share the comments with other users. If a second user takes a picture at the same location, with similar buildings or businesses identified, the second user can be provided with the feedback from the first user; see paragraphs 0043-0044).
Therefore, it would have been obvious to one of ordinary skill in the art at the time the invention was made to combine the teachings of Kankainen and Lee to provide an input portion via which information is input by a user, wherein the post information is input by the user via the input portion, for the purpose of easily updating promotional information related to the captured objects.
Regarding claim 13, Kankainen and Lee disclose everything claimed as applied above (see claim 12). In addition, Kankainen discloses the post information is input with image data and text information, and the input text information is displayed on the display (The surroundings of the Station contain points of interest (e.g., restaurants) tagged with augmented content (e.g., cuisines, ratings, prices, hours, etc.), and the user can see the augmented content in a live image view; see paragraphs 0028, 0061, 0022).
Contact Information
15. Any inquiry concerning this communication or earlier communications from the examiner should be directed to CYNTHIA CALDERON whose telephone number is (571)270-3580. The examiner can normally be reached M-F 9:00 AM-5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, TWYLER HASKINS, can be reached at (571)272-7406. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CYNTHIA CALDERON/Primary Examiner, Art Unit 2639 12/19/2025