DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 2/18/2026 has been entered.
Response to Arguments
Applicant’s arguments filed 2/18/2026, with respect to claims 1, 2, and 4-17, have been fully considered but are moot in view of the new ground(s) of rejection.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1, 2 and 8-17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kim et al. (PGPUB Document No. US 2019/0180485) in view of Beyeler et al. (PGPUB Document No. US 2012/0197696), and further in view of Polidi et al. (PGPUB Document No. US 2002/0138196).
Regarding claim 1, Kim teaches a method for providing an augmented reality (AR) service, the method comprising:
receiving, by an AR service device, information necessary for the AR service from a server, the information collected and processed in the server (communicating with a server (Kim: 0256-0257, 0310) to carry out the AR service (Kim: 0313));
and providing, by the AR service device, the AR service using the information transmitted from the server (the resulting AR service according to the teachings of Kim disclosed above),
wherein the providing is configured such that the AR service device determines a type of an AR object to be output based on information that matches a traveling road which the vehicle is on (information such as 1330a and 1430b that corresponds to the POI (Kim: 0373, 0379, FIG.13A, FIG.14B)).
However, Kim does not expressly teach, but Beyeler teaches,
wherein the receiving is configured such that the AR service device receives the information related to points of interest (POIs) from the server and classifies the information related to POIs received from the server into POIs that match the traveling road and POIs that do not match the traveling road, based on a preset navigation route (classifying POIs based on whether they are along the route (“match the traveling road”) or not (“do not match the traveling road”) (Beyeler: 0043-0044)).
Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the teachings of Kim to provide POI information in the manner taught by Beyeler, because this enables providing useful information to the user that is along the navigation route.
Further, the combined teachings above do not expressly teach, but Polidi teaches, wherein the providing is configured such that the AR service device determines whether to remove overlapping POIs among a plurality of AR objects corresponding to a plurality of POIs viewed from a current location of the vehicle, based on a traveling speed of the vehicle (limiting the number of POIs at higher speeds (Polidi: 0033). Note, the claim does not specify removing POIs only among overlapping POIs, or an explicit step of recognizing overlapping POIs and then limiting the number of POIs only among said overlapping POIs. Therefore, the teachings of Polidi apply, as removing POIs according to Polidi applies equally to both overlapping and non-overlapping POIs).
Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the combined teachings above to filter the number of POIs in the manner suggested by Polidi, because this enables efficient display of POIs that the user can practically absorb/recognize (Polidi: 0033).
Regarding claim 2, the combined teachings above teach the method of claim 1, wherein the receiving is configured such that the AR service device requests the server to transmit information related to POIs present within a predetermined radius based on a current location, and receives the information related to the POIs present within the predetermined radius based on the current location from the server (output a graphic object when the user/vehicle is within a threshold distance (Kim: 0023-0026)).
Regarding claim 8, the combined teachings above teach the method of claim 1, wherein the processing is configured such that the AR service device extracts attribute information of a POI that matches the road on which the vehicle is traveling and overlays an AR object onto an image based on the extracted attribute information of the POI (information such as “name of the destination, a remaining distance to the destination, or an advertisement associated with the destination” are extracted based on the location/vicinity of the user’s vehicle (Kim: 0390)).
Regarding claim 9, the combined teachings above teach the method of claim 8, wherein the processing is configured such that the AR service device determines a type of the AR object based on the attribute information of the POI for which the AR object is to be overlaid, and determines a size of the AR object based on a distance to the POI (“when the destination is approached, the processor 870 enlarges a size of the graphic object that is output” (Kim: 0326)).
Regarding claim 10, the combined teachings above teach the method of claim 1, wherein the processing is configured such that the AR service device displays the AR object in a different way depending on whether or not a traveling speed of the vehicle exceeds a threshold speed (the speed of the vehicle compared to a “specific speed” determines the type of graphic object to be displayed (Kim: 0392)).
Regarding claim 11, the combined teachings above teach the method of claim 1, wherein the processing is configured such that the AR service device determines whether a condition for outputting an AR carpet, which is an AR object shaped like a carpet, is met, and when the condition is met, overlays the AR carpet onto the image (displaying a graphic carpet 1150 expressing a path for the vehicle when within a threshold distance (Kim: 0353, FIG.11B(b))).
Regarding claim 12, the combined teachings above teach the method of claim 1, wherein the processing is configured such that the AR service device displays an AR object corresponding to a POI set as a landmark on an image (information 1212 and 1210 corresponding to the parking lot (Kim: 0368-0369, FIG.12A)),
and outputs detailed information on the landmark received from the server to be overlaid onto the image when the AR object corresponding to the landmark POI output on the image is selected (parking lot is selected through a user interface device according to a driver's request (Kim: 0426)).
Regarding claim 13, the combined teachings above teach the method of claim 1, wherein the processing is configured such that the AR service device, when a POI in the image where an AR object is overlaid is a place corresponding to a destination, varies the AR object depending on a distance to the destination (“when the destination is approached, the processor 870 enlarges a size of the graphic object that is output” (Kim: 0326)).
Regarding claim 14, the combined teachings above teach the method of claim 1, wherein the server transmits AR advertisement information to the AR service device,
the AR advertisement information includes information on an output position and an output format (a graphic object may be an advertisement (Kim: 0390); the position and graphic type used in displaying said advertisement correspond to the output position and format),
and the processing is configured such that the AR service device extracts AR advertisement information mapped to a direction of travel of the vehicle and a road on which the vehicle is traveling, based on the AR advertisement information (graphic carpet 1150 and navigation information 1160 are a factor of the vehicle direction and road (Kim: 0353, FIG.11B(b))),
and renders the extracted AR advertisement information so that an AR advertisement is output at the output position in the display format, by using the extracted AR advertisement information (the resulting graphic objects as displayed in FIG.11B(b) of Kim).
Regarding claim 15, the combined teachings above teach the method of claim 1, wherein the AR service device, when a destination is a place where the vehicle is allowed to approach and the vehicle approaches the destination (the Examiner construes a parking lot with parking spaces available, as indicated in graphic object 1434 of FIG.14C, as corresponding to “allowing” the vehicle to approach said parking lot (Kim: 0358, FIG.14C)), outputs a page related to a service available at the destination to be overlaid onto an image (graphic object 1434 indicates information related to parking spaces available (Kim: 0358, FIG.14C)).
Regarding claim 16, the combined teachings above teach the method of claim 1, wherein the type of the AR object includes at least one of a group POI (graphic objects of the same category displayed in FIG.23B), a mini POI (the Examiner believes limitation “mini” is broad. Therefore, under the broadest reasonable interpretation, the Examiner construes graphic objects 1330a and 1320 of FIG.13A of Kim to be mini POIs), a remote POI, a bubble POI, a brand carpet, a POI not existing on a traveling road, and a 3D object.
Regarding claim 17, the combined teachings above teach the method of claim 1, further comprising: setting a POI display condition when the AR service device determines the type of an AR object to be output based on information that matches the traveling road which the vehicle is on (whether a POI is along the route corresponds to setting a POI display condition as presently claimed (Beyeler: 0043-0044). Further, under the broadest reasonable interpretation, a “type of an AR object” could simply be whether the POI is a type that is to be displayed (along the route) or not (not along the route)).
Claim(s) 3-5 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kim in view of Beyeler and Polidi as applied to the claim(s) above, and further in view of Linn (PGPUB Document No. US 2005/0102102).
Regarding claim 3, the combined teachings above contain a “base” process of presenting POIs to a user driving a vehicle corresponding to the current location of said vehicle (Kim: 0331), of which the claimed invention can be seen as an “improvement” in that the process “classifies the information related to the POIs received from the server into POIs that match the traveling road and POIs that do not match the traveling road, based on a preset navigation route.”
Linn teaches a known technique of a navigation system capable of listing various types of POIs along the route (Linn: 0008, 0038, FIG.17, 0063, 0082). The teachings of Linn are applicable to the “base” process.
Linn’s known technique would have been recognized by one of ordinary skill in the art as applicable to the “base” process of the combined teachings above, and the results would have been predictable, resulting in presenting POI information to the user along a navigation route, which results in an improved process.
Therefore, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
Regarding claim 4, the combined teachings teach the method of claim 3, wherein the receiving is configured such that the AR service device determines types of AR objects that are overlaid on an image captured by a camera of the vehicle, based on information that matches the traveling road (applying the teachings of Linn, as stated in the rejection of claim 3 above, to the AR navigation of Kim enables displaying the POIs of Kim that are along the navigation route).
Regarding claim 5, the combined teachings teach the method of claim 4, wherein the information that matches the traveling road is point of interest (POI) attribute information, and the POI attribute information includes at least one of a POI type, user preference, vehicle driving speed, and a distance from the current location to the POI (POIs classified by distance from the user position (Linn: 0019)).
Claim(s) 6 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kim in view of Beyeler and Polidi as applied to claim(s) 1 above, and further in view of Pozdnyakov (PGPUB Document No. US 2022/0207800) and England (***).
Regarding claim 6, the combined teachings above do not expressly teach but Pozdnyakov teaches the method of claim 1, wherein the processing is configured such that the AR service device outputs, based on a preset method, a plurality of AR objects corresponding to a plurality of POIs viewed from a current location of the vehicle when the plurality of POIs overlaps one another (preventing overlap between labels on a map (Pozdnyakov: 0027)).
Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the combined teachings above to prevent overlap between POI labels as taught by Pozdnyakov, because this enables effectively presenting POI information.
Further, the combined teachings above do not expressly teach, but England teaches, wherein the preset method is configured such that the AR service device determines at least one of a number of real-time exposures to a POI, an adjustment of a POI size, and an exposure time of a POI, based on a traveling speed of the vehicle (the navigation engine 240 uses navigation information from the visitor tracker 225 to determine POIs to which a population of users frequently navigated (England: col. 10, lines 45-48). Note, the claim does not exclude the exposure tracking being one of a plurality of separate processes within the preset method, wherein the exposure tracking of England is not associated with the other steps).
Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the combined teachings above to utilize the POI exposure teachings of England, because this enables an alternative method of customizing the display of POIs along a navigation path.
Claim(s) 7 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kim in view of Beyeler and Polidi as applied to the claim(s) above, and further in view of Kokkas (PGPUB Document No. US 2011/0153198).
Regarding claim 7, the combined teachings above do not expressly teach, but Kokkas teaches, the method of claim 1, wherein the processing is configured such that the AR service device outputs an AR object for a POI closest to the vehicle as a 3D object when receiving, from the server, 3D information regarding the closest POI among a plurality of POIs with AR objects displayed on the image (Kokkas teaches the concept of rendering 3D objects associated with POIs when the 3D object is within the 3D rendering cut-off radius (Kokkas: 0089-0091). Note, objects rendered in 3D within the cut-off radius comprise the 3D objects “closest” to the user).
Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the combined teachings above to display 3D objects associated with POIs in the manner taught by Kokkas, because this enables effective processing of information (Kokkas: 0006).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to David H Chu whose telephone number is (571)272-8079. The examiner can normally be reached M-F: 9:30 - 1:30pm, 3:30-8:30pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Daniel F Hajnik can be reached at (571) 272-7642. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DAVID H CHU/Primary Examiner, Art Unit 2616