Prosecution Insights
Last updated: April 19, 2026
Application No. 17/653,749

METHOD AND APPARATUS FOR ROUTE GUIDANCE USING AUGMENTED REALITY VIEW

Status: Final Rejection (§103)
Filed: Mar 07, 2022
Examiner: ALLEN, PAUL MCCARTHY
Art Unit: 3669
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Naver Labs Corporation
OA Round: 4 (Final)

Grant Probability: 44% (Moderate)
Est. OA Rounds: 5-6
Est. Time to Grant: 3y 6m
Grant Probability with Interview: 79%

Examiner Intelligence

Career Allow Rate: 44% (grants 44% of resolved cases; 80 granted / 180 resolved; -7.6% vs TC avg)
Interview Lift: +35.0% (strong; allowance rate with vs. without an interview, among resolved cases with an interview)
Avg Prosecution: 3y 6m (typical timeline)
Currently Pending: 40
Total Applications: 220 (career history, across all art units)

Statute-Specific Performance

§101: 16.8% (-23.2% vs TC avg)
§103: 36.4% (-3.6% vs TC avg)
§102: 9.4% (-30.6% vs TC avg)
§112: 34.7% (-5.3% vs TC avg)

Deltas are vs. the Tech Center average estimate. Based on career data from 180 resolved cases.
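As a sanity check on the headline figures, a minimal sketch of how they relate (the variable names and the implied TC-average derivation are assumptions, not dashboard fields):

```python
# Sanity-check the headline examiner statistics reported above.
# Figures come from the dashboard; the TC average is implied by the
# reported -7.6% delta, not reported directly.

granted = 80
resolved = 180

career_allow_rate = granted / resolved        # ~0.444, shown as 44%

# A -7.6% delta vs the Tech Center average implies a TC average
# allowance rate of roughly 52%.
implied_tc_avg = career_allow_rate + 0.076

print(f"Career allow rate: {career_allow_rate:.1%}")   # 44.4%
print(f"Implied TC avg: {implied_tc_avg:.1%}")         # 52.0%
```

The same arithmetic applies to the statute-specific deltas below each percentage.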

Office Action

§103
DETAILED ACTION

Introduction

Claims 1-3 and 6-20 have been examined in this application. Claims 1, 18, and 19 are amended. Claims 2, 3, 7, 11, 12, 14, 16, and 17 are original. Claims 6, 8-10, 13, and 15 are as previously presented. Claim 20 is new. Claims 4 and 5 are cancelled. This is a final office action in response to the arguments and amendments filed 1/12/2026.

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Office Action Formatting

The following is an explanation of the formatting used in the instant Office Action:

• [0001] – Indicates a paragraph number in the most recent, previously cited source;
• [0001, 0010] – Indicates multiple paragraphs (in example: paragraphs 1 and 10) in the most recent, previously cited source;
• [0001-0010] – Indicates a range of paragraphs (in example: paragraphs 1 through 10) in the most recent, previously cited source;
• 1:1 – Indicates a column number and a line number (in example: column 1, line 1) in the most recent, previously cited source;
• 1:1, 2:1 – Indicates multiple column and line numbers (in example: column 1, line 1 and column 2, line 1) in the most recent, previously cited source;
• 1:1-10 – Indicates a range of lines within one column (in example: all lines spanning, and including, lines 1 and 10 in column 1) in the most recent, previously cited source;
• 1:1-2:1 – Indicates a range of lines spanning several columns (in example: column 1, line 1 to column 2, line 1 and including all intervening lines) in the most recent, previously cited source;
• p. 1, ln. 1 – Indicates a page and line number in the most recent, previously cited source;
• ¶1 – The paragraph symbol is used solely to refer to Applicant's own specification (further example: p. 1, ¶1 indicates first paragraph of page 1); and
• BRI – the broadest reasonable interpretation.
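The citation shorthand defined above is regular enough to machine-check. A minimal classifier for the paragraph and column:line forms (the patterns and names below are illustrative assumptions, not part of the Office Action):

```python
import re

# Classify the examiner's citation shorthand described above.
# Order matters: more specific patterns are tried first.
PATTERNS = {
    "paragraph_range": re.compile(r"^\[\d{4}-\d{4}\]$"),      # [0001-0010]
    "paragraph_list":  re.compile(r"^\[\d{4}(, \d{4})+\]$"),  # [0001, 0010]
    "paragraph":       re.compile(r"^\[\d{4}\]$"),            # [0001]
    "col_line_span":   re.compile(r"^\d+:\d+-\d+:\d+$"),      # 1:1-2:1
    "col_line_range":  re.compile(r"^\d+:\d+-\d+$"),          # 1:1-10
    "col_line":        re.compile(r"^\d+:\d+$"),              # 1:1
}

def classify(cite: str) -> str:
    """Return the shorthand category of a citation string."""
    for name, pattern in PATTERNS.items():
        if pattern.match(cite):
            return name
    return "other"  # e.g. "p. 1, ln. 1" or "¶1" forms

print(classify("[0086]"))   # paragraph
print(classify("1:1-2:1"))  # col_line_span
```

Since Python 3.7, dict insertion order is preserved, so the range forms win over the single-value forms they would otherwise shadow.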
Priority

Acknowledgment is made of applicant's claim for foreign priority based on application KR10-2021-0030967 filed in The Republic of Korea on 03/09/2021. Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Response to Arguments

Applicant's arguments, filed 1/12/2026, have been fully considered. Regarding the arguments pertaining to the claim rejections under 103 (presented on p. 9-10 under the heading “REMARKS”), the arguments and amendments are persuasive. Therefore, the rejections have been withdrawn. However, upon further consideration, a new grounds of rejection is made in view of the additional prior art of US2021/0003414A1 (Yamaguchi et al.), US2019/0301886A1 (Elangovan et al.), and US2020/0378779A1 (Yu), as well as the previously relied upon prior art of US2016/0232788A1 (Byun), US2015/0221220A1 (Arai et al.), US2021/0102820A1 (Le et al.), KR20160146384A (Lee Chang), US2023/0314154A1 (Nozaki), US2019/0017839A1 (Eyler et al.), US2021/0078503A1 (Horihata et al.), US2021/0192787A1 (Jung et al.), and US2019/0049266A1 (Takakura et al.).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1, 6, 8, 18, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Publication US2021/0003414A1 (Yamaguchi et al.) in view of Publication US2016/0232788A1 (Byun), further in view of Publication US2019/0301886A1 (Elangovan et al.).

Regarding Claim 1, Yamaguchi et al. discloses a route guidance method (see Figure 12) performed by a user terminal (see [0082] by display apparatus 1, [0028, 0030, 0038] computer in vehicle), comprising: acquiring a route from a source to a destination (see [0083] acquire planned path, a path necessarily having start/end); and providing route guidance from the source to the destination through an augmented reality (AR) view (see [0086-0087] display path guidance using superimposed information) that includes an image of a field of view (FOV) (see Figure 2, [0031, 0034] projection area with field of view of occupant), based on the route (see Figure 12, display at S15 or S17 based on path from S11), wherein the route includes at least one turn point (see [0053] “change position” of the route such as turn or intersection), and the providing of the route guidance comprises, in response to the user terminal moving toward the destination based on the route (see Figure 12, [0084] S12 advancing along route): in response to a first turn point approached by the user terminal among the at least one turn point being included in the FOV (see [0086] “yes” at S16 for route change position in the display area, proceed to S17), displaying on the image a first point indicator (see [0087] adjust display of guidance information such as that seen in Figures 9B, 10B, or 11B) that indicates a location of the first turn point on the route (see Figures 9B, 10B, or 11B, curved lines/marker pattern indicates
location(s) of turn) while a first instruction indicator that instructs a movement of the user terminal towards the first turn point is not displayed (see Figures 9A, 10A, 11A, straight lines / pattern that instruct in the direction of the turn/intersection are no longer displayed), and in response to the first turn point being located outside the FOV (see [0086] “no” at S16 for change position not in display area), displaying on the image the first instruction indicator (see [0084], Figures 9A, 10A, 11A, straight lines / pattern that instruct in the direction of the turn/intersection being displayed) while the first point indicator is not displayed (see Figure 12, for “no” at S16, adjusted/curved guidance not yet displayed).

Yamaguchi et al. does not explicitly recite: acquiring, from a remote server, a route from a source to a destination set by a user of the user terminal. However, Byun teaches a technique for acquiring a vehicle route, comprising: acquiring, from a remote server, a route from a source to a destination set by a user of the user terminal (see [0038-0040]). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to modify the route of Yamaguchi et al. to be obtained in the manner taught by Byun, with a reasonable expectation of success, with the motivation of improving route finding by incorporating a centralized source of data such as traffic information and route conditions (see Byun [0037]).

Yamaguchi et al. does not explicitly recite an augmented reality (AR) view: that includes an image of a field of view (FOV) captured by a camera of the user terminal. However, Elangovan et al.
teaches a technique of displaying augmented navigation information for a user terminal (see Claim 1, [0109], processor and display in vehicle), wherein the view: that includes an image of a field of view (FOV) captured by a camera of the user terminal (see [0109] AR indications of a navigation route rendered over camera video stream). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to modify the augmented reality view using a HUD of Yamaguchi et al. to additionally or alternatively use an image from a camera as taught by Elangovan et al., with a reasonable expectation of success, with the motivation of enhancing the robustness and flexibility of the system to display information on in-vehicle displays and improve user experience by reducing image errors (see Elangovan et al., [0001-0002]). Regarding Claim 6, Yamaguchi et al. discloses the route guidance method of claim 1, wherein the first instruction indicator includes a first element that indicates a direction from a current location of the user terminal to the first turn point (see Figure 9A, [0068] arrows 32 in a line/pattern which “extends… toward the direction in which the left turn exists”) and a second element that connects from the first element to the first turn point (see Figure 9A, [0068] strip 31 extending from elements of 32 toward turn point). Regarding Claim 8, Yamaguchi et al. discloses wherein the first instruction indicator is displayed to direct to the first turn point (see Figure 9A). Yamaguchi et al. does not explicitly recite the route guidance method of claim 1, wherein the first instruction indicator is displayed to direct to the first turn point according to a rotation of the camera. However, Elangovan et al. 
teaches the technique as above, wherein navigation information is displayed according to a rotation of the camera (see [0109] rendering based on camera pose which includes orientation [0117] rotation). The motivation to combine Yamaguchi et al. and Elangovan et al. was provided in the rejection of Claim 1.

Regarding Claim 18, Yamaguchi et al. discloses a route guidance method (see Figure 12) performed by a user terminal (see [0082] by display apparatus 1, [0028, 0030, 0038] computer in vehicle), comprising: acquiring a route from a source to a destination (see [0083] acquire planned path, a path necessarily having start/end); and providing route guidance from the source to the destination through an augmented reality (AR) view (see [0086-0087] display path guidance using superimposed information) that includes an image of a field of view (FOV) (see Figure 2, [0031, 0034] projection area with field of view of occupant), based on the route (see Figure 12, display at S15 or S17 based on path from S11), wherein the route includes a plurality of turn points (see [0053] “change position” of the route such as turn or intersection), and the providing of the route guidance comprises: in response to the user terminal moving toward the destination based on the route (see Figure 12, [0084] S12 advancing along route): in response to a first turn point approached by the user terminal among the plurality of turn points being included in the FOV (see [0086] “yes” at S16 for route change position in the display area, proceed to S17), displaying on the image a first point indicator (see [0087] adjust display of guidance information such as that seen in Figures 9B, 10B, or 11B) that indicates a location of the first turn point on the route (see Figures 9B, 10B, or 11B, curved lines/marker pattern indicates location(s) of turn) while a first instruction indicator that instructs a movement of the user terminal towards the first turn point is not displayed (see Figures 9A, 10A, 11A, straight lines / pattern that instruct in the direction of the turn/intersection are no longer displayed), in response to the first turn point being located outside the FOV (see [0086] “no” at S16 for change position not in display area), displaying on the image the first instruction indicator (see [0084], Figures 9A, 10A, 11A, straight lines / pattern that instruct in the direction of the turn/intersection being displayed) while the first point indicator is not displayed (see Figure 12, for “no” at S16, adjusted/curved guidance not yet displayed), and displaying a second instruction indicator that instructs a movement towards a second turn point to which the user terminal is to move after the first turn point (see Figure 12, for a subsequent iteration of S15 in the process after “no” at S18, at a time when the vehicle has driven through the first turn point), without displaying the first instruction indicator (see Figure 9A, [0084] instruction indicator only to upcoming turn (not previous turn) being shown), responsive to a distance from the user terminal to the first turn point being a predetermined value or less (see Figure 12, [0084] responsive to “yes” at S12, change position within 100 or 200 meters) and the second turn point being located outside the FOV (see Figure 12 for “no” at S16).

Yamaguchi et al. does not explicitly recite the route from a source to a destination: set by a user of the user terminal. However, Byun teaches a technique for acquiring a vehicle route: set by a user of the user terminal (see [0038-0040]). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to modify the route of Yamaguchi et al. to be obtained in the manner taught by Byun, with a reasonable expectation of success, with the motivation of improving route finding by incorporating a centralized source of data such as traffic information and route conditions (see Byun [0037]). Yamaguchi et al.
does not explicitly recite an augmented reality (AR) view: that includes an image of a field of view (FOV) captured by a camera of the user terminal. However, Elangovan et al. teaches a technique of displaying augmented navigation information for a user terminal (see Claim 1, [0109], processor and display in vehicle), wherein the view: that includes an image of a field of view (FOV) captured by a camera of the user terminal (see [0109] AR indications of a navigation route rendered over camera video stream). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to modify the augmented reality view using a HUD of Yamaguchi et al. to additionally or alternatively use an image from a camera as taught by Elangovan et al., with a reasonable expectation of success, with the motivation of enhancing the robustness and flexibility of the system to display information on in-vehicle displays and improve user experience by reducing image errors (see Elangovan et al., [0001-0002]). Regarding Claim 19, all limitations as recited have been analyzed with respect to Claim 1. Claim 19 pertains to an apparatus corresponding to the method of Claim 1. Claim 19 does not teach or define any new limitations beyond Claim 1, and therefore is rejected under the same rationale. Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Publication US2021/0003414A1 (Yamaguchi et al.) in view of Publication US2016/0232788A1 (Byun), further in view of Publication US2019/0301886A1 (Elangovan et al.), further in view of Publication US2015/0221220A1 (Arai et al.). Regarding Claim 2, Yamaguchi et al. discloses wherein: the first turn point is a point to which the user terminal is to move from a current location in order to move toward the destination (see [0053] position at which there is curve or turn, as part of the path, and Figure 12, [0084], vehicle approaches). Yamaguchi et al. 
does not explicitly recite the route guidance method of claim 1, wherein each of the at least one turn point is a point at which a turn of a desired angle or more by the user terminal is required on the route. However, Arai et al. teaches a technique in determining turn-based navigation (see [0039-0040] determination of guide target intersections where a vehicle travels into and exits and intersection), wherein each of the at least one turn point is a point at which a turn of a desired angle or more by the user terminal is required on the route (see [0040] intersections for which the absolute value of a turning angle is equal to or more than a threshold as the guide target intersections). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to modify the turn points of Yamaguchi et al. to include points of a turn of a desired angle or more, as taught by Arai et al., with a reasonable expectation of success, with the motivation of improving user comfort and the intuitiveness of navigation (see Arai et al., [0003]). Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Publication US2021/0003414A1 (Yamaguchi et al.) in view of Publication US2016/0232788A1 (Byun), further in view of Publication US2019/0301886A1 (Elangovan et al.), further in view of Published Application US2021/0102820A1 (Le et al.). Regarding Claim 3, Yamaguchi et al. does not explicitly recite the route guidance method of claim 1, wherein the displaying of the first point indicator comprises displaying a distance from the user terminal to the first turn point on the first point indicator. However, Le et al. 
teaches a technique in augmented reality navigation (see [0021]), wherein the displaying of the first point indicator (see [0032] a POI object that can represent an instruction to make a turn) comprises displaying a distance from the user terminal to the first turn point on the first point indicator (see [0032] Figure 1B e.g. 0.09 mile indicator). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to modify the point indicator of Yamaguchi et al. to further comprise distance as taught by Le et al., with a reasonable expectation of success, with the motivation of expanding the method to other POIs while creating an intuitive user experience (see Le et al., [0002, 0021-0022]). Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Publication US2021/0003414A1 (Yamaguchi et al.) in view of Publication US2016/0232788A1 (Byun), further in view of Publication US2019/0301886A1 (Elangovan et al.), further in view of Publication KR20160146384A (Lee Chang) (Original copy IDS 9/19/2022, English translation 11/25/2024 relied upon for citations). Regarding Claim 7, Yamaguchi et al. discloses wherein: the second element includes a plurality of dots or a line that connects from the first element to the first turn point (see Figure 9A, strip 31 has boundary (i.e. line extending toward the turn point, and overlapping first element)). Yamaguchi et al. further discloses the first element including an arrow that indicates a direction of the upcoming turn (see Figure 9A, elements 32 being arrows) but does not explicitly recite the route guidance method of claim 6, wherein the first element includes an arrow: that indicates the direction from the current location of the user terminal to the first turn point. 
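As an aside, the either/or display rule that drives the Claim 1 analysis, together with Claim 3's distance label, can be sketched in a few lines (all names and the return shape are hypothetical; this is not code from the application or the cited art):

```python
from dataclasses import dataclass

@dataclass
class GuidanceState:
    turn_in_fov: bool   # is the first turn point inside the camera FOV?
    distance_m: float   # distance from the user terminal to the first turn point

def select_indicator(state: GuidanceState) -> dict:
    """Claim 1's either/or display rule, with Claim 3's distance label.

    Exactly one indicator is shown at a time: the point indicator when the
    turn point is in the FOV, the instruction indicator otherwise.
    """
    if state.turn_in_fov:
        return {"point_indicator": True,
                "instruction_indicator": False,
                "distance_label_m": state.distance_m}  # Claim 3: distance on the indicator
    return {"point_indicator": False,
            "instruction_indicator": True}

print(select_indicator(GuidanceState(turn_in_fov=True, distance_m=145.0)))
```

The mutual exclusion ("while ... is not displayed") is the pivot of the rejection: the examiner maps it to Yamaguchi's S16 branch rather than to any explicit teaching of suppressing one indicator.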
However, Lee Chang teaches a technique to display augmented reality navigation elements (see Figure 2a), wherein the first element includes an arrow: that indicates the direction from the current location of the user terminal to the first turn point (see Figure 2a, the TBT1 ribbon including arrow elements pointing toward the turn). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to modify the first element of Yamaguchi et al. to instead include arrows indicating the direction to the turn point as taught by Lee Chang, with a reasonable expectation of success. The prior art demonstrates that both variations of the arrow elements are known, and the simple substitution of one element for another would produce the predictable result of displaying different information about the turn. One would be motivated to make the modification with the motivation of improving presentation of directions by clearly displaying information without compromising realism (see Lee Chang, [0004]). Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Publication US2021/0003414A1 (Yamaguchi et al.) in view of Publication US2016/0232788A1 (Byun), further in view of Publication US2019/0301886A1 (Elangovan et al.), further in view of Published Application US2023/0314154A1 (Nozaki). Regarding Claim 9, Yamaguchi et al. does not explicitly recite the route guidance method of claim 1, further comprising: displaying a map view that includes a map corresponding to the image of the AR view, wherein the map view includes the route and a current location of the user terminal, and the first instruction indicator is displayed at a boundary between the map view and the AR view. 
However, Nozaki teaches a technique to display an AR view (see Figure 9, [0080] augmented reality image), comprising: displaying a map view that includes a map corresponding to the image of the AR view (see Figure 9, [0079-0080] map image and AR both showing target point 605), wherein the map view includes the route and a current location of the user terminal (see Figure 9, [0080] route 903 and current location 601), and the first instruction indicator is displayed at a boundary between the map view and the AR view (see Figure 9, the route line intersecting the boundary). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to modify the AR display of Yamaguchi et al. to further include map elements, as taught by Nozaki, with a reasonable expectation of success, with the motivation of enhancing the robustness of the system to provide additional information and improving comfort of the user (see Nozaki, [0025]). Claims 10 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Publication US2021/0003414A1 (Yamaguchi et al.) in view of Publication US2016/0232788A1 (Byun), further in view of Publication US2019/0301886A1 (Elangovan et al.), further in view of Publication US2019/0017839A1 (Eyler et al.). Regarding Claim 10, Yamaguchi et al. does not explicitly recite the route guidance method of claim 1, further comprising displaying the first instruction indicator and a second point indicator that guides to a second turn point or the destination point when the second turn point or the destination point is included in the FOV of the camera. However, Eyler et al. 
teaches a technique in AR navigation (see [0020]), comprising displaying the first instruction indicator (see Figure 8, [0179] element 802 showing movement to a turn) and a second point indicator that guides to a second turn point or the destination point (see Figure 8, [0183] drop off location element 806, a destination for passenger drop off) when the second turn point or the destination point is included in the FOV (see Figure 8, showing a case when the destination point is included in the AR view). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to modify the method of Yamaguchi et al. to further show a destination as taught by Eyler et al., with a reasonable expectation of success, with the motivation of enhancing the robustness and flexibility of the system to be used for transportation service applications while providing a more efficient, enjoyable, and well-informed transportation experience (see Eyler et al., [0005]). Regarding Claim 11, Yamaguchi et al. does not explicitly recite the route guidance method of claim 10, wherein the first instruction indicator and the second point indicator are displayed when a distance from the user terminal to the first turn point is less than a distance from the user terminal to the second turn point or the destination point. However, Eyler et al. teaches the technique as above, wherein the first instruction indicator and the second point indicator are displayed (see Figure 8) when a distance from the user terminal to the first turn point is less than a distance from the user terminal to the second turn point or the destination point (see Figure 8, showing a case of turning and then dropping off, i.e. the first turn point being closer). The motivation to combine Yamaguchi et al. and Eyler et al. was provided in the rejection of Claim 10. Claims 12, 13, and 15 are rejected under 35 U.S.C. 
103 as being unpatentable over Publication US2021/0003414A1 (Yamaguchi et al.) in view of Publication US2016/0232788A1 (Byun), further in view of Publication US2019/0301886A1 (Elangovan et al.), further in view of Published Application US2021/0078503A1 (Horihata et al.). Regarding Claim 12, Yamaguchi et al. does not explicitly recite the route guidance method of claim 1, further comprising changing a display form of the first point indicator when a distance from the user terminal to the first turn point is a desired value or less, wherein the changed first point indicator includes guidance information about a second turn point to which the user terminal is to move after the first turn point in order to move toward the destination among the at least one turn point or a destination point indicating the destination. However, Horihata et al. teaches a technique in route guidance (see Figure 1, [0017, 0021]), comprising changing a display form of the first point indicator when a distance from the user terminal to the first turn point is a desired value or less (see Figure 3A, 3B, [0054], first point indicator changed to include IE2 when distance 100m or less), wherein the changed first point indicator includes guidance information about a second turn point to which the user terminal is to move after the first turn point (see Figure 3b, [0050] point indicator IE2 shows traveling direction based on route, [0047] route with turns/intersections) in order to move toward the destination among the at least one turn point or a destination point indicating the destination (see [0024] route to destination). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to modify the point indicator of Yamaguchi et al. 
to further include an element changed based on distance and indicating a second turn point, as taught by Horihata et al., with a reasonable expectation of success, with the motivation of improving the user experience by presenting intuitive virtual images and reducing feelings of strangeness (see Horihata et al., [0016]).

Regarding Claim 13, Yamaguchi et al. discloses the method further comprising displaying a second instruction indicator that instructs a movement to the second turn point or the destination point through augmentation on the image (see Figure 12, a subsequent iteration of S15 after the vehicle passes the first turn point, and “no” at S18), and displaying the second instruction indicator without displaying the changed first point indicator when the first turn point is not included in the FOV of the camera (see Figure 12, after the vehicle passes through the turn (i.e. turn behind the vehicle, not in FOV), the second instruction indicator will be shown for a future iteration of S15, but not the first point indicator (changed per the combination with Horihata et al. in Claim 12)). Yamaguchi et al. does not explicitly recite the route guidance method of claim 12, wherein the displaying of the second instruction indicator includes, displaying the second instruction indicator and the changed first point indicator when the first turn point is included in the FOV of the camera. However, Horihata et al. teaches the technique as above, wherein the displaying of the second instruction indicator includes, displaying the second instruction indicator and the changed first point indicator when the first turn point is included in the FOV of the camera (see Figure 3B, IE2 (changed first point indicator) and IE3 (second instruction indicator to next turn) both shown). The motivation to combine Yamaguchi et al. and Horihata et al. was provided above in the rejection of Claim 12.

Regarding Claim 15, Yamaguchi et al.
discloses the route guidance method of claim 13, wherein the second instruction indicator is displayed when the second turn point or the destination point is not included in the FOV of the camera (a later iteration of S15 in Figure 12 for a subsequent route change position, after “no” at S16 (second turn point not in FOV of the camera per the combination with Elangovan et al. in the rejection of Claim 1)), and the route guidance method further comprises: displaying a second point indicator that guides to the second turn point or the destination point without displaying the second instruction indicator when the second turn point or the destination point is included in the FOV of the camera (see Figure 12, the second iteration of S17 showing second point indicator when “yes” at S16). Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Publication US2021/0003414A1 (Yamaguchi et al.) in view of Publication US2016/0232788A1 (Byun), further in view of Publication US2019/0301886A1 (Elangovan et al.), further in view of Published Application US2021/0078503A1 (Horihata et al.), further in view of Publication KR20160146384A (Lee Chang) (Original copy IDS 9/19/2022, English translation 11/25/2024 relied upon for citations). Regarding Claim 14, Yamaguchi et al. discloses wherein the second instruction indicator includes an arrow (see Figure 9A, elements 32) and a plurality of dots or a line that connects from the arrow to the second turn point or the destination point (see Figure 9A, element 31 as line connecting all arrows toward turn point). Yamaguchi et al. does not explicitly recite the route guidance method of claim 13, wherein the arrow: indicates a direction from a current location of the user terminal to the second turn point or the destination point. 
However, Lee Chang teaches a technique to display augmented reality navigation elements (see Figure 2a), wherein the first element includes an arrow: indicates a direction from a current location of the user terminal to the second turn point or the destination point (see Figure 2a, the TBT1 ribbon including arrow elements pointing toward the turn – for a second turn). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, to modify the first element of Yamaguchi et al. to instead include arrows indicating the direction to the turn point as taught by Lee Chang, with a reasonable expectation of success. The prior art demonstrates that both variations of the arrow elements are known, and the simple substitution of one element for another would produce the predictable result of displaying different information about the turn. One would be motivated to make the modification with the motivation of improving presentation of directions by clearly displaying information without compromising realism (see Lee Chang, [0004]). Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over Publication US2021/0003414A1 (Yamaguchi et al.) in view of Publication US2016/0232788A1 (Byun), further in view of Publication US2019/0301886A1 (Elangovan et al.), further in view of Published Application US2021/0192787A1 (Jung et al.). Regarding Claim 16, Yamaguchi et al. does not explicitly recite the route guidance method of claim 1, wherein a location at which the first point indicator is displayed in the image is determined based on a location of a vanishing point of the image. However, Jung et al. 
teaches a technique in AR navigation (see [0009]), wherein a location at which the first point indicator is displayed in the image (see Figure 28(c), [0497], arrow shape displayed on a location of a real road [0006] in the image) is determined based on a location of a vanishing point of the image (see Figure 26, operation of the AR mode (i.e., including determining where to render the indicator) based on a calibration which is based on a vanishing point [0484, 0490]).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Yamaguchi et al. to include a calibration as taught by Jung et al., with a reasonable expectation of success, with the motivation of improving accuracy of the AR device and ensuring proper positioning of objects (see Jung et al., [0006]).

Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over Publication US2021/0003414A1 (Yamaguchi et al.) in view of Publication US2016/0232788A1 (Byun), further in view of Publication US2019/0301886A1 (Elangovan et al.), further in view of Publication US2019/0049266A1 (Takakura et al.).

Regarding Claim 17, Yamaguchi et al. does not explicitly recite the route guidance method of claim 1, wherein the providing of the route guidance comprises searching again for the route to the destination when a location of the user terminal deviates from the route by a predetermined distance or more. However, Takakura et al. teaches a route guidance method, wherein the providing of the route guidance comprises searching again for the route to the destination when a location of the user terminal deviates from the route by a predetermined distance or more (see [0034], when a present location of the vehicle deviates from the route by over a predetermined distance, the route calculation portion 3c recalculates the route [0032] to the destination).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the navigation system of Yamaguchi et al. to include route recalculation, as taught by Takakura et al., with a reasonable expectation of success, with the motivation of improving user experience and the flexibility of the system to respond to unexpected conditions, while outputting information only when necessary (see Takakura et al., [0007]).

Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Publication US2021/0003414A1 (Yamaguchi et al.) in view of Publication US2016/0232788A1 (Byun), further in view of Publication US2019/0301886A1 (Elangovan et al.), further in view of Publication US2020/0378779A1 (Yu).

Regarding Claim 20, Yamaguchi et al. does not explicitly recite the route guidance method of claim 1, further comprising: displaying a map view, on a screen of the user terminal along with the AR view, that includes a map corresponding to the image of the AR view, and wherein the first instruction indicator is displayed overlapping the map view and the AR view at a boundary therebetween. However, Yu teaches a technique to display augmented navigation information (see e.g. [0017]) comprising: displaying a map view, on a screen of the user terminal along with the AR view, that includes a map corresponding to the image of the AR view (see Figure 4, [0035], mini-map shown along with the live view, showing the current location (i.e., corresponding to the same place on which the AR data will be overlaid)), and wherein the first instruction indicator is displayed overlapping the map view and the AR view at a boundary therebetween (see Figures 2, 4, [0035-0036], destination icon 5 overlaps both the AR view and the edge of the mini-map.
Examiner’s note: the destination icon 5 therefore makes up part of the overall “instruction indicator.” The claim does not explicitly require an element of the instruction indicator, such as an arrow or dot, that is on the route, as shown in Applicant’s Figure 9B).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the first instruction indicator of Yamaguchi et al. to additionally include a map as taught by Yu, with a reasonable expectation of success, with the motivation of improving situational awareness while minimizing clutter (see Yu, [0035]).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Paul Allen, whose telephone number is (571) 272-4383. The examiner can normally be reached Monday - Friday from 9am to 5pm, Eastern. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Erin Piateski, can be reached at 571-270-7429. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/P.A./Examiner, Art Unit 3669
/Erin M Piateski/Supervisory Patent Examiner, Art Unit 3669

Prosecution Timeline

Mar 07, 2022: Application Filed
Nov 16, 2024: Non-Final Rejection — §103
Feb 19, 2025: Response Filed
May 12, 2025: Final Rejection — §103
Jul 18, 2025: Response after Non-Final Action
Aug 14, 2025: Request for Continued Examination
Aug 19, 2025: Response after Non-Final Action
Sep 06, 2025: Non-Final Rejection — §103
Jan 12, 2026: Response Filed
Feb 13, 2026: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12565190: CONTROL DEVICE FOR VEHICLE (granted Mar 03, 2026; 2y 5m to grant)
Patent 12560944: CORRELATED MOTION AND DETECTION FOR AIRCRAFT (granted Feb 24, 2026; 2y 5m to grant)
Patent 12554277: CONTROL METHOD FOR LIGHT SOURCES OF VISION MACHINE, AND VISION MACHINE (granted Feb 17, 2026; 2y 5m to grant)
Patent 12516954: ROAD GRAPH GENERATION (granted Jan 06, 2026; 2y 5m to grant)
Patent 12498230: AUTONOMOUS VEHICLE, CONTROL SYSTEM FOR REMOTELY CONTROLLING THE SAME, AND METHOD THEREOF (granted Dec 16, 2025; 2y 5m to grant)
Study what changed to get past this examiner. Based on this examiner's 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 44%
With Interview: 79% (+35.0%)
Median Time to Grant: 3y 6m
PTA Risk: High
Based on 180 resolved cases by this examiner. Grant probability derived from career allow rate.
