Prosecution Insights
Last updated: April 19, 2026
Application No. 18/887,286

Navigation Directions Preview

Non-Final OA §DP
Filed: Sep 17, 2024
Examiner: JACKSON, DANIELLE MARIE
Art Unit: 3657
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Google LLC
OA Round: 1 (Non-Final)
Grant Probability: 80% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 8m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 80% — above average (111 granted / 139 resolved; +27.9% vs TC avg)
Interview Lift: +28.5% for resolved cases with interview
Typical Timeline: 2y 8m average prosecution; 17 currently pending
Career History: 156 total applications across all art units
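The card's headline numbers can be cross-checked from the raw counts it reports. A minimal sketch in Python (every value is copied from the card above; nothing else is assumed):

```python
# Counts reported on the examiner card above.
granted = 111                # applications granted
resolved = 139               # resolved cases (granted + abandoned)
pending = 17                 # currently pending
total_applications = 156     # career total across all art units

# Resolved plus pending should account for every application.
assert resolved + pending == total_applications

# Career allow rate: granted as a share of resolved cases.
allow_rate = granted / resolved * 100
print(f"Career allow rate: {allow_rate:.1f}%")  # 79.9%, displayed as 80%

# The "+27.9% vs TC avg" badge implies a Tech Center baseline of roughly:
tc_avg = allow_rate - 27.9
print(f"Implied TC average: {tc_avg:.1f}%")  # about 52%
```

The arithmetic confirms that the displayed 80% is the rounded career allow rate, and that 139 resolved plus 17 pending cases sum to the 156-application career total.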

Statute-Specific Performance

§101: 7.7% (-32.3% vs TC avg)
§103: 51.4% (+11.4% vs TC avg)
§102: 20.1% (-19.9% vs TC avg)
§112: 17.0% (-23.0% vs TC avg)
Tech Center average is an estimate • Based on career data from 139 resolved cases
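One useful property of the per-statute figures: subtracting each "vs TC avg" delta from the examiner's rate recovers the same Tech Center baseline for every statute. A quick consistency check in Python (values copied from the chart above):

```python
# (examiner rate %, delta vs Tech Center average %) per statute, from the chart above.
statute_stats = {
    "101": (7.7, -32.3),
    "103": (51.4, +11.4),
    "102": (20.1, -19.9),
    "112": (17.0, -23.0),
}

# Implied TC average = examiner rate minus delta. All four statutes point at
# the same ~40.0% baseline, i.e. a single Tech Center average estimate.
implied_tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in statute_stats.items()}
print(implied_tc_avg)  # {'101': 40.0, '103': 40.0, '102': 40.0, '112': 40.0}
```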

Office Action

§DP
DETAILED ACTION

This is a non-final rejection in response to amendments filed 9/17/2024. Claims 1-20 are pending.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Double Patenting

Claims 1-20 of this application are patentably indistinct from claims 1-18 of U.S. Patent No. 12117308. Pursuant to 37 CFR 1.78(f), when two or more applications filed by the same applicant or assignee contain patentably indistinct claims, elimination of such claims from all but one application may be required in the absence of good and sufficient reason for their retention during pendency in more than one application. Applicant is required to either cancel the patentably indistinct claims from all but one application or maintain a clear line of demarcation between the applications. See MPEP § 822.

The non-statutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A non-statutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on non-statutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.

Claims 1-20 are rejected on the ground of non-statutory double patenting as being unpatentable over claims 1-18 of U.S. Patent No. 12117308 (herein referred to as ‘308). Although the claims at issue are not identical, they are not patentably distinct from each other. As shown in the chart below, claim 1 is rejected over ‘308’s claims 1, 3, and 5.

This application, claim 1: A method for providing a navigation directions preview, the method comprising: receiving, at one or more processors, a request for navigation directions from a starting location to a destination location;
‘308, claim 1: A method for providing a navigation directions preview, the method comprising: receiving, at one or more processors, a request for navigation directions from a starting location to a destination location;

This application, claim 1: generating, by the one or more processors, a set of navigation directions in response to the request, the set of navigation directions including a set of route segments for traversing from the starting location to the destination location;
‘308, claim 1: generating, by the one or more processors, a set of navigation directions in response to the request, the set of navigation directions including a set of route segments on a route for traversing from the starting location to the destination location;

This application, claim 1: filtering, by the one or more processors, the set of route segments into a subset of the route segments for previewing the route based on at least one of: whether there is a point of interest (POI) visible from each route segment in the set of route segments, whether there is a traffic signal associated with each route segment in the set of route segments, or a familiarity metric indicative of the user’s familiarity with each route segment in the set of route segments;
‘308, claim 1: filtering, by the one or more processors, the set of route segments into a subset of the route segments for previewing the route based on a complexity level for a maneuver for each route segment in the set of route segments, the complexity level determined based on an amount of time or distance between the maneuver and a previous maneuver;
‘308, claim 3: wherein the subset of the route segments is further filtered based on at least one of: a point of interest (POI) visible from the route segment, a traffic signal associated with the route segment
‘308, claim 5: wherein the subset of the route segments is filtered by assigning a familiarity metric to each route segment

This application, claim 1: and for the filtered subset of route segments, providing, by the one or more processors, previews of the subset of route segments to be displayed on a client device, the previews of the subset of route segments including panoramic street level imagery depicting the subset of route segments.
‘308, claim 1: and for the filtered subset of route segments, providing, by the one or more processors, previews of the subset of route segments to be displayed on a client device, the previews of the subset of route segments including panoramic street level imagery depicting the subset of route segments.

‘308’s claim 1 is the same as the present application’s claim 1 except that the set of route segments is filtered based on whether there is a point of interest, a traffic signal, or a familiarity metric instead of the complexity of the segment. However, ‘308’s dependent claim 3 cites filtering based on a point of interest or a traffic signal, ‘308’s dependent claim 5 cites filtering based on a familiarity metric, and it would be obvious to present notable or known landmarks that would enable the viewer to better understand the route. Thus, the only difference is the metric used for filtering the route segments. Further, claim 3 of the present application cites filtering the route based on the complexity of the maneuver, showing that both applications teach filtering the route segments by complexity, familiarity, traffic signals, or points of interest.

For the same reasons, the following chart shows which claims from ‘308 correspond to this application’s claims 2-8. Claim 2 cites assigning scores to the route segments based on the POI, traffic signal, and familiarity metric; as shown below, claim 2 from ‘308 teaches assigning a score to the route segments based on the complexity. As described above, it would be obvious to use the different metrics to filter the route segments to give the user more notable landmarks in the preview. Further, it would be obvious to apply the ranking system used for complexity to the other filtering metrics so that the filtering can be applied to all of the route segments.

This application, claim 2: wherein filtering the subset of the route segments includes: assigning, by the one or more processors, a score or ranking to each route segment based on whether there is a POI visible from the route segment, whether there is a traffic signal associated with the route segment, or the familiarity metric for the route segment; and selecting, by the one or more processors, the route segments having a score above a threshold score or ranked above a threshold ranking.
‘308, claim 2: a score or ranking to each route segment based on the complexity level of the route segment; and selecting, by the one or more processors, the route segments having a score above a threshold score or ranked above a threshold ranking.
‘308, claim 3: wherein the subset of the route segments is further filtered based on at least one of: a point of interest (POI) visible from the route segment, a traffic signal associated with the route segment
‘308, claim 5: wherein the subset of the route segments is filtered by assigning a familiarity metric to each route segment

This application, claim 3: wherein filtering the subset of the route segments further includes filtering the subset of the route segments based on at least one of: a complexity level for a maneuver for the route segment, a type of maneuver for the route segment, or an amount of time or distance between consecutive maneuvers for consecutive route segments.
‘308, claim 1: filtering, by the one or more processors, the set of route segments into a subset of the route segments for previewing the route based on a complexity level for a maneuver for each route segment in the set of route segments, the complexity level determined based on an amount of time or distance between the maneuver and a previous maneuver;

This application, claim 4: wherein filtering the subset of the route segments further includes filtering the subset of the route segments based on characteristics of a user requesting the navigation directions.
‘308, claim 4: wherein the subset of the route segments is further filtered based on characteristics of a user requesting the navigation directions.

This application, claim 5: wherein filtering the subset of the route segments further includes filtering the subset of the route segments by determining behavioral attributes of the user from previous navigation sessions, and assigning a compatibility metric to each route segment based on comparing characteristics of the route segment to the behavioral attributes of the user.
‘308, claim 6: wherein the subset of the route segments is filtered by determining behavioral attributes of the user from previous navigation sessions, and assigning a compatibility metric to each route segment based on comparing characteristics of the route segment to the behavioral attributes of the user.

This application, claim 6: wherein the subset of the route segments is filtered based on indications of previous interactions with previous navigation previews.
‘308, claim 7: wherein the subset of the route segments is filtered based on indications of previous interactions with previous navigation previews.

This application, claim 7: wherein the familiarity metric for each route segment is based on historical route data for the user indicative of the user’s familiarity with the route segment.
‘308, claim 5: wherein the subset of the route segments is filtered by assigning a familiarity metric to each route segment according to historical route data for the user indicative of the user's familiarity with the route segment.
As shown in the chart below, claim 8 is rejected over ‘308’s claim 8.

This application, claim 8: A computing device for providing a navigation directions preview, the computing device comprising: one or more processors; and a non-transitory computer-readable memory coupled to the one or more processors and storing instructions thereon that, when executed by the one or more processors, cause the computing device to: receive a request for navigation directions from a starting location to a destination location; generate a set of navigation directions in response to the request, the set of navigation directions including a set of route segments for traversing from the starting location to the destination location;
‘308, claim 8: A computing device for providing a navigation directions preview, the computing device comprising: one or more processors; and a non-transitory computer-readable memory coupled to the one or more processors and storing instructions thereon that, when executed by the one or more processors, cause the computing device to: receive a request for navigation directions from a starting location to a destination location; generate a set of navigation directions in response to the request, the set of navigation directions including a set of route segments on a route for traversing from the starting location to the destination location;

This application, claim 8: filter the set of route segments into a subset of the route segments for previewing the route based on at least one of: whether there is a point of interest (POI) visible from each route segment in the set of route segments, whether there is a traffic signal associated with each route segment in the set of route segments, or a familiarity metric indicative of the user’s familiarity with each route segment in the set of route segments;
‘308, claim 8: filter the set of route segments into a subset of the route segments for previewing the route based on a complexity level for a maneuver for each route segment in the set of route segments, the complexity level determined based on an amount of time or distance between the maneuver and a previous maneuver;
‘308, claim 10: wherein the subset of the route segments is further filtered based on at least one of: a point of interest (POI) visible from the route segment, a traffic signal associated with the route segment, a type of maneuver for the route segment, or an amount of time or distance between consecutive maneuvers for consecutive route segments.
‘308, claim 12: wherein the subset of the route segments is filtered selected by assigning a familiarity metric to each route segment

This application, claim 8: and for the filtered subset of route segments, provide a preview of the route segment to be displayed on a client device, previews of the subset of route segments to be displayed on a client device, the previews of the subset of route segments including panoramic street level imagery depicting the subset of route segments.
‘308, claim 8: and for the filtered subset of route segments, provide previews of the subset of route segments to be displayed on a client device, the previews of the subset of route segments including panoramic street level imagery depicting the subset of route segments.

‘308’s claim 8 is the same as the present application’s claim 8 except that the set of route segments is filtered based on whether there is a point of interest, a traffic signal, or a familiarity metric instead of the complexity of the segment. However, ‘308’s dependent claim 10 cites filtering based on a point of interest or a traffic signal, ‘308’s dependent claim 12 cites filtering based on a familiarity metric, and it would be obvious to present notable or known landmarks that would enable the viewer to better understand the route. Thus, the only difference is the metric used for filtering the route segments. Further, claim 10 of the present application cites filtering the route based on the complexity of the maneuver, showing that both applications teach filtering the route segments by complexity, familiarity, traffic signals, or points of interest.

For the same reasons, the following chart shows which claims from ‘308 correspond to this application’s claims 9-14. Claim 9 cites assigning scores to the route segments based on the POI, traffic signal, and familiarity metric; as shown below, claim 9 from ‘308 teaches assigning a score to the route segments based on the complexity. As described above, it would be obvious to use the different metrics to filter the route segments to give the user more notable landmarks in the preview. Further, it would be obvious to apply the ranking system used for complexity to the other filtering metrics so that the filtering can be applied to all of the route segments.

This application, claim 9: wherein to filter the subset of the route segments, the instructions cause the computing device to: assign whether there is a POI visible from the route segment, whether there is a traffic signal associated with the route segment, or the familiarity metric for the route segment; and select the route segments having a score above a threshold score or ranked above a threshold ranking.
‘308, claim 9: wherein to filter the subset of the route segments based on the complexity level for each route segment, the instructions cause the computing device to: assign a score or ranking to each route segment based on the complexity level of the route segment; and select the route segments having a score above a threshold score or ranked above a threshold ranking.
‘308, claim 10: wherein the subset of the route segments is further filtered based on at least one of: a point of interest (POI) visible from the route segment, a traffic signal associated with the route segment, a type of maneuver for the route segment, or an amount of time or distance between consecutive maneuvers for consecutive route segments.
‘308, claim 12: wherein the subset of the route segments is filtered selected by assigning a familiarity metric to each route segment

This application, claim 10: wherein the subset of the route segments are further filtered based on at least one of: a complexity level for a maneuver for the route segment, a type of maneuver for the route segment, or an amount of time or distance between consecutive maneuvers for consecutive route segments.
‘308, claim 8: filter the set of route segments into a subset of the route segments for previewing the route based on a complexity level for a maneuver for each route segment in the set of route segments, the complexity level determined based on an amount of time or distance between the maneuver and a previous maneuver

This application, claim 11: wherein the subset of the route segments are further filtered based on characteristics of a user requesting the navigation directions.
‘308, claim 11: wherein the subset of the route segments is further filtered based on characteristics of a user requesting the navigation directions.

This application, claim 12: wherein the subset of the route segments are further filtered by determining behavioral attributes of the user from previous navigation sessions, and assigning a compatibility metric to each route segment based on comparing characteristics of the route segment to the behavioral attributes of the user.
‘308, claim 6: wherein the subset of the route segments is filtered by determining behavioral attributes of the user from previous navigation sessions, and assigning a compatibility metric to each route segment based on comparing characteristics of the route segment to the behavioral attributes of the user.

This application, claim 13: wherein the subset of the route segments are further filtered based on indications of previous interactions with previous navigation previews.
‘308, claim 7: wherein the subset of the route segments is filtered based on indications of previous interactions with previous navigation previews.
This application, claim 14: wherein the familiarity metric for each route segment is based on historical route data for the user indicative of the user’s familiarity with the route segment.
‘308, claim 12: wherein the subset of the route segments is filtered by assigning a familiarity metric to each route segment according to historical route data for the user indicative of the user's familiarity with the route segment.

As shown in the chart below, claim 15 is rejected over ‘308’s claim 13.

This application, claim 15: A method for providing a navigation directions preview, the method comprising: receiving, at one or more processors, a request for navigation directions from a starting location to a destination location; obtaining, by the one or more processors, a set of navigation directions in response to the request, the set of navigation directions including a set of route segments for traversing from the starting location to the destination location;
‘308, claim 13: A method for providing a navigation directions preview, the method comprising: receiving, at one or more processors, a request for navigation directions from a starting location to a destination location; obtaining, by the one or more processors, a set of navigation directions in response to the request, the set of navigation directions including a set of route segments on a route for traversing from the starting location to the destination location;

This application, claim 15: providing, by the one or more processors, previews of a subset of the route segments including displaying video frames of panoramic street level imagery depicting an area leading up to, at, and subsequent to a waypoint location for a maneuver associated with each route segment,
‘308, claim 13: providing, by the one or more processors, previews of a subset of the route segments including displaying video frames of panoramic street level imagery depicting an area leading up to, at, and subsequent to a waypoint location for a maneuver associated with each route segment,

This application, claim 15: wherein the subset of the route segments for previewing the route is filtered based on at least one of: whether there is a point of interest (POI) visible from each route segment in the set of route segments, whether there is a traffic signal associated with each route segment in the set of route segments, or a familiarity metric indicative of the user’s familiarity with each route segment in the set of route segments.
‘308, claim 13: wherein the subset of the route segments for previewing the route is filtered based on a complexity level for a maneuver for each route segment in the set of route segments, the complexity level determined based on an amount of time or distance between the maneuver and a previous maneuver.
‘308, claim 10: wherein the subset of the route segments is further filtered based on at least one of: a point of interest (POI) visible from the route segment, a traffic signal associated with the route segment, a type of maneuver for the route segment, or an amount of time or distance between consecutive maneuvers for consecutive route segments.
‘308, claim 12: wherein the subset of the route segments is filtered selected by assigning a familiarity metric to each route segment

‘308’s claim 13 is the same as the present application’s claim 15 except that the set of route segments is filtered based on whether there is a point of interest, a traffic signal, or a familiarity metric instead of the complexity of the segment. However, ‘308’s dependent claim 10 cites filtering based on a point of interest or a traffic signal, ‘308’s dependent claim 12 cites filtering based on a familiarity metric, and it would be obvious to present notable or known landmarks that would enable the viewer to better understand the route. Thus, the only difference is the metric used for filtering the route segments. Further, claim 10 of the present application cites filtering the route based on the complexity of the maneuver, showing that both applications teach filtering the route segments by complexity, familiarity, traffic signals, or points of interest. For the same reasons, the following chart shows which claims from ‘308 correspond to this application’s claims 16-20.

This application, claim 16: wherein providing a preview of a route segment in the subset includes providing audio describing the maneuver for the route segment, wherein the audio during the preview is different from an audio navigation instruction provided during navigation.
‘308, claim 14: wherein providing a preview of a route segment in the subset includes providing audio describing the maneuver for the route segment, wherein the audio during the preview is different from an audio navigation instruction provided during navigation.

This application, claim 17: wherein the audio during the preview includes additional details not included in the audio navigation instruction provided during navigation.
‘308, claim 15: wherein the audio during the preview includes additional details not included in the audio navigation instruction provided during navigation.

This application, claim 18: obtaining, by the one or more processors, an expected date or time for traversing the set of navigation directions; and adjusting, by the one or more processors, style parameters for the panoramic street level imagery in accordance with the expected date or time for traversing the set of navigation directions.
‘308, claim 16: obtaining, by the one or more processors, an expected date or time for traversing the set of navigation directions; and adjusting, by the one or more processors, style parameters for the panoramic street level imagery in accordance with the expected date or time for traversing the set of navigation directions.

This application, claim 19: wherein adjusting style parameters for the panoramic street level imagery includes: obtaining, by the one or more processors, estimated weather conditions for the expected date or time for traversing the set of navigation directions; and adjusting, by the one or more processors, the panoramic street level imagery in accordance with the estimated weather conditions.
‘308, claim 17: wherein adjusting style parameters for the panoramic street level imagery includes: obtaining, by the one or more processors, estimated weather conditions for the expected date or time for traversing the set of navigation directions; and adjusting, by the one or more processors, the panoramic street level imagery in accordance with the estimated weather conditions.

This application, claim 20: wherein adjusting style parameters for the panoramic street level imagery includes: determining, by the one or more processors, that the expected date or time for traversing the set of navigation directions is at night; and providing, by the one or more processors, a night view of the panoramic street level imagery.
‘308, claim 18: wherein adjusting style parameters for the panoramic street level imagery includes: determining, by the one or more processors, that the expected date or time for traversing the set of navigation directions is at night; and providing, by the one or more processors, a night view of the panoramic street level imagery.
Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
- generating a preview of a route based on navigational complexity (McKenzie, US 20150066368, IDS);
- categorizing complexity level based on an amount of distance between maneuvers (Vandanapu, US 20160202082, IDS);
- determining a user’s familiarity with a route including prior behavioral attributes of the user (Grochocki, US 20180094943, IDS);
- computing a familiarity metric and comparing this metric to a threshold (Arastafar, US 20120158283, IDS);
- determining user familiarity of a portion of a route based on the user’s previous interactions with the navigation device (Byrne, US 20090281726, IDS);
- obtaining and basing the display characteristics on an estimated time and date of the travel (Rosenberg, US 20070150188, IDS);
- providing characteristics to the user so that a route may be selected (Sheridan, US 20180073885, IDS);
- filtering video segments irrelevant to the path of interest (Bostick, US 20170092326); and
- removing directions from a route for familiar portions to the user (Weir, US 20150100231).

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANIELLE M JACKSON whose telephone number is (303) 297-4364. The examiner can normally be reached Monday-Friday, 7:00-4:30 MT. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abby Lin, can be reached at (571) 270-3976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center.
Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/D.M.J./
Examiner, Art Unit 3657

/ABBY LIN/
Supervisory Patent Examiner, Art Unit 3657

Prosecution Timeline

Sep 17, 2024
Application Filed
Jan 08, 2026
Non-Final Rejection — §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12576519
LEARNING TYPE-GENERALIZED SKILLS FOR SYMBOLIC PLANNING FOR AUTONOMOUS DEVICES
Granted Mar 17, 2026 • 2y 5m to grant
Patent 12570004
System for Companion Robot with Three-Dimensional (3D) Display and Method Thereof
Granted Mar 10, 2026 • 2y 5m to grant
Patent 12564958
CONTROLLING A MOBILE ROBOT
Granted Mar 03, 2026 • 2y 5m to grant
Patent 12552374
SYSTEMS AND METHODS FOR OPERATING ONE OR MORE SELF-DRIVING VEHICLES
Granted Feb 17, 2026 • 2y 5m to grant
Patent 12515345
METHOD OF SYNTHESISING TRAINING DATASETS FOR AUTONOMOUS ROBOTIC CONTROL
Granted Jan 06, 2026 • 2y 5m to grant
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 80%
With Interview: 99% (+28.5%)
Median Time to Grant: 2y 8m
PTA Risk: Low
Based on 139 resolved cases by this examiner. Grant probability derived from career allow rate.
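The relationship between the 80% base figure, the +28.5% interview lift, and the 99% with-interview figure is not spelled out on the card; one plausible reading is a multiplicative lift capped below 100%. The sketch below is an assumed reconstruction of that arithmetic, not the dashboard's documented formula (the 99% cap in particular is an assumption):

```python
# Hypothetical reconstruction of the with-interview projection. The 99% cap is
# an assumption (the dashboard never displays a 100% grant probability).
base_probability = 80.0   # grant probability, from the career allow rate
interview_lift = 0.285    # +28.5% relative lift for cases with an interview

with_interview = min(base_probability * (1 + interview_lift), 99.0)
print(f"With interview: {with_interview:.0f}%")  # 99%, matching the card
```

80 x 1.285 is 102.8, so under this reading the displayed 99% is the cap rather than the raw product.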
