Prosecution Insights
Last updated: April 19, 2026
Application No. 18/766,123

UNCONTROLLED INTERSECTION DETECTION AND WARNING SYSTEM

Non-Final OA (§101, §103)

Filed: Jul 08, 2024
Examiner: NGUYEN, STEVEN VU
Art Unit: 3668
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Uber Technologies, Inc.
OA Round: 1 (Non-Final)

Grant Probability: 78% (Favorable)
OA Rounds: 1-2
To Grant: 2y 9m
With Interview: 86%

Examiner Intelligence

Career Allow Rate: 78% (125 granted / 160 resolved; +26.1% vs TC avg; above average)
Interview Lift: +7.9% (moderate, ~+8%) for resolved cases with an interview vs without
Avg Prosecution: 2y 9m (typical timeline); 25 applications currently pending
Total Applications: 185 across all art units (career history)
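The headline figures above reduce to simple ratios of the counts shown. The sketch below recomputes them as a sanity check; the numbers are taken from this page, and the variable names are illustrative.

```python
# Recompute the examiner's headline stats from the counts on this page.
granted, resolved = 125, 160          # career totals shown above

career_allow_rate = granted / resolved          # 0.78125, shown as 78%
with_interview = 0.86                           # allow rate with an interview
interview_lift = with_interview - career_allow_rate

print(f"Career allow rate: {career_allow_rate:.1%}")  # 78.1%
print(f"Interview lift: {interview_lift:+.1%}")       # +7.9%
```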

Statute-Specific Performance

§101: 14.3% (-25.7% vs TC avg)
§103: 44.6% (+4.6% vs TC avg)
§102: 17.3% (-22.7% vs TC avg)
§112: 18.9% (-21.1% vs TC avg)

Deltas are measured against a Tech Center average estimate (shown as a black line in the original chart) and are based on career data from 160 resolved cases.
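All four deltas above are consistent with a single Tech Center average estimate of about 40%. The sketch below checks that arithmetic; the 40% baseline is inferred from the reported deltas, not stated on this page.

```python
# Per-statute rates from this page, and the Tech Center average they are
# compared against. The 0.40 baseline is inferred: each reported delta
# equals the statute rate minus ~40%.
rates = {"§101": 0.143, "§103": 0.446, "§102": 0.173, "§112": 0.189}
tc_avg_estimate = 0.40  # inferred baseline, not stated on the page

for statute, rate in rates.items():
    print(f"{statute}: {rate:.1%} ({rate - tc_avg_estimate:+.1%} vs TC avg)")
```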

Office Action

Rejections: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

The Examiner acknowledges that the current application is a continuation (CON) of the parent application 17660336, which has an effective filing date of 04/22/2022.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim 20 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claim does not fall within at least one of the four categories of patent-eligible subject matter because it recites a "machine-storage medium" that is not limited to a tangible embodiment. In particular, the specification, par. [0078], discloses "a machine-storage medium (e.g., a machine-storage device, a non-transitory machine-storage medium, a computer-storage medium, or any suitable combination thereof)". This implies that the machine-storage medium could be either a transitory or a non-transitory medium.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2, 5, 11-14, and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Katsuyama, Yoshihiro (Patent No. US 10807610 B1; hereinafter Katsuyama) in view of Van et al. (Publication No. US 20130345959 A1; hereinafter Van).

Regarding claim 1, Katsuyama teaches a method comprising:

in response to a navigation request from a user device, generating, by a network system, a navigation route using integrated map data, the integrated map data including indications of uncontrolled intersections that are intersections where at least some of the traffic is not required to stop ([Col. 4, lines 33-46], "The traffic-information-collection interface 11 is configured to receive real-time traffic signal status information 2 for signalized intersections along a roadway of vehicle travel. In some embodiments, the traffic-information-collection interface 11 includes an infrastructure-to-vehicle receiver 17 (such as, for example, an infrared beacon receiver) to receive the real-time traffic signal status information 2 from the roadside traffic signal controller or from the traffic center 4. Although it is not shown in the drawings, the information can be obtained by a network interface via, for example, cloud-based networking, or through other wireless data signal transmission techniques. The traffic-information collection interface 11 comprises an input port to obtain the real-time traffic signal status information 2."; [Col. 4, lines 54-66], "The navigation display interface 14 is configured to send vehicle navigation information to a display 15 viewable by the driver. The electronic processor 13 writes to the memory device 16 and provides output information for the display via the navigation display interface 14, or the navigation display interface 14 may comprise memory, and the electronic processor 13 can write to the memory within the navigation display interface 14. The memory device 16 stores map information and instructions executable by an electronic processor 13. Information for signalized intersections may be stored in the memory device 16, which includes traffic lane information such as a number of the lanes and assignment of directions."; [Col. 5, lines 42-51], "As shown in FIG. 5, the in-vehicle system 10 generates a display of traffic information comprising the visual sign indicator 8 for a non-signalized intersection in a vicinity of the vehicle 1. The visual sign indicator is indicative of traffic signs, such as 'STOP', 'YIELD', 'NO TURN ON RED', 'DO NOT ENTER', and so forth."). This is interpreted as follows: the navigation system generates navigation routes by correlating map information stored in its memory with real-time traffic information from the roadside traffic signal controller or the traffic center via a network. The navigation display interface presents road information, including signalized and non-signalized intersections. The term "non-signalized intersection" refers to uncontrolled intersections, and a non-signalized intersection with the sign "YIELD" means that the traffic on the other road of the intersection does not stop. Note that it should be understood that the navigation system displays route information only upon the driver's or user's request.

causing, by the network system, presentation of the navigation route in a user interface on the user device ([Col. 4, lines 54-66], quoted above);

monitoring, by the network system, navigation of a vehicle associated with the user device along the navigation route ([Col. 6, lines 9-17], "The vehicle location interface 12 receives a geospatial location of the vehicle 1 from a satellite linked positioning system, and the in-vehicle system 10 identifies a roadway of vehicle travel correlated using the geospatial location of the vehicle and map information in the memory device 16. The navigation display interface 14 sends vehicle navigation information to a display 15 viewable by the driver to show a vehicle position mark 9 on a display generated from map data including the roadway of vehicle travel." This is interpreted as the navigation system tracking the vehicle on the road using the geospatial location of the vehicle and map information);

detecting, by the network system, that the vehicle associated with the user device is approaching an uncontrolled intersection based on the monitoring and the integrated map data ([Col. 6, lines 18-24 and Col. 6, lines 59-63], "As the vehicle 1 approaches a signalized intersection, the traffic-information-collection interface 11 receives real-time traffic signal status information 2 for the signalized intersection and for signalized intersections located along the roadway of vehicle travel. The in-vehicle system 10 identifies the signalized intersections by correlating with the map information… the in-vehicle system 10 can generate a display of traffic information comprising the visual sign indicator 8 for non-signalized intersections in a vicinity of the vehicle 1."; [Col. 5, lines 42-51], quoted above. This is interpreted as the navigation system tracking the location of the vehicle: when the vehicle approaches a signalized or non-signalized intersection, the traffic information related to that intersection is displayed on the vehicle's display interface); and

in response to the detecting, causing presentation of a warning indicating the uncontrolled intersection ([Col. 5, lines 42-51], quoted above).

Katsuyama teaches displaying whether the vehicle is approaching an uncontrolled intersection as described above, but does not explicitly disclose determining whether a critical navigation maneuver will occur within a threshold distance; and, in response to the detecting and based on a determination that the critical navigation maneuver will occur within the threshold distance, causing presentation of a warning indicating the uncontrolled intersection with a lower priority than navigation instructions associated with the critical navigation maneuver.

However, Van teaches determining whether a critical navigation maneuver will occur within a threshold distance ([Par. 0254], "the mapping application may display a navigation sign well before the maneuver described by the navigation sign will be performed. For instance, if a user enters a freeway, and the next maneuver involves a freeway exit in 15 miles, the application may display a navigation sign indicating the upcoming freeway exit well before the user needs to begin preparing to actually exit the freeway. When it comes time to alert the user that the juncture at which to perform the maneuver is approaching, different embodiments use different techniques. Some embodiments include audio alerts, with the user device providing voice navigation to indicate that the juncture is approaching"), and, in response to the detecting and based on a determination that the critical navigation maneuver will occur within the threshold distance, causing presentation of a warning indicating the uncontrolled intersection with a lower priority than navigation instructions associated with the critical navigation maneuver ([Par. 0394], "FIG. 52 illustrates an example of the synthesis of different instructions for a particular maneuver at a juncture according to some embodiments. FIGS. 53 and 54 then illustrate different scenarios in which these different instructions for the maneuver are used. As shown, the mapping application uses received route instructions and juncture data to identify specific aspects of maneuver instructions. The table 5205 conceptually illustrates how various strings might be generated for a juncture. Specifically, the maneuver instructions include an 'At' field, a 'Turn' field, an 'Onto' field, a 'Towards' field, and a 'For' field. For each juncture, the application initially populates these string fields, in order to synthesize the instructions from the fields. [0395] In some embodiments, the 'At' field is based on map information that includes traffic light and stop sign information, etc. For the examples shown in FIG. 52, the first juncture takes place 'at the end of the road', while the second juncture takes place 'at the next light'. The 'Turn' field describes the maneuver to be made; examples of this field include 'turn right' (the maneuver performed at the first juncture), 'exit freeway', 'keep left', 'slight left turn', 'U-turn', or other maneuvers. The route directions that include a maneuver description may be mapped to different possible strings for the 'Turn' field.").

With respect to "lower priority," Van teaches prioritizing maneuver instructions when a critical navigation maneuver is imminent. As described in par. [0394]-[0395], upon determining that a maneuver location is approaching, Van causes presentation of maneuver instructions as the primary navigational output to direct the user's immediate driving action. Van also identifies the presence of a stop sign at the intersection, which constitutes an uncontrolled intersection under the claim. However, this intersection information is presented only as contextual information supporting execution of the maneuver and does not supersede the maneuver instruction itself. Under the broadest reasonable interpretation, presenting maneuver instructions as the primary focus inherently renders the intersection warning lower in priority. It would have been obvious to a person of ordinary skill in the art to emphasize maneuver instructions over general intersection indications when a maneuver is imminent in order to avoid driver distraction and ensure safe execution of the maneuver.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Katsuyama to incorporate the teaching of Van. The modification would have been obvious because presenting instructions for an upcoming critical maneuver within a threshold distance notifies the user that the maneuver is imminent, thereby allowing the user to prepare in advance to safely execute the maneuver.

Regarding claim 2, the combination of Katsuyama and Van teaches the method of claim 1. Van further teaches wherein the causing presentation of the warning with the lower priority comprises causing presentation of the warning in a secondary position compared to a position of the navigation instructions associated with the critical navigation maneuver on the user interface ([Par. 0394]-[0395], quoted above). This mapping is understood as follows: when a critical navigation maneuver is approaching, navigation instructions associated with the maneuver are presented as the primary focus of the user interface to direct the user's immediate action. Van further identifies the presence of a stop sign at the intersection, which constitutes an uncontrolled intersection under the claim. However, this information is presented only as contextual information supporting execution of the maneuver. Under the broadest reasonable interpretation, presenting maneuver instructions as the primary focus inherently causes any warning indicating the uncontrolled intersection to be displayed in a secondary position relative to the maneuver instructions. Accordingly, Van teaches or renders obvious causing presentation of the warning in a secondary position compared to the navigation instructions associated with the critical navigation maneuver.

Regarding claim 5, the combination of Katsuyama and Van teaches the method of claim 1.
Katsuyama further teaches, in response to the detecting and based on a determination that the critical navigation maneuver is not occurring within the threshold distance, causing presentation of the warning in a primary position in a navigation instruction portion of the user interface ([Col. 5, lines 42-51], quoted above).

Regarding claim 11, the combination of Katsuyama and Van teaches the method of claim 1. Katsuyama further teaches wherein the causing presentation of the warning comprises presenting one or more icons indicating the uncontrolled intersection in a map guidance portion of the user interface that is below a navigation instruction portion of the user interface ([Col. 5, lines 42-51], quoted above).

Regarding claim 12, the combination of Katsuyama and Van teaches the method of claim 1. Van further teaches wherein the causing presentation of the warning further comprises providing a verbal or audio warning ([Par. 0254], "Some embodiments include audio alerts, with the user device providing voice navigation to indicate that the juncture is approaching.").

Claim 13 recites a system with substantially similar scope as claim 1 and is thus rejected on the same basis as claim 1 above. Katsuyama further teaches one or more hardware processors (Fig. 3, "processor 13") and memory storing instructions that, when executed by the one or more hardware processors, cause the one or more hardware processors to perform operations (Fig. 3, "memory device 16" and "processor 13").

Claims 14 and 17-19 recite the system with substantially similar scope as claims 2, 5, and 11-12, respectively, and are thus rejected on the same basis as claims 2, 5, and 11-12, respectively, above.

Claim 20 recites a machine-storage medium with substantially similar scope as claim 1 and is thus rejected on the same basis as claim 1 above. Katsuyama further teaches a machine-storage medium storing instructions that, when executed by one or more hardware processors of a machine, cause the machine to perform operations (Fig. 3, "memory device 16" and "processor 13").

Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over the combination of Katsuyama and Van in further view of Kadous et al. (Patent No. US 8694241 B1; hereinafter Kadous).

Regarding claim 6, the combination of Katsuyama and Van teaches the method of claim 1. Katsuyama further teaches determining uncontrolled intersections by identifying intersections where not all roadways leading into the intersections have a stop sign or stop light ([Col. 5, lines 42-51], quoted above, wherein a non-signalized intersection with the sign "YIELD" means that the traffic on the other road of the intersection does not stop) and generating the integrated map data by integrating data regarding the uncontrolled intersections with existing map data ([Col. 6, lines 18-24 and Col. 6, lines 59-63] and [Col. 5, lines 42-51], quoted above).

Katsuyama teaches displaying the integrated map including uncontrolled intersections that could be marked "STOP", "YIELD", or "DO NOT ENTER" as described above, but does not explicitly disclose validating the determined uncontrolled intersections by causing presentation of a further user interface to each of a plurality of users of the network system asking the plurality of users to verify whether there is a stop sign or a stop light at a particular intersection.

However, Kadous teaches validating the determined uncontrolled intersections by causing presentation of a further user interface to each of a plurality of users of the network system asking the plurality of users to verify whether there is a stop sign or a stop light at a particular intersection ([Col. 11, lines 7-16], "If so configured, the data module 124 can further initially classify the stop signal as either a stop sign or stop light, based on analysis of the periodicity and distribution of high quality samples.
The user can then review the flagged stop signal (and initial type classification) using satellite or street level images of the intersection, to confirm its presence (as opposed, for example, to the presence of a road hazard), and to determine (or confirm) whether the signal is a stop sign or stop light. In other embodiments, the map editor module 126 receives instructions from a user of the map to indicate the possible presence of a stop signal and its type." Note that it should be understood that other users within the network or using the same navigation system would also be prompted to verify the presence of the stop sign or stop light.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of Katsuyama and Van to incorporate the teaching of Kadous. The modification would have been evident, as it involves prompting users to confirm the presence of a stop sign or stop light. This enables the system to update the map data accurately, providing improved navigation accuracy.

Claims 8-9 are rejected under 35 U.S.C. 103 as being unpatentable over the combination of Katsuyama, Van, and Kadous in further view of Kumar, Vinay (Publication No. US 20220237956 A1; hereinafter Kumar).

Regarding claim 8, the combination of Katsuyama, Van, and Kadous teaches the method of claim 6. The combination teaches determining the uncontrolled intersections as described in claim 6 above, but does not explicitly disclose wherein the determining the uncontrolled intersections comprises: accessing trip data from previous completed navigations using navigation routes generated by the network system; detecting trip characteristics from the trip data; and using heuristics and traffic patterns, identifying, for each intersection, whether each intersection has a stop sign based on the trip characteristics.
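For orientation, the "percentage of trips with detected stops" kind of heuristic recited in this limitation reduces to a simple threshold test, as in the sketch below. This is an illustrative reconstruction only; the function name and default threshold are assumptions, not code from the record (Kumar's cited par. [0027] mentions example thresholds such as 99% and 75%).

```python
# Illustrative sketch of a stop-location heuristic of the kind at issue:
# flag an intersection as a stop location when the share of trips with a
# detected stop meets a threshold. All names and the default threshold
# are assumptions for illustration, not from the cited references.

def is_stop_location(trips_with_stop: int, total_trips: int,
                     threshold: float = 0.75) -> bool:
    """Return True when enough observed trips stopped at this intersection."""
    if total_trips == 0:
        return False  # no trip data: cannot infer a stop sign or light
    return trips_with_stop / total_trips >= threshold

# 99 of 100 trips stopped: inferred stop sign/light at this intersection.
print(is_stop_location(99, 100))   # True
print(is_stop_location(20, 100))   # False
```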
However, Kumar teaches accessing trip data from previous completed navigations using navigation routes generated by the network system ([Par. 0005], "The method may include: (i) aggregating historical location data and historical telematics data from a plurality of users, wherein the historical location data and historical telematics data are gathered by one or more devices associated with each user of the plurality of users, (ii) generating mapping data based at least in part upon the historical location data and the historical telematics data". This is interpreted as stop locations associated with completed trips from one or more users being aggregated and stored as mapping data.); detecting trip characteristics from the trip data ([Par. 0005], "identifying one or more stop locations based at least in part upon the mapping data, (iv) storing, in the memory device, the identified stop locations, (v) receiving, from a user computing device associated with the driver of the vehicle, current location data and current telematics data after each trip taken by the driver,"); and using heuristics and traffic patterns, identifying, for each intersection, whether each intersection has a stop sign based on the trip characteristics ([Par. 0016], "Collected GPS data and telematics data may be analyzed to reveal locations of stop signs, stop lights, and intersections. Further analysis may reveal certain driver behaviors, such as rolling stops, and provide real-time feedback to a driver. Some embodiments of the present disclosure may use, for example, GPS location data, accelerometer data and machine learning techniques to map stoplight and stop sign locations."; [Par. 0017], "Example stopping locations may be stop signs, red lights, or the like.").

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of Katsuyama, Van, and Kadous to incorporate the teaching of Kumar. The modification would be considered obvious, as it involves utilizing historical data to identify the location of a stop sign at an intersection. This enables a more accurate prediction of stop signs along the route, utilizing data from previous trips.

Regarding claim 9, the combination of Katsuyama, Van, Kadous, and Kumar teaches the method of claim 8. Kumar further teaches wherein the identifying using heuristics and traffic patterns is based on one or more of: a stop duration distribution for each intersection; a speed distribution for each intersection; or a percentage of trips with detected stops for each intersection ([Par. 0027], "For example, stopping events may occur at the same location, such as an intersection, by a significant number of drivers. In this scenario, it may be reasonable to conclude that the intersection has a stop sign. Alternatively, or additionally, a stopping location may be identified based at least in part upon a percentage of drivers that come to a stop at a certain location. The percentage drivers may be a certain percentage above a threshold. For example, if 99% of drivers come to a complete stop, or even a near-stop, at a certain intersection, the intersection may be identified as being a 'stop location' (e.g., having a stop sign or a stoplight or some other stop indication)." The number of drivers detected stopping at the same location indicates the frequency of trips in which the stop sign location was identified; this at least maps to "a percentage of trips with detected stops for each intersection.").

Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over the combination of Katsuyama and Van in further view of Kumar.

Regarding claim 10, the combination of Katsuyama and Van teaches the method of claim 1. Katsuyama further teaches generating the integrated map data by integrating data regarding the uncontrolled intersections with existing map data ([Col. 6, lines 18-24 and Col. 6, lines 59-63] and [Col. 5, lines 42-51], quoted above).

The combination of Katsuyama and Van teaches determining the uncontrolled intersection and generating the integrated map for display as described above, but does not explicitly disclose accessing trip data from previous completed navigations using navigation routes generated by the network system; detecting trip features from the trip data; and applying the trip features to a machine learning model to identify stop sign locations, the uncontrolled intersections being identified from the identified stop sign locations.

However, Kumar teaches accessing trip data from previous completed navigations using navigation routes generated by the network system ([Par. 0005], quoted above); detecting trip features from the trip data ([Par. 0005], quoted above); and applying the trip features to a machine learning model to identify stop sign locations, the uncontrolled intersections being identified from the identified stop sign locations ([Par. 0017], "Example stopping locations may be stop signs, red lights, or the like."; [Par. 0027], "One or more types of data analysis, such as machine learning, or other techniques, may be used to identify instances of stopping events at certain locations. For example, stopping events may occur at the same location, such as an intersection, by a significant number of drivers. In this scenario, it may be reasonable to conclude that the intersection has a stop sign. Alternatively, or additionally, a stopping location may be identified based at least in part upon a percentage of drivers that come to a stop at a certain location. The percentage drivers may be a certain percentage above a threshold. For example, if 99% of drivers come to a complete stop, or even a near-stop, at a certain intersection, the intersection may be identified as being a 'stop location' (e.g., having a stop sign or a stoplight or some other stop indication). Other percentage thresholds may be utilized, such as 75%.").

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of Katsuyama and Van to incorporate the teaching of Kumar. The modification would be considered obvious, as it involves utilizing historical data to identify the location of a stop sign at an intersection. This enables a more accurate prediction of stop signs along the route, utilizing data from previous trips.

Allowable Subject Matter

Claims 3-4, 7, and 15-16 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to STEVEN V NGUYEN, whose telephone number is (571) 272-7320. The examiner can normally be reached Monday-Friday, 11am-7pm EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, James J Lee, can be reached at (571) 270-5965. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users.
To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/STEVEN VU NGUYEN/
Examiner, Art Unit 3668
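The percentage-threshold heuristic that the examiner cites from Kumar's paragraph [0027] — counting how many drivers stop at each intersection and flagging intersections where the stopping fraction exceeds a threshold such as 75% — can be sketched in a few lines. This is a minimal illustration of that heuristic only; the function name, the `(intersection_id, stopped)` data shape, and the sample events are assumptions for the sketch, not details from Kumar's disclosure or the claims.

```python
from collections import defaultdict

# Hypothetical sketch of Kumar's percentage-threshold heuristic:
# an intersection is flagged as a "stop location" when the fraction
# of drivers who stop there meets a threshold (Kumar's examples
# include 75% and 99%). Data shape and names are illustrative.

STOP_THRESHOLD = 0.75

def identify_stop_locations(trip_events, threshold=STOP_THRESHOLD):
    """trip_events: iterable of (intersection_id, stopped) pairs
    aggregated from historical location/telematics data across drivers."""
    totals = defaultdict(int)   # observations per intersection
    stops = defaultdict(int)    # observed stopping events per intersection
    for intersection_id, stopped in trip_events:
        totals[intersection_id] += 1
        if stopped:
            stops[intersection_id] += 1
    # Flag intersections whose stopping fraction meets the threshold.
    return {iid for iid in totals if stops[iid] / totals[iid] >= threshold}

# Example: 9 of 10 drivers stop at intersection "A" (flagged at 75%),
# only 2 of 10 stop at "B" (not flagged).
events = ([("A", True)] * 9 + [("A", False)] +
          [("B", True)] * 2 + [("B", False)] * 8)
print(identify_stop_locations(events))  # → {'A'}
```

At a 99% threshold, the same data would flag nothing, since "A" only reaches a 90% stopping rate; this is the sensitivity the cited passage alludes to when it says other percentage thresholds may be utilized.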

Prosecution Timeline

Jul 08, 2024
Application Filed
Dec 27, 2025
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12589883
FLIGHT RECORDER SYSTEM AND METHOD
2y 5m to grant
Granted Mar 31, 2026
Patent 12567291
SETTING A MODE OF A VEHICLE
2y 5m to grant
Granted Mar 03, 2026
Patent 12565100
IMMERSIVE VEHICLE COMPONENT CONFIGURATION AND OPERATION USING OPERATIONAL PROFILES
2y 5m to grant
Granted Mar 03, 2026
Patent 12565118
METHOD OF PROVIDING INFORMATION WHEN AN ELECTRIC VEHICLE IS USED AS A BATTERY PACK
2y 5m to grant
Granted Mar 03, 2026
Patent 12534092
AUTOMATED ADJUSTMENT OF VEHICLE DIRECTION BASED ON ENVIRONMENT ANALYSIS
2y 5m to grant
Granted Jan 27, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
78%
Grant Probability
86%
With Interview (+7.9%)
2y 9m
Median Time to Grant
Low
PTA Risk
Based on 160 resolved cases by this examiner. Grant probability derived from career allow rate.
