Prosecution Insights
Last updated: April 19, 2026
Application No. 18/475,056

USING JUNCTION INFORMATION FOR MACHINE CONTROL IN AUTONOMOUS SYSTEMS AND APPLICATIONS

Status: Non-Final OA (§103)
Filed: Sep 26, 2023
Examiner: GREINER, TRISTAN J
Art Unit: 3656
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Nvidia Corporation
OA Round: 3 (Non-Final)
Grant Probability: 78% (Favorable)
Projected OA Rounds: 3-4
Projected Time to Grant: 2y 9m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 78% (129 granted / 166 resolved; +25.7% vs TC avg, above average)
Interview Lift: +21.4% (allowance rate among resolved cases with vs. without an interview)
Avg Prosecution: 2y 9m (12 applications currently pending)
Total Applications: 178 (across all art units)

Statute-Specific Performance

§101: 13.7% (-26.3% vs TC avg)
§103: 53.0% (+13.0% vs TC avg)
§102: 12.8% (-27.2% vs TC avg)
§112: 17.3% (-22.7% vs TC avg)

Tech Center averages are estimates • Based on career data from 166 resolved cases
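As a quick sanity check, the per-statute figures above are internally consistent: subtracting each "vs TC avg" delta from its rate implies the same Tech Center baseline in every row, and the career figures reproduce the headline allow rate. A minimal Python sketch (the 40.0% baseline is derived here, not stated by the tool):

```python
# Implied Tech Center baseline from each statute's rate and its delta.
# Figures are copied from the dashboard above; the common 40.0% baseline
# is derived, not displayed, so treat it as an estimate.
stats = {
    "101": (13.7, -26.3),
    "103": (53.0, +13.0),
    "102": (12.8, -27.2),
    "112": (17.3, -22.7),
}

implied_baselines = {s: round(rate - delta, 1) for s, (rate, delta) in stats.items()}
print(implied_baselines)  # -> {'101': 40.0, '103': 40.0, '102': 40.0, '112': 40.0}

# Career allow rate cross-check: 129 granted of 166 resolved.
career_rate_pct = round(129 / 166 * 100, 1)
print(career_rate_pct)  # -> 77.7, consistent with the displayed 78%
```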

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

Applicant’s amendments dated 02/13/2026 have been received and filed.

Response to Arguments

Applicant's arguments have been fully considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. The rejection now relies on Hartnett et al. (US Pub 2022/0105959 A1), which stores contention rules and yield behaviors, as well as traffic signals and map data, for access during navigation.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 8-12 and 14 are rejected under 35 U.S.C.
103 as being unpatentable over Dax et al. (US Pub 2021/0122373 A1), hereafter known as Dax, in light of Hartnett et al. (US Pub 2022/0105959 A1), hereafter known as Hartnett.

For Claim 8, Dax teaches A system comprising: one or more processing units to perform operations comprising: ([0063] “The vehicle computing device(s) 504 can include one or more processors 516 and memory 518 communicatively coupled with the one or more processors 516.”)

obtaining, based at least on semantic information corresponding to a junction area, contention data associated with a junction area that corresponds to one or more potential yield scenarios associated with navigation of a machine; ([0012] “In some examples, sensor data captured by sensors of the autonomous vehicle may be used to determine a collision zone associated with the autonomous vehicle and one or more objects detected in the environment. A collision zone corresponds to an area of the environment where a collision between the autonomous vehicle and the object may occur, based on current trajectories (and/or variances, accelerations, decelerations, etc. associated with the current trajectories) of the autonomous vehicle and the object.” [0013] “For instance, the autonomous vehicle can pass through the environment safely if it can accurately determine whether to yield to an oncoming object or not. The autonomous vehicle may use the collision zone to determine whether to yield to the object.
” [0100] “The cost function may also include a second term associated with a distance overlap component, which evaluates a distance that the vehicle has proceeded into the junction proportionate to the entire length of the collision zone.”)

wherein the contention data is obtained from one or more of: information about one or more traffic signals corresponding to the junction area; information about one or more paths corresponding to the junction area; ([0012] “In some examples, sensor data captured by sensors of the autonomous vehicle may be used to determine a collision zone associated with the autonomous vehicle and one or more objects detected in the environment. A collision zone corresponds to an area of the environment where a collision between the autonomous vehicle and the object may occur, based on current trajectories (and/or variances, accelerations, decelerations, etc. associated with the current trajectories) of the autonomous vehicle and the object.”) information about one or more contentions related to wait behavior with respect to the one or more paths corresponding to the junction area; or semantic information about one or more conditions related to wait behavior at the junction area;

determining a set of contention rules for the junction area based at least on the contention data, the set of contention rules indicating one or more respective wait behaviors corresponding to one or more individual contention points of the junction area that influence vehicle wait behavior with respect to the one or more potential yield scenarios; and (Figure 6, [0097] “An operation 606 includes determining an overlap area based at least in part on a first area associated with the vehicle following the first trajectory and a second area associated with the object following the second trajectory.” [0098] “An operation 608 includes determining whether yielding to the object blocks the second trajectory of the object.
For instance, if the vehicle stops or slows down, the vehicle may determine that the object's predicted trajectory is blocked by the vehicle as stopped or slowed.” [0099] “If it is determined that yielding to the object does not block the second trajectory (“No” at operation 608), the process may proceed to an operation 610, which includes controlling the vehicle to yield to the object. However, if it is determined that yielding to the object does block the second trajectory (“Yes” at operation 608), the process may proceed to an operation 612, which includes determining a cost of the vehicle continuing to follow the first trajectory by evaluating a cost function based at least in part on the overlap area.”)

causing performance of one or more control operations of the machine based at least on the set of contention rules. ([0099] “If it is determined that yielding to the object does not block the second trajectory (“No” at operation 608), the process may proceed to an operation 610, which includes controlling the vehicle to yield to the object.” [0101] “An operation 614 includes controlling the vehicle to proceed along the first trajectory based at least in part on the cost. For instance, the vehicle may determine that following the original trajectory will not result in a collision, but yielding to the object will prevent the object from proceeding along the predicted trajectory of the object and/or block traffic at the junction.”)

Dax does not explicitly teach that the semantic information is obtained independent from sensor observations made by sensors of the machine. Hartnett, however, does teach that the semantic information is obtained independent from sensor observations made by sensors of the machine. ([0043] Referring now to FIG. 3, there is provided an illustration of an example environment 300 in which an autonomous vehicle may travel.
Environment 300 comprises a non-limiting example of a T-shaped intersection 301 that an autonomous vehicle 302.sub.1 that is traveling along a road in a semi-autonomous or autonomous manner is required to navigate. The autonomous vehicle 302.sub.1 is generally configured to detect objects 302.sub.2, 304, 306 in proximity thereto, and that are also navigating the intersection 301 around the same time as the autonomous vehicle 302.sub.1. The objects can include, but are not limited to, a vehicle 302.sub.2, cyclist 304 (such as a rider of a bicycle, electric scooter, motorcycle, or the like) and/or a pedestrian 306. The autonomous vehicle 302.sub.1 can also be configured to detect preliminary information for the intersection 301. The preliminary information can include map-based information such as the type of intersection 301 and the configuration of the intersection 301. For example, the map-based information can include the position of exits and entrances to the intersection 301, the lane structure of the intersection 301, the presence of, position of, and status of traffic signs, such as STOP signs 305a and 305b, associated with the intersection 301, the position of fixed objects in the intersection 301, etc. The preliminary information can also include traffic density information for the intersection 301. Traffic density information can indicate a general congestion level for the intersection 301 without identifying individual vehicles present at the intersection 301. This preliminary information can be accessed by the automated driving system from a remote location, for example, from a remote map database. [0028] The on-board computing device 212 may obtain, retrieve, and/or create map data that provides detailed information about the surrounding environment of the autonomous vehicle 201. The on-board computing device 212 may also determine the location, orientation, pose, etc. 
of the AV in the environment (localization) based on, for example, three dimensional position data (e.g., data from a GPS), three dimensional orientation data, predicted locations, or the like. For example, the on-board computing device 212 may receive GPS data to determine the AV's latitude, longitude and/or altitude position. Other location sensors or systems such as laser-based localization systems, inertial-aided GPS, or camera-based localization may also be used to identify the location of the vehicle. The location of the vehicle may include an absolute geographical location, such as latitude, longitude, and altitude as well as relative location information, such as location relative to other cars immediately around it which can often be determined with less noise than absolute geographical location. The map data can provide information regarding: the identity and location of different roadways, road segments, lane segments, buildings, or other items; the location, boundaries, and directions of traffic lanes (e.g., the location and direction of a parking lane, a turning lane, a bicycle lane, or other lanes within a particular roadway) and metadata associated with traffic lanes; traffic control data (e.g., the location and instructions of signage, traffic lights, or other traffic control devices); and/or any other map data that provides information that assists the on-board computing device 212 in analyzing the surrounding environment of the autonomous vehicle 201. [0029] The map data may also include information and/or rules for determining right of way of objects and/or vehicles in conflicted areas or spaces. A conflicted space (or conflicted area) refers to an area where more than one object and/or vehicle may be predicted to be present at the same time leading to a risk collision, unless one of the objects and/or vehicles is given precedence (i.e., right of way) to traverse the conflicted space. 
Examples of such conflicted spaces can include traffic light intersections, stop sign intersections, roundabouts, turns, crosswalks, pedestrian crossings etc. The right of way information and/or rules for a conflicted space may be derived from traffic laws and rules associated with a geographical area (and may not be the same for all spaces). For example, for a traffic light intersection, a vehicle that has a green light signal will have right of way over a vehicle that has a red light signal, a vehicle going straight will have right of way over a vehicle trying to turn left or right, a pedestrian will have right of way when there is a walk sign signal, etc. Similarly, a moving vehicle will have right of way over a stopped vehicle trying to merge into traffic and/or a vehicle moving in its lane will have right of way over a vehicle merging into another lane. In another example, a pedestrian will have right of way in a pedestrian crossing. At a stop sign, a vehicle that arrived at the stop sign first will have right of way over a vehicle that arrived at the stop sign later.)

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify Dax in light of Hartnett such that the semantic information is obtained independently of the vehicle sensors. One of ordinary skill would have been motivated to make this modification because not all of this information changes with the vehicle or the specific pass through the intersection. Lights, signs, traffic rules, and lanes are likely to remain relatively static, so data acquired once should remain useful until the intersection is modified. Obtaining the data from a database of map data also saves the vehicle's computer computational effort.
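For orientation, the Figure 6 flow the rejection walks through (operations 606-614 of Dax) can be condensed into a short decision routine. This is only an illustrative sketch of the quoted logic; the function name, cost weighting, and threshold are assumptions, not taken from either reference:

```python
# Sketch of the yield logic the rejection attributes to Dax (Fig. 6, ops 606-614).
# All names and the cost weighting are illustrative assumptions.

def plan_at_junction(yield_blocks_object: bool,
                     overlap_area: float,
                     distance_into_junction: float,
                     collision_zone_length: float,
                     cost_threshold: float = 1.0) -> str:
    """Decide whether to yield or proceed in a junction collision zone."""
    # Op 608 -> op 610: if yielding would not block the object's trajectory, yield.
    if not yield_blocks_object:
        return "yield"
    # Op 612: otherwise evaluate a cost function based at least in part on the
    # overlap area, plus a distance-overlap term ([0100]): how far the vehicle
    # has proceeded into the junction relative to the collision zone's length.
    distance_overlap = distance_into_junction / collision_zone_length
    cost = overlap_area + (1.0 - distance_overlap)  # illustrative weighting
    # Op 614: proceed along the first trajectory based at least in part on the cost.
    return "proceed" if cost < cost_threshold else "yield"
```

Under these assumed weights, a vehicle that is 90% of the way through a 10 m collision zone with a small overlap area proceeds (cost 0.2 + 0.1 = 0.3), while one that has barely entered yields (cost 0.2 + 0.9 = 1.1).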
For Claim 9, Dax teaches The system of claim 8, wherein the contention data indicates one or more of: one or more respective locations within the junction area of the one or more individual contention points; ([0012] “In some examples, sensor data captured by sensors of the autonomous vehicle may be used to determine a collision zone associated with the autonomous vehicle and one or more objects detected in the environment. A collision zone corresponds to an area of the environment where a collision between the autonomous vehicle and the object may occur, based on current trajectories (and/or variances, accelerations, decelerations, etc. associated with the current trajectories) of the autonomous vehicle and the object. “ [0024] “An operation 114 includes determining a collision zone based at least in part on the trajectory for the vehicle and one or more predicted trajectories of the object. As discussed above and in more detail below, a collision zone corresponds to an area of the environment where a collision between the autonomous vehicle 106 and the object 112 may occur, based on current trajectories (and/or variances, accelerations, decelerations, etc. associated with the current trajectories) of the autonomous vehicle 106 and the object 112.”) one or more potential maneuvers respectively corresponding to the one or more individual contention points; or (Figure 6, ([0099] “If it is determined that yielding to the object does not block the second trajectory (“No” at operation 608), the process may proceed to an operation 610, which includes controlling the vehicle to yield to the object. “ [0101] “An operation 614 includes controlling the vehicle to proceed along the first trajectory based at least in part on the cost. 
For instance, the vehicle may determine that following the original trajectory will not result in a collision, but yielding to the object will prevent the object from proceeding along the predicted trajectory of the object and/or block traffic at the junction.”. It should also be noted that because this limitation provides more details for an optional limitation of claim 2, it is not technically limiting.) one or more respective sets of one or more wait behavior rules that correspond to the one or more individual contention points. (Figure 6, ([0099] “If it is determined that yielding to the object does not block the second trajectory (“No” at operation 608), the process may proceed to an operation 610, which includes controlling the vehicle to yield to the object. “ [0101] “An operation 614 includes controlling the vehicle to proceed along the first trajectory based at least in part on the cost. For instance, the vehicle may determine that following the original trajectory will not result in a collision, but yielding to the object will prevent the object from proceeding along the predicted trajectory of the object and/or block traffic at the junction.”. It should also be noted that because this limitation provides more details for an optional limitation of claim 2, it is not technically limiting.) For Claim 10, Dax teaches The system of claim 9, wherein at least one of the one or more respective sets of wait behavior rules include respective wait behaviors that correspond to one or more individual potential maneuvers corresponding to the one or more individual contention points. (Figure 6, ([0099] “If it is determined that yielding to the object does not block the second trajectory (“No” at operation 608), the process may proceed to an operation 610, which includes controlling the vehicle to yield to the object. “ [0101] “An operation 614 includes controlling the vehicle to proceed along the first trajectory based at least in part on the cost. 
For instance, the vehicle may determine that following the original trajectory will not result in a collision, but yielding to the object will prevent the object from proceeding along the predicted trajectory of the object and/or block traffic at the junction.”. It should also be noted that because this limitation provides more details for an optional limitation of claim 2, it is not technically limiting.) For Claim 11, Dax teaches The system of claim 8, wherein the one or more individual contention points correspond to one or more of: merging vehicle paths; intersecting vehicle paths; ([0023]) an intersection between a pedestrian area and a first vehicle path; ([0023]) an intersection between a railroad and a second vehicle path; or an intersection between a driveway area and a third vehicle path. ([0023], Reference will generally be made to the object being a vehicle in the environment, but any object in the environment is considered without departing from the scope of this disclosure (e.g., pedestrian, motorcycle, bicycle, animal, train, and so forth). For instance, returning to the example 104, the autonomous vehicle 106 may detect an object 112 in the environment. In the example 104, the object 112 is approaching the junction in a lane intersecting the lane occupied by the autonomous vehicle 106.”) For Claim 12, Dax teaches The system of claim 8, wherein the determining of the set of contention rules is further based at least on perception data obtained using one or more sensors of the machine and indicating a state of the junction area. ([0012] “In some examples, sensor data captured by sensors of the autonomous vehicle may be used to determine a collision zone associated with the autonomous vehicle and one or more objects detected in the environment. 
A collision zone corresponds to an area of the environment where a collision between the autonomous vehicle and the object may occur, based on current trajectories (and/or variances, accelerations, decelerations, etc. associated with the current trajectories) of the autonomous vehicle and the object.”) For Claim 14, Dax teaches The system of claim 8, The system of claim 10, wherein the system is comprised in at least one of: a control system for an autonomous or semi-autonomous machine; ([0011] “This disclosure relates to modifying a trajectory of a vehicle, such as an autonomous vehicle,”) a perception system for an autonomous or semi-autonomous machine; ([0011] “This disclosure relates to modifying a trajectory of a vehicle, such as an autonomous vehicle,”) a system for performing simulation operations; ( [0019] “Additionally, the techniques described herein can be used with real data (e.g., captured using sensor(s)), simulated data (e.g., generated by a simulator), or any combination of the two.”) a system for performing digital twin operations; a system for performing light transport simulation; a system for performing collaborative content creation for 3D assets; a system for performing deep learning operations; ([0089]) a system for presenting at least one of augmented reality content, virtual reality content, or mixed reality content; a system for hosting one or more real-time streaming applications; a system implemented using an edge device; a system implemented using a robot; ([0063] “The vehicle computing device(s) 504 can include one or more processors 516 and memory 518 communicatively coupled with the one or more processors 516. 
In the illustrated example, the vehicle 502 is an autonomous vehicle; however, the vehicle 502 could be any other type of vehicle or robotic platform.”) a system for performing conversational AI operations; a system for implementing one or more large language models (LLMs); a system for generating synthetic data; a system incorporating one or more virtual machines (VMs); a system implemented at least partially in a data center; or ([0079] “The vehicle 502 can also include one or more communication connection(s) 510 that enable communication between the vehicle 502 and one or more other local or remote computing device(s). For instance, the communication connection(s) 510 can facilitate communication with other local computing device(s) on the vehicle 502 and/or the drive system(s) 514. Also, the communication connection(s) 510 can allow the vehicle to communicate with other nearby computing device(s) (e.g., other nearby vehicles, traffic signals, etc.). The communication connection(s) 510 also enable the vehicle 502 to communicate with a remote teleoperations computing device or other remote services.”) a system implemented at least partially using cloud computing resources.

Claims 1-5, 7, 13 are rejected under 35 U.S.C. 103 as being unpatentable over Dax et al. (US Pub 2021/0122373 A1), hereafter known as Dax, in light of Zhang et al. (US Pub 2019/0302768 A1), hereafter known as Zhang, in light of Hartnett.
For Claim 1, Dax teaches A method comprising: obtaining contention data associated with a junction area that corresponds to one or more potential yield scenarios associated with navigation of a machine, the contention data indicating information related to a plurality of contention points of the junction area that influence machine wait behavior with respect to the one or more potential yield scenarios, wherein the contention data is obtained from: ([0012] “In some examples, sensor data captured by sensors of the autonomous vehicle may be used to determine a collision zone associated with the autonomous vehicle and one or more objects detected in the environment. A collision zone corresponds to an area of the environment where a collision between the autonomous vehicle and the object may occur, based on current trajectories (and/or variances, accelerations, decelerations, etc. associated with the current trajectories) of the autonomous vehicle and the object.” [0013] “For instance, the autonomous vehicle can pass through the environment safely if it can accurately determine whether to yield to an oncoming object or not. The autonomous vehicle may use the collision zone to determine whether to yield to the object.” [0100] “The cost function may also include a second term associated with a distance overlap component, which evaluates a distance that the vehicle has proceeded into the junction proportionate to the entire length of the collision zone.”) a signal data structure including semantic information about one or more traffic signals corresponding to the junction area; semantic information about one or more paths corresponding to the junction area; ([0012] “In some examples, sensor data captured by sensors of the autonomous vehicle may be used to determine a collision zone associated with the autonomous vehicle and one or more objects detected in the environment.
A collision zone corresponds to an area of the environment where a collision between the autonomous vehicle and the object may occur, based on current trajectories (and/or variances, accelerations, decelerations, etc. associated with the current trajectories) of the autonomous vehicle and the object.”) a path contention data structure including semantic information about one or more contentions related to wait behavior with respect to the one or more paths corresponding to the junction area; or a direction rule data structure including semantic information about one or more conditions related to wait behavior at the junction area; determining a set of contention rules for the junction area based at least on the contention data, the set of contention rules indicating one or more respective wait behaviors potentially applied to individual contention points of the plurality of contention points; and (Figure 6, [0097] “An operation 606 includes determining an overlap area based at least in part on a first area associated with the vehicle following the first trajectory and a second area associated with the object following the second trajectory.” [0098] “An operation 608 includes determining whether yielding to the object blocks the second trajectory of the object. For instance, if the vehicle stops or slows down, the vehicle may determine that the object's predicted trajectory is blocked by the vehicle as stopped or slowed.” [0099] “If it is determined that yielding to the object does not block the second trajectory (“No” at operation 608), the process may proceed to an operation 610, which includes controlling the vehicle to yield to the object. However, if it is determined that yielding to the object does block the second trajectory (“Yes” at operation 608), the process may proceed to an operation 612, which includes determining a cost of the vehicle continuing to follow the first trajectory by evaluating a cost function based at least in part on the overlap area.
”)

causing performance of one or more control operations of the machine based at least on the set of contention rules. ([0099] “If it is determined that yielding to the object does not block the second trajectory (“No” at operation 608), the process may proceed to an operation 610, which includes controlling the vehicle to yield to the object.” [0101] “An operation 614 includes controlling the vehicle to proceed along the first trajectory based at least in part on the cost. For instance, the vehicle may determine that following the original trajectory will not result in a collision, but yielding to the object will prevent the object from proceeding along the predicted trajectory of the object and/or block traffic at the junction.”)

Dax does not teach wherein the contention data is obtained from: a signal data structure including semantic information about one or more traffic signals corresponding to the junction area; a path data structure including semantic information about one or more paths corresponding to the junction area; a path contention data structure including semantic information about one or more contentions related to wait behavior with respect to the one or more paths corresponding to the junction area; or a direction rule data structure including semantic information about one or more conditions related to wait behavior at the junction area.

Zhang, however, does teach that contention data can be stored in data structures. ([0051] The information describing the critical region is then transmitted back to perception module 302 to allow perception module 302 to process sensor data using different perception methods or models for the critical region and noncritical region. According to one embodiment, when transmitting the information concerning a critical region, a specific data structure is defined and utilized to store the critical region information. [0052] FIG.
6 is a block diagram illustrating an example of a data structure for storing feedback information for perception according to one embodiment. Referring to FIG. 6, data structure 600 includes a number of data members 601-605. Header 601 stores a timestamp indicating the time the corresponding trajectory and critical region were determined. Path length 602 stores the length of the trajectory or path (e.g., in meters). Path time 603 stores the time the ADV will take to complete the trajectory (e.g., in seconds). Trajectory point array 604 includes an array of data entries to store the information of each of the trajectory points that constitute the trajectory. The trajectory point information of each trajectory point includes at least the coordinates of the trajectory points (x, y, z), a heading direction of the trajectory point (0), and the time (t) the ADV will be at the trajectory point from the current location. Critical point array 605 includes an array of data entries to store coordinates (x, y) of points of a polygon in a form of vertexes defining a critical region. The critical points refer to the turning points of a polygon as shown in FIGS. 5A-5B (indicated as small circles of the turning corners). [0053] Once the data structure is received by perception module 302, perception module 302 can parse the data structure to determine the critical region and the non-critical region based on the trajectory points 604 and critical region points 605. Perception module can then apply different perception methods or models on different sensor data (e.g., 3D vs. 3D LIDAR data) to generate the perception information of the critical region and the non-critical region for the next planning cycle. As a result, the quality of the perception information and the processing time and resources required to generate the perception information are optimized.) 
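Zhang's data structure 600 ([0052]) is concrete enough to sketch. The class and field names below are paraphrases of the quoted description (header 601, path length 602, path time 603, trajectory point array 604, critical point array 605), not identifiers from the reference:

```python
# Rough sketch of Zhang's feedback data structure 600; names are paraphrased
# from the quoted description and are not taken from the reference itself.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TrajectoryPoint:
    x: float
    y: float
    z: float
    heading: float  # heading direction (theta) at the trajectory point
    t: float        # time from the current location to this point, in seconds

@dataclass
class CriticalRegionFeedback:
    timestamp: float       # header 601: when trajectory and critical region were determined
    path_length_m: float   # 602: length of the trajectory, in meters
    path_time_s: float     # 603: time the ADV will take to complete the trajectory
    trajectory_points: List[TrajectoryPoint] = field(default_factory=list)    # 604
    critical_points: List[Tuple[float, float]] = field(default_factory=list)  # 605: polygon vertexes
```

As described in [0053], the perception module would parse such a structure to split the environment into critical and non-critical regions and apply different perception models to each.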
Hartnett, however, does teach wherein the contention data is obtained from: a signal data structure including semantic information about one or more traffic signals corresponding to the junction area; ([0027] Geographic location information may be communicated from the location sensor 260 to the on-board computing device 212, which may then access a map of the environment that corresponds to the location information to determine known fixed features of the environment such as streets, buildings, stop signs and/or stop/go signals. Captured images from the cameras 262 and/or object detection information captured from sensors such as a LiDAR system 264 is communicated from those sensors) to the on-board computing device 212. The object detection information and/or captured images may be processed by the on-board computing device 212 to detect objects in proximity to the vehicle 201. In addition or alternatively, the vehicle 201 may transmit any of the data to a remote server system 103 (FIG. 1) for processing. Any known or to be known technique for making an object detection based on sensor data and/or captured images can be used in the embodiments disclosed in this document.) a path data structure including semantic information about one or more paths corresponding to the junction area; ([0028] The on-board computing device 212 may obtain, retrieve, and/or create map data that provides detailed information about the surrounding environment of the autonomous vehicle 201. The on-board computing device 212 may also determine the location, orientation, pose, etc. of the AV in the environment (localization) based on, for example, three dimensional position data (e.g., data from a GPS), three dimensional orientation data, predicted locations, or the like. For example, the on-board computing device 212 may receive GPS data to determine the AV's latitude, longitude and/or altitude position. 
Other location sensors or systems such as laser-based localization systems, inertial-aided GPS, or camera-based localization may also be used to identify the location of the vehicle. The location of the vehicle may include an absolute geographical location, such as latitude, longitude, and altitude as well as relative location information, such as location relative to other cars immediately around it which can often be determined with less noise than absolute geographical location. The map data can provide information regarding: the identity and location of different roadways, road segments, lane segments, buildings, or other items; the location, boundaries, and directions of traffic lanes (e.g., the location and direction of a parking lane, a turning lane, a bicycle lane, or other lanes within a particular roadway) and metadata associated with traffic lanes; traffic control data (e.g., the location and instructions of signage, traffic lights, or other traffic control devices); and/or any other map data that provides information that assists the on-board computing device 212 in analyzing the surrounding environment of the autonomous vehicle 201. [0029] The map data may also include information and/or rules for determining right of way of objects and/or vehicles in conflicted areas or spaces. A conflicted space (or conflicted area) refers to an area where more than one object and/or vehicle may be predicted to be present at the same time leading to a risk collision, unless one of the objects and/or vehicles is given precedence (i.e., right of way) to traverse the conflicted space. Examples of such conflicted spaces can include traffic light intersections, stop sign intersections, roundabouts, turns, crosswalks, pedestrian crossings etc. The right of way information and/or rules for a conflicted space may be derived from traffic laws and rules associated with a geographical area (and may not be the same for all spaces). 
For example, for a traffic light intersection, a vehicle that has a green light signal will have right of way over a vehicle that has a red light signal, a vehicle going straight will have right of way over a vehicle trying to turn left or right, a pedestrian will have right of way when there is a walk sign signal, etc. Similarly, a moving vehicle will have right of way over a stopped vehicle trying to merge into traffic and/or a vehicle moving in its lane will have right of way over a vehicle merging into another lane. In another example, a pedestrian will have right of way in a pedestrian crossing. At a stop sign, a vehicle that arrived at the stop sign first will have right of way over a vehicle that arrived at the stop sign later.) a path contention data structure including semantic information about one or more contentions related to wait behavior with respect to the one or more paths corresponding to the junction area; or ([0029] The map data may also include information and/or rules for determining right of way of objects and/or vehicles in conflicted areas or spaces. A conflicted space (or conflicted area) refers to an area where more than one object and/or vehicle may be predicted to be present at the same time leading to a risk collision, unless one of the objects and/or vehicles is given precedence (i.e., right of way) to traverse the conflicted space. Examples of such conflicted spaces can include traffic light intersections, stop sign intersections, roundabouts, turns, crosswalks, pedestrian crossings etc. The right of way information and/or rules for a conflicted space may be derived from traffic laws and rules associated with a geographical area (and may not be the same for all spaces). 
For example, for a traffic light intersection, a vehicle that has a green light signal will have right of way over a vehicle that has a red light signal, a vehicle going straight will have right of way over a vehicle trying to turn left or right, a pedestrian will have right of way when there is a walk sign signal, etc. Similarly, a moving vehicle will have right of way over a stopped vehicle trying to merge into traffic and/or a vehicle moving in its lane will have right of way over a vehicle merging into another lane. In another example, a pedestrian will have right of way in a pedestrian crossing. At a stop sign, a vehicle that arrived at the stop sign first will have right of way over a vehicle that arrived at the stop sign later.) a direction rule data structure including semantic information about one or more conditions related to wait behavior at the junction area; ([0029] The map data may also include information and/or rules for determining right of way of objects and/or vehicles in conflicted areas or spaces. A conflicted space (or conflicted area) refers to an area where more than one object and/or vehicle may be predicted to be present at the same time leading to a risk collision, unless one of the objects and/or vehicles is given precedence (i.e., right of way) to traverse the conflicted space. Examples of such conflicted spaces can include traffic light intersections, stop sign intersections, roundabouts, turns, crosswalks, pedestrian crossings etc. The right of way information and/or rules for a conflicted space may be derived from traffic laws and rules associated with a geographical area (and may not be the same for all spaces). For example, for a traffic light intersection, a vehicle that has a green light signal will have right of way over a vehicle that has a red light signal, a vehicle going straight will have right of way over a vehicle trying to turn left or right, a pedestrian will have right of way when there is a walk sign signal, etc. 
Similarly, a moving vehicle will have right of way over a stopped vehicle trying to merge into traffic and/or a vehicle moving in its lane will have right of way over a vehicle merging into another lane. In another example, a pedestrian will have right of way in a pedestrian crossing. At a stop sign, a vehicle that arrived at the stop sign first will have right of way over a vehicle that arrived at the stop sign later.) Therefore, it would be obvious to one of ordinary skill in the art prior to the effective filing date to modify Dax in light of Hartnett and Zhang such that data structures are used to store data regarding paths, traffic signals, behaviors, or other elements, because data structures are known to be effective at storing and making data available for computer programs, and it would be expected to be useful to the invention to have this option. All of these data types are useful for understanding the flow of traffic through an intersection, and how a vehicle may want to proceed to safely travel through the intersection. For Claim 2, Dax teaches The method of claim 1, wherein the contention data indicates one or more of: one or more respective locations within the junction area of one or more individual contention points of the plurality of contention points; ([0012] “In some examples, sensor data captured by sensors of the autonomous vehicle may be used to determine a collision zone associated with the autonomous vehicle and one or more objects detected in the environment. A collision zone corresponds to an area of the environment where a collision between the autonomous vehicle and the object may occur, based on current trajectories (and/or variances, accelerations, decelerations, etc. associated with the current trajectories) of the autonomous vehicle and the object. “ [0024] “An operation 114 includes determining a collision zone based at least in part on the trajectory for the vehicle and one or more predicted trajectories of the object. 
As discussed above and in more detail below, a collision zone corresponds to an area of the environment where a collision between the autonomous vehicle 106 and the object 112 may occur, based on current trajectories (and/or variances, accelerations, decelerations, etc. associated with the current trajectories) of the autonomous vehicle 106 and the object 112.”) one or more potential maneuvers respectively corresponding to the one or more individual contention points; or (Figure 6, ([0099] “If it is determined that yielding to the object does not block the second trajectory (“No” at operation 608), the process may proceed to an operation 610, which includes controlling the vehicle to yield to the object. “ [0101] “An operation 614 includes controlling the vehicle to proceed along the first trajectory based at least in part on the cost. For instance, the vehicle may determine that following the original trajectory will not result in a collision, but yielding to the object will prevent the object from proceeding along the predicted trajectory of the object and/or block traffic at the junction.”. It should also be noted that because this limitation provides more details for an optional limitation of claim 2, it is not technically limiting.) one or more respective sets of one or more wait behavior rules that correspond to the one or more individual contention points. (Figure 6, ([0099] “If it is determined that yielding to the object does not block the second trajectory (“No” at operation 608), the process may proceed to an operation 610, which includes controlling the vehicle to yield to the object. “ [0101] “An operation 614 includes controlling the vehicle to proceed along the first trajectory based at least in part on the cost. 
For instance, the vehicle may determine that following the original trajectory will not result in a collision, but yielding to the object will prevent the object from proceeding along the predicted trajectory of the object and/or block traffic at the junction.”. It should also be noted that because this limitation provides more details for an optional limitation of claim 2, it is not technically limiting.) For Claim 3, Dax teaches The method of claim 2, wherein at least one of the one or more respective sets of wait behavior rules include respective wait behaviors that correspond to one or more individual potential maneuvers corresponding to the one or more individual contention points. (Figure 6, ([0099] “If it is determined that yielding to the object does not block the second trajectory (“No” at operation 608), the process may proceed to an operation 610, which includes controlling the vehicle to yield to the object. “ [0101] “An operation 614 includes controlling the vehicle to proceed along the first trajectory based at least in part on the cost. For instance, the vehicle may determine that following the original trajectory will not result in a collision, but yielding to the object will prevent the object from proceeding along the predicted trajectory of the object and/or block traffic at the junction.”. It should also be noted that because this limitation provides more details for an optional limitation of claim 2, it is not technically limiting.) 
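The Figure 6 flow of Dax cited above (operations 608 through 614) can be sketched as a small decision function. This is illustrative only: Dax states that the cost is evaluated based at least in part on the overlap area, with a second term for the distance proceeded into the junction proportionate to the collision-zone length ([0100]); the specific cost formula, threshold, and function name below are assumptions.

```python
def decide_at_junction(overlap_area: float,
                       yielding_blocks_object: bool,
                       distance_into_junction: float,
                       collision_zone_length: float,
                       cost_threshold: float = 1.0) -> str:
    """Return 'yield' or 'proceed' for the vehicle's first trajectory."""
    # Operation 608: if yielding would not block the object's predicted
    # trajectory, operation 610 controls the vehicle to yield.
    if not yielding_blocks_object:
        return "yield"
    # Operation 612: otherwise evaluate a cost based at least in part on
    # the overlap area, plus a distance-overlap term (distance proceeded
    # into the junction relative to the collision-zone length).
    cost = overlap_area + distance_into_junction / collision_zone_length
    # Operation 614: proceed along the first trajectory based on the cost.
    return "proceed" if cost < cost_threshold else "yield"
```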
For Claim 4, Dax teaches The method of claim 1, wherein the plurality of contention points correspond to one or more of: merging vehicle paths; intersecting vehicle paths; ([0023]) an intersection between a pedestrian area and a first vehicle path; ([0023]) an intersection between a railroad and a second vehicle path; an intersection between a driveway area and a third vehicle path; an intersection between a bike lane and a fourth vehicle path; an intersection between a rail line and a fifth vehicle path; an intersection between a tram lane and a sixth vehicle path; or an intersection between a streetcar lane and a seventh vehicle path. ([0023], Reference will generally be made to the object being a vehicle in the environment, but any object in the environment is considered without departing from the scope of this disclosure (e.g., pedestrian, motorcycle, bicycle, animal, train, and so forth). For instance, returning to the example 104, the autonomous vehicle 106 may detect an object 112 in the environment. In the example 104, the object 112 is approaching the junction in a lane intersecting the lane occupied by the autonomous vehicle 106.”) For Claim 5, Dax teaches The method of claim 1, wherein the determining of the set of contention rules is further based at least on perception data obtained using one or more sensors of the machine and indicating a state of the junction area. ([0012] “In some examples, sensor data captured by sensors of the autonomous vehicle may be used to determine a collision zone associated with the autonomous vehicle and one or more objects detected in the environment. A collision zone corresponds to an area of the environment where a collision between the autonomous vehicle and the object may occur, based on current trajectories (and/or variances, accelerations, decelerations, etc.
associated with the current trajectories) of the autonomous vehicle and the object.”) For Claim 7, Dax teaches The method of claim 1, wherein the one or more control operations are based at least on an analysis as to which contention rules of the set of contention rules are applicable to the machine with respect to a state of the junction area. (Figure 6, ([0099] “If it is determined that yielding to the object does not block the second trajectory (“No” at operation 608), the process may proceed to an operation 610, which includes controlling the vehicle to yield to the object. “ [0101] “An operation 614 includes controlling the vehicle to proceed along the first trajectory based at least in part on the cost. For instance, the vehicle may determine that following the original trajectory will not result in a collision, but yielding to the object will prevent the object from proceeding along the predicted trajectory of the object and/or block traffic at the junction.”. It should also be noted that because this limitation provides more details for an optional limitation of claim 2, it is not technically limiting.) For Claim 13, Dax teaches The system of claim 8, wherein the contention data is included in one or more data sets including one or more of: a signal data structure that includes the information about the one or more traffic signals corresponding to the junction area; a path data structure that includes the information about the one or more paths corresponding to the junction area; ([0012] “In some examples, sensor data captured by sensors of the autonomous vehicle may be used to determine a collision zone associated with the autonomous vehicle and one or more objects detected in the environment. A collision zone corresponds to an area of the environment where a collision between the autonomous vehicle and the object may occur, based on current trajectories (and/or variances, accelerations, decelerations, etc.
associated with the current trajectories) of the autonomous vehicle and the object.”) a path contention data structure that includes the information about the one or more contentions related to the wait behavior with respect to the one or more paths corresponding to the junction area; or a direction rule data structure that includes the information about the one or more conditions related to wait behavior at the junction area. Dax does not teach storing this data in data structures. Zhang, however, does teach that contention data can be stored in data structures. ([0051] The information describing the critical region is then transmitted back to perception module 302 to allow perception module 302 to process sensor data using different perception methods or models for the critical region and noncritical region. According to one embodiment, when transmitting the information concerning a critical region, a specific data structure is defined and utilized to store the critical region information. [0052] FIG. 6 is a block diagram illustrating an example of a data structure for storing feedback information for perception according to one embodiment. Referring to FIG. 6, data structure 600 includes a number of data members 601-605. Header 601 stores a timestamp indicating the time the corresponding trajectory and critical region were determined. Path length 602 stores the length of the trajectory or path (e.g., in meters). Path time 603 stores the time the ADV will take to complete the trajectory (e.g., in seconds). Trajectory point array 604 includes an array of data entries to store the information of each of the trajectory points that constitute the trajectory.
The trajectory point information of each trajectory point includes at least the coordinates of the trajectory points (x, y, z), a heading direction of the trajectory point (θ), and the time (t) the ADV will be at the trajectory point from the current location. Critical point array 605 includes an array of data entries to store coordinates (x, y) of points of a polygon in a form of vertexes defining a critical region. The critical points refer to the turning points of a polygon as shown in FIGS. 5A-5B (indicated as small circles of the turning corners). [0053] Once the data structure is received by perception module 302, perception module 302 can parse the data structure to determine the critical region and the non-critical region based on the trajectory points 604 and critical region points 605. Perception module can then apply different perception methods or models on different sensor data (e.g., 2D vs. 3D LIDAR data) to generate the perception information of the critical region and the non-critical region for the next planning cycle. As a result, the quality of the perception information and the processing time and resources required to generate the perception information are optimized.) Therefore, it would be obvious to one of ordinary skill in the art prior to the effective filing date to modify Dax in light of Zhang such that data structures are used to store data regarding paths, traffic signals, behaviors, or other elements, because data structures are known to be effective at storing and making data available for computer programs, and it would be expected to be useful to the invention to have this option. Claims 15-20 are rejected under 35 U.S.C. 103 as being unpatentable over Dax in light of Zhang in light of Wongpiromsarn et al (US Pub 2020/0189575 A1), hereafter known as Wongpiromsarn, in light of Hartnett.
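Before turning to claims 15-20, Dax's collision zone, cited throughout, can be pictured concretely as the overlap of the area swept by the vehicle's trajectory with the area swept by the object's predicted trajectory. A minimal sketch follows, using axis-aligned boxes as stand-ins for the swept areas; Dax's actual zones additionally account for variances, accelerations, and decelerations, and the function name and example figures here are illustrative only.

```python
def box_overlap(a, b):
    """Overlap area of two axis-aligned boxes given as (xmin, ymin, xmax, ymax)."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(0.0, w) * max(0.0, h)

# Vehicle sweeping straight through the junction along x; object crossing
# on an intersecting lane along y (illustrative geometry, in meters).
vehicle_area = (0.0, -2.0, 30.0, 2.0)
object_area = (10.0, -15.0, 14.0, 15.0)
collision_zone_area = box_overlap(vehicle_area, object_area)  # 4 m x 4 m = 16 m^2
```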
For Claim 15, Dax teaches A processor comprising processing circuitry to perform operations comprising: ([0063] “The vehicle computing device(s) 504 can include one or more processors 516 and memory 518 communicatively coupled with the one or more processors 516.”) determining contention data associated with a junction area that corresponds to one or more potential yield scenarios associated with navigation of a machine, the contention data being determined based at least on one or more data sets that include semantic information corresponding to the junction area; ([0012] “In some examples, sensor data captured by sensors of the autonomous vehicle may be used to determine a collision zone associated with the autonomous vehicle and one or more objects detected in the environment. A collision zone corresponds to an area of the environment where a collision between the autonomous vehicle and the object may occur, based on current trajectories (and/or variances, accelerations, decelerations, etc. associated with the current trajectories) of the autonomous vehicle and the object.” [0013] “For instance, the autonomous vehicle can pass through the environment safely if it can accurately determine whether to yield to an oncoming object or not. The autonomous vehicle may use the collision zone to determine whether to yield to the object. 
“ [0100] “The cost function may also include a second term associated with a distance overlap component, which evaluates a distance that the vehicle has proceeded into the junction proportionate to the entire length of the collision zone.”) determining a set of contention rules for the junction area based at least on the contention data, the set of contention rules corresponding to one or more individual contention points of the junction area that influence vehicle wait behavior with respect to the one or more potential yield scenarios; and (Figure 6, [0097] “An operation 606 includes determining an overlap area based at least in part on a first area associated with the vehicle following the first trajectory and a second area associated with the object following the second trajectory.” [0098] An operation 608 includes determining whether yielding to the object blocks the second trajectory of the object. For instance, if the vehicle stops or slows down, the vehicle may determine that the object's predicted trajectory is blocked by the vehicle as stopped or slowed.” [0099] "If it is determined that yielding to the object does not block the second trajectory (“No” at operation 608), the process may proceed to an operation 610, which includes controlling the vehicle to yield to the object. However, if it is determined that yielding to the object does not block the second trajectory (“Yes” at operation 608), the process may proceed to an operation 612, which includes determining a cost of the vehicle continuing to follow the first trajectory by evaluating a cost function based at least in part on the overlap area. “) causing performance of one or more control operations of the machine while the machine is at the junction area based at least on the set of contention rules. 
([0099] “If it is determined that yielding to the object does not block the second trajectory (“No” at operation 608), the process may proceed to an operation 610, which includes controlling the vehicle to yield to the object. “ [0101] “An operation 614 includes controlling the vehicle to proceed along the first trajectory based at least in part on the cost. For instance, the vehicle may determine that following the original trajectory will not result in a collision, but yielding to the object will prevent the object from proceeding along the predicted trajectory of the object and/or block traffic at the junction.”) Dax does not teach storing this data in data structures, or that the determination is made prior to the machine arriving at the junction area. Zhang, however, does teach that contention data can be stored in data structures. ([0051] The information describing the critical region is then transmitted back to perception module 302 to allow perception module 302 to process sensor data using different perception methods or models for the critical region and noncritical region. According to one embodiment, when transmitting the information concerning a critical region, a specific data structure is defined and utilized to store the critical region information. [0052] FIG. 6 is a block diagram illustrating an example of a data structure for storing feedback information for perception according to one embodiment. Referring to FIG. 6, data structure 600 includes a number of data members 601-605. Header 601 stores a timestamp indicating the time the corresponding trajectory and critical region were determined. Path length 602 stores the length of the trajectory or path (e.g., in meters). Path time 603 stores the time the ADV will take to complete the trajectory (e.g., in seconds). Trajectory point array 604 includes an array of data entries to store the information of each of the trajectory points that constitute the trajectory.
The trajectory point information of each trajectory point includes at least the coordinates of the trajectory points (x, y, z), a heading direction of the trajectory point (θ), and the time (t) the ADV will be at the trajectory point from the current location. Critical point array 605 includes an array of data entries to store coordinates (x, y) of points of a polygon in a form of vertexes defining a critical region. The critical points refer to the turning points of a polygon as shown in FIGS. 5A-5B (indicated as small circles of the turning corners). [0053] Once the data structure is received by perception module 302, perception module 302 can parse the data structure to determine the critical region and the non-critical region based on the trajectory points 604 and critical region points 605. Perception module can then apply different perception methods or models on different sensor data (e.g., 2D vs. 3D LIDAR data) to generate the perception information of the critical region and the non-critical region for the next planning cycle. As a result, the quality of the perception information and the processing time and resources required to generate the perception information are optimized.) Therefore, it would be obvious to one of ordinary skill in the art prior to the effective filing date to modify Dax in light of Zhang such that data structures are used to store data regarding paths, traffic signals, behaviors, or other elements, because data structures are known to be effective at storing and making data available for computer programs, and it would be expected to be useful to the invention to have this option. Wongpiromsarn, however, does teach that a control determination is made prior to the machine arriving at the junction area ([0116] In one embodiment, the planning module 1336 generates a motion constraint that includes a minimum speed of the AV 1308 to avoid blocking an intersection by the AV 1308.
For example, the AV 1308 may be stopped or moving slowly within an intersection. The planning module 1336 determines that a current speed of the vehicle 1320 is v1, and the vehicle 1320 is approaching the intersection and is a particular distance d.sub.4 from the intersection. The planning module 1336 further determines that the current speed of the AV 1308 is v and a collision is likely between the AV 1308 and the vehicle 1320 in t.sub.4 seconds. The planning module 1336 generates a minimum speed constraint based on d.sub.4, t.sub.4, v1, and v, such that the AV 1308 can speed up and safely avoid blocking the intersection before the vehicle 1320 arrives at the intersection. The planning module 1336 will either find a solution that satisfies both (1) a minimum speed constraint to avoid a potential collision with the vehicle 1320 in the intersection and (2) a maximum speed constraint for a potential collision with the construction zone 1308, or else select a different homotopy including crossing the intersection only after the vehicle 1320 clears the intersection. The different homotopy includes a corresponding maximum speed constraint based on d.sub.4, t.sub.4, v1, and v.) Hartnett, however, teaches determining contention rules on a database that can be accessed remotely. This would be before the machine perceives the junction area. ([0029] The map data may also include information and/or rules for determining right of way of objects and/or vehicles in conflicted areas or spaces. A conflicted space (or conflicted area) refers to an area where more than one object and/or vehicle may be predicted to be present at the same time leading to a risk collision, unless one of the objects and/or vehicles is given precedence (i.e., right of way) to traverse the conflicted space. Examples of such conflicted spaces can include traffic light intersections, stop sign intersections, roundabouts, turns, crosswalks, pedestrian crossings etc. 
The right of way information and/or rules for a conflicted space may be derived from traffic laws and rules associated with a geographical area (and may not be the same for all spaces). For example, for a traffic light intersection, a vehicle that has a green light signal will have right of way over a vehicle that has a red light signal, a vehicle going straight will have right of way over a vehicle trying to turn left or right, a pedestrian will have right of way when there is a walk sign signal, etc. Similarly, a moving vehicle will have right of way over a stopped vehicle trying to merge into traffic and/or a vehicle moving in its lane will have right of way over a vehicle merging into another lane. In another example, a pedestrian will have right of way in a pedestrian crossing. At a stop sign, a vehicle that arrived at the stop sign first will have right of way over a vehicle that arrived at the stop sign later.) Therefore, it would be obvious to one of ordinary skill in the art prior to the effective filing date to modify Dax in light of Wongpiromsarn and Hartnett, such that the determination is made before perceiving the intersection, because control steps need time to be implemented. If the system waits until reaching the intersection before making control decision determinations, then if the vehicle needs to yield, slow, or stop, it could be too late to take the corrective action. By ensuring that the determination is made prior, the system will be ready to make safe actions as it approaches the intersection. Since yielding decisions and rules may be consistent for vehicles approaching the intersection, then they could be determined beforehand and stored on a database for quick and easy access that would be usable by other vehicles as well. 
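Hartnett's right-of-way rules for conflicted spaces, quoted above, amount to a per-space rule table keyed by the type of conflicted space. The sketch below is a hypothetical encoding: the `conflict_type` strings, dictionary keys, and function name are illustrative assumptions, not anything Hartnett specifies; the individual rules mirror Hartnett's examples (green over red, straight over turning, first arrival at a stop sign, pedestrian in a crossing, moving over stopped merging vehicle).

```python
def has_right_of_way(conflict_type: str, ego: dict, other: dict) -> bool:
    """True if the ego vehicle has right of way over the other agent."""
    if conflict_type == "traffic_light":
        # A vehicle with a green signal has right of way over one with red;
        # a vehicle going straight has right of way over a turning vehicle.
        if ego["signal"] != other["signal"]:
            return ego["signal"] == "green"
        return ego["maneuver"] == "straight" and other["maneuver"] != "straight"
    if conflict_type == "stop_sign":
        # The vehicle that arrived at the stop sign first has right of way.
        return ego["arrival_time"] < other["arrival_time"]
    if conflict_type == "crosswalk":
        # A pedestrian in a pedestrian crossing has right of way.
        return other["kind"] != "pedestrian"
    if conflict_type == "merge":
        # A moving vehicle has right of way over a stopped merging vehicle.
        return ego["moving"] and not other["moving"]
    return False
```

Because such rules depend only on the geometry and signage of the conflicted space, they could be precomputed and stored in map data, which is the point the rejection draws from Hartnett.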
For Claim 16, Dax teaches The processor of claim 15, wherein the contention data indicates one or more of:

one or more respective locations within the junction area of the one or more individual contention points ([0012] “In some examples, sensor data captured by sensors of the autonomous vehicle may be used to determine a collision zone associated with the autonomous vehicle and one or more objects detected in the environment. A collision zone corresponds to an area of the environment where a collision between the autonomous vehicle and the object may occur, based on current trajectories (and/or variances, accelerations, decelerations, etc. associated with the current trajectories) of the autonomous vehicle and the object.” [0024] “An operation 114 includes determining a collision zone based at least in part on the trajectory for the vehicle and one or more predicted trajectories of the object. As discussed above and in more detail below, a collision zone corresponds to an area of the environment where a collision between the autonomous vehicle 106 and the object 112 may occur, based on current trajectories (and/or variances, accelerations, decelerations, etc. associated with the current trajectories) of the autonomous vehicle 106 and the object 112.”);

one or more potential maneuvers respectively corresponding to the one or more individual contention points (Figure 6, [0099] “If it is determined that yielding to the object does not block the second trajectory (“No” at operation 608), the process may proceed to an operation 610, which includes controlling the vehicle to yield to the object.” [0101] “An operation 614 includes controlling the vehicle to proceed along the first trajectory based at least in part on the cost. For instance, the vehicle may determine that following the original trajectory will not result in a collision, but yielding to the object will prevent the object from proceeding along the predicted trajectory of the object and/or block traffic at the junction.” It should also be noted that because this limitation provides more details for an optional limitation of claim 2, it is not technically limiting.); or

one or more respective sets of wait behavior rules that correspond to the one or more individual contention points (Figure 6, [0099] and [0101], quoted above. It should also be noted that because this limitation provides more details for an optional limitation of claim 2, it is not technically limiting.).

For Claim 17, Dax teaches The processor of claim 16, wherein at least one of the one or more respective sets of wait behavior rules include respective wait behaviors that correspond to one or more individual potential maneuvers corresponding to the one or more individual contention points. (Figure 6, [0099] and [0101], quoted above. It should also be noted that because this limitation provides more details for an optional limitation of claim 2, it is not technically limiting.)

For Claim 18, Dax teaches The processor of claim 15, wherein the one or more individual contention points correspond to one or more of: merging vehicle paths; intersecting vehicle paths ([0023]); an intersection between a pedestrian area and a first vehicle path ([0023]); an intersection between a railroad and a second vehicle path; or an intersection between a driveway area and a third vehicle path. ([0023] “Reference will generally be made to the object being a vehicle in the environment, but any object in the environment is considered without departing from the scope of this disclosure (e.g., pedestrian, motorcycle, bicycle, animal, train, and so forth). For instance, returning to the example 104, the autonomous vehicle 106 may detect an object 112 in the environment. In the example 104, the object 112 is approaching the junction in a lane intersecting the lane occupied by the autonomous vehicle 106.”)

For Claim 19, Dax teaches The processor of claim 15, wherein the determining of the set of contention rules is further based at least on perception data obtained using one or more sensors of the machine and indicating a state of the junction area. ([0012] “In some examples, sensor data captured by sensors of the autonomous vehicle may be used to determine a collision zone associated with the autonomous vehicle and one or more objects detected in the environment.
A collision zone corresponds to an area of the environment where a collision between the autonomous vehicle and the object may occur, based on current trajectories (and/or variances, accelerations, decelerations, etc. associated with the current trajectories) of the autonomous vehicle and the object.”)

For Claim 20, Dax teaches The processor of claim 15, wherein the one or more data structures include one or more of: a signal data structure; a path data structure; a path contention data set ([0012], quoted above); or a direction rule data structure.

Dax does not teach wherein the contention data is obtained from a data structure; Zhang, however, does teach that contention data can be stored in data structures. ([0051] “The information describing the critical region is then transmitted back to perception module 302 to allow perception module 302 to process sensor data using different perception methods or models for the critical region and noncritical region. According to one embodiment, when transmitting the information concerning a critical region, a specific data structure is defined and utilized to store the critical region information.” [0052] “FIG. 6 is a block diagram illustrating an example of a data structure for storing feedback information for perception according to one embodiment. Referring to FIG. 6, data structure 600 includes a number of data members 601-605. Header 601 stores a timestamp indicating the time the corresponding trajectory and critical region were determined. Path length 602 stores the length of the trajectory or path (e.g., in meters). Path time 603 stores the time the ADV will take to complete the trajectory (e.g., in seconds). Trajectory point array 604 includes an array of data entries to store the information of each of the trajectory points that constitute the trajectory. The trajectory point information of each trajectory point includes at least the coordinates of the trajectory points (x, y, z), a heading direction of the trajectory point (θ), and the time (t) the ADV will be at the trajectory point from the current location. Critical point array 605 includes an array of data entries to store coordinates (x, y) of points of a polygon in a form of vertexes defining a critical region. The critical points refer to the turning points of a polygon as shown in FIGS. 5A-5B (indicated as small circles of the turning corners).” [0053] “Once the data structure is received by perception module 302, perception module 302 can parse the data structure to determine the critical region and the non-critical region based on the trajectory points 604 and critical region points 605. Perception module can then apply different perception methods or models on different sensor data (e.g., 2D vs. 3D LIDAR data) to generate the perception information of the critical region and the non-critical region for the next planning cycle. As a result, the quality of the perception information and the processing time and resources required to generate the perception information are optimized.”)
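For orientation, the feedback record Zhang describes in [0052] amounts to a timestamp header, a path length, a path time, an array of trajectory points (x, y, z, heading θ, time t), and an array of polygon vertices bounding the critical region. The following Python sketch is illustrative only; the class and field names are assumptions chosen for readability, not identifiers taken from Zhang:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TrajectoryPoint:
    """One entry of the trajectory point array (member 604 in Zhang's FIG. 6)."""
    x: float      # coordinates of the trajectory point
    y: float
    z: float
    theta: float  # heading direction at this point
    t: float      # time (s) until the ADV reaches this point from its current location

@dataclass
class CriticalPoint:
    """One polygon vertex (turning point) bounding the critical region (member 605)."""
    x: float
    y: float

@dataclass
class PerceptionFeedback:
    """Planning-to-perception feedback record, per the structure described in [0052]."""
    timestamp: float       # header: when the trajectory and critical region were determined
    path_length_m: float   # length of the planned trajectory, in meters
    path_time_s: float     # time the ADV will take to complete the trajectory, in seconds
    trajectory: List[TrajectoryPoint] = field(default_factory=list)
    critical_region: List[CriticalPoint] = field(default_factory=list)
```

On receipt, a perception module could parse such a record to separate critical from non-critical regions, as [0053] describes; the dataclass form is just one natural encoding of the members Zhang enumerates.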
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify Dax in light of Zhang such that data structures are used to store data regarding paths, traffic signals, behaviors, or other elements, because data structures are known to be effective at storing and making data available to computer programs, and this option would be expected to be useful to the invention.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Green et al (US Patent 10,019,011 B1) relates to autonomous vehicles using machine learning to determine when to yield. Caldwell et al (US Pub 2024/0092398 A1) relates to decision trees for trajectory decisions. Beller et al (US Pub 2022/0261000 A1) relates to decisions to yield or accelerate at intersections. Silva et al (US Pub 2021/0370921 A1) relates to considering different vehicle trajectories for potential collisions.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to TRISTAN J GREINER, whose telephone number is (571) 272-1382. The examiner can normally be reached Mon-Fri 7:30-4:30. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Khoi Tran, can be reached Monday-Thursday. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/T.J.G./
Examiner, Art Unit 3656

/KHOI H TRAN/
Supervisory Patent Examiner, Art Unit 3656

Prosecution Timeline

Sep 26, 2023
Application Filed
May 17, 2025
Non-Final Rejection — §103
Aug 12, 2025
Interview Requested
Aug 21, 2025
Response Filed
Aug 28, 2025
Applicant Interview (Telephonic)
Sep 03, 2025
Examiner Interview Summary
Nov 15, 2025
Final Rejection — §103
Feb 05, 2026
Interview Requested
Feb 13, 2026
Request for Continued Examination
Mar 06, 2026
Response after Non-Final Action
Mar 07, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601603
Carbon Footprint Optimized Timely E-Truck Transportation
2y 5m to grant Granted Apr 14, 2026
Patent 12583476
METHOD FOR CONTROLLING AN AUTONOMOUS VEHICLE
2y 5m to grant Granted Mar 24, 2026
Patent 12576886
MOTION PREDICTION
2y 5m to grant Granted Mar 17, 2026
Patent 12565241
CONTROLLER
2y 5m to grant Granted Mar 03, 2026
Patent 12558781
AUTOMATIC MACHINE CONTROL DEVICE AND AUTOMATIC MACHINE CONTROL METHOD
2y 5m to grant Granted Feb 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

3-4
Expected OA Rounds
78%
Grant Probability
99%
With Interview (+21.4%)
2y 9m
Median Time to Grant
High
PTA Risk
Based on 166 resolved cases by this examiner. Grant probability derived from career allow rate.
