Prosecution Insights
Last updated: April 19, 2026
Application No. 17/926,980

Method and Device for the Automated Driving Mode of a Vehicle, and Vehicle

Status: Non-Final Office Action (§103)
Filed: Nov 21, 2022
Examiner: CASS, JEAN PAUL
Art Unit: 3666
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Daimler Truck AG
OA Round: 5 (Non-Final)

Grant Probability: 73% (Favorable)
Estimated OA Rounds: 5-6
Estimated Time to Grant: 3y 1m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 73% (719 granted / 984 resolved; +21.1% vs Tech Center average; above average)
Interview Lift: +25.9% across resolved cases with an interview
Average Prosecution: 3y 1m
Currently Pending: 83 applications
Total Applications: 1,067 (career, across all art units)
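The headline figures above can be cross-checked with simple arithmetic, assuming (an assumption on my part; the dashboard's actual model is not disclosed) that the "with interview" probability is the career allow rate plus the interview lift:

```python
# Cross-check of the dashboard figures from the raw counts shown above.
granted, resolved = 719, 984

allow_rate = granted / resolved          # career allow rate
print(round(allow_rate * 100, 1))        # 73.1 -> displayed as 73%

# Additive-lift assumption: allow rate + 25.9% interview lift.
with_interview = allow_rate + 0.259
print(round(with_interview * 100))       # 99
```

The additive assumption reproduces the displayed 99% "with interview" figure exactly, which suggests the dashboard computes it this way.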

Statute-Specific Performance

§101: 10.5% (-29.5% vs TC avg)
§103: 56.8% (+16.8% vs TC avg)
§102: 12.6% (-27.4% vs TC avg)
§112: 12.8% (-27.2% vs TC avg)

Tech Center averages are estimates. Based on career data from 984 resolved cases.
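Under the stated deltas, the implied Tech Center baseline for each statute can be recovered by subtraction (a sketch; the dashboard labels its Tech Center averages as estimates):

```python
# Recover the implied Tech Center average for each statute:
# examiner rate minus stated delta equals the TC baseline.
stats = {
    "101": (10.5, -29.5),
    "103": (56.8, +16.8),
    "102": (12.6, -27.4),
    "112": (12.8, -27.2),
}
for statute, (rate, delta) in stats.items():
    tc_avg = round(rate - delta, 1)
    print(f"§{statute}: TC avg {tc_avg}%")
```

Notably, all four deltas back out to the same implied 40.0% baseline, suggesting they were computed against a single Tech Center figure rather than per-statute averages.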

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to the Applicant's Arguments

The previous rejection is withdrawn. Applicant's amendments are entered. Applicant's remarks are also entered into the record. A new search, necessitated by the applicant's amendments, was made and a new reference was found. A new rejection is made herein. Applicant's arguments are now moot in view of the new rejection of the claims.

The primary reference is silent, but SILVA teaches "...determining that the visibility metric is below a threshold value but can be increased to above the threshold value by changing lanes to an outside lane of the bend having the same direction of travel as the vehicle; and" (see paragraphs 71 and 116-120, where in operation 918 the process can include controlling a vehicle (e.g., the vehicle 102) based at least in part on the occupancy. For example, by evaluating the occlusion grid 906 that represents a hill in the environment, the vehicle 102 can "see over" the crest of the hill despite the sensors not directly capturing ground data beyond the crest of the hill. Thus, the vehicle 102 can plan a trajectory and/or determine a velocity based on the occupancy of the occlusion grid 906 beyond the crest of the hill. As a non-limiting example, the vehicle 102 may adjust one or more of a velocity, orientation, or position such that information provided by the occlusion grid 906 is sufficient to safely traverse the hill.), and "in response to determining that the visibility metric is below the threshold value but can be increased to above the threshold value by changing lanes to the outside lane of the bend:" (See FIG. 9, where the vehicle can use a lidar device to identify (1) an un-occluded region and (2) an occluded region where there is a real risk of a collision with an object within the region, and FIG. 9, block 918, where the vehicles can be controlled to stay in the non-occluded region by changing lanes or moving to stay in areas 908 and not 910.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the teachings of SILVA with the disclosure of PERKO with a reasonable expectation of success, since SILVA teaches that an autonomous vehicle can determine an occluded area and a non-occluded area. The vehicle can be controlled to the non-occluded area, where it is safe, and avoid the occluded area when possible. See block 918. In block 1018, the vehicle can be controlled based on the clearance of the occlusion. For example, by evaluating the occlusion grid 906 that represents a hill in the environment, the vehicle 102 can "see over" the crest of the hill despite the sensors not directly capturing ground data beyond the crest of the hill. Thus, the vehicle 102 can plan a trajectory and/or determine a velocity based on the occupancy of the occlusion grid 906 beyond the crest of the hill. As a non-limiting example, the vehicle 102 may adjust one or more of a velocity, orientation, or position such that information provided by the occlusion grid 906 is sufficient to safely traverse the hill.

Claim Rejections - 35 U.S.C. § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim 11 is rejected under 35 U.S.C. § 103 as being unpatentable as obvious in view of United States Patent Application Pub. No.: US20190019416A1 to Perko et al., filed in 2017, in view of United States Patent App. Pub.
No.: US 20160280224 A1 to Tatourian et al., filed in 2015, and further in view of International Patent Pub. No.: WO 2019/245982 A1 to Silva, filed in 2019.

The primary reference is silent, but SILVA teaches "...determining that the visibility metric is below a threshold value but can be increased to above the threshold value by changing lanes to an outside lane of the bend having the same direction of travel as the vehicle; and ... in response to determining that the visibility metric is below the threshold value but can be increased to above the threshold value by changing lanes to the outside lane of the bend:" (see paragraphs 71 and 116-120, operation 918, and FIG. 9, block 918, as quoted in the Response to the Applicant's Arguments above).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the teachings of SILVA with the disclosure of PERKO with a reasonable expectation of success, since SILVA teaches that an autonomous vehicle can determine an occluded area and a non-occluded area, and the vehicle can be controlled to the non-occluded area, where it is safe, and avoid the occluded area when possible (see blocks 918 and 1018, as discussed above).

Perko discloses "...11. (Currently Amended) A method for autonomously driving a vehicle on a road," (see abstract, where the vehicle can detect an occlusion in the road that affects the operation of the vehicle along the route and a second vehicle can then assist the first vehicle with traveling along the route) "wherein the vehicle includes a computing unit electronically coupled to at least one detection unit," (See FIG.
1, where the first vehicle has a first sensor 108 and the second vehicle has a sensor 108, and each vehicle can communicate with the other, or with the cloud server, to avoid occlusions in the road) "wherein the method is executed by the computing unit, the method comprising: detecting a bend in the road ahead of the vehicle based on data from the detection unit;" (see FIG. 4, where the first vehicle can detect a bend in the road 407 and then detect that there is a limited region where there is no line of sight, and the second vehicle 105 can then detect the object with a second sensor and communicate this obstacle 411 to the first vehicle; see also paragraphs 71-74)

"calculating a visibility metric of the detection unit for the detected bend:" (see paragraph 37, where a computing system can determine that an object occludes a sensor of an autonomous vehicle if the sensor is unable to perceive one or more region(s) in a surrounding environment of the autonomous vehicle because of the object (e.g., occluded region(s)). For example, a computing system can determine that a hill, blind-corner, and/or precipitation occludes a sensor of an autonomous vehicle because the hill, blind-corner, and/or precipitation occludes one or more region(s) in the surrounding environment from the sensor of the autonomous vehicle. In particular, when the autonomous vehicle is climbing one side of the hill, the hill can occlude the opposite side from one or more sensor(s) of the autonomous vehicle; when the autonomous vehicle is turning the blind-corner, the blind-corner can occlude the other side from one or more sensor(s) of the autonomous vehicle; and when the autonomous vehicle is travelling in precipitation, the precipitation can occlude other object(s) in the surrounding environment from one or more sensor(s) of the autonomous vehicle. As a result, the autonomous vehicle may be unable to safely execute one or more maneuver(s) to navigate over the hill, around the blind-corner, or through the precipitation.)

"determining that the visibility metric is below a threshold value and in response to determining that the visibility metric is below the threshold value," (see paragraph 37, as quoted above)

"automatically maneuvering the vehicle to change lanes to an outside lane of the bend." (see paragraph 70, which recites: FIG. 4 depicts a diagram 400 that illustrates an example of deploying a designated autonomous vehicle to assist an occluded autonomous vehicle to safely navigate past an occlusion point. As shown in FIG. 4, a motion plan of vehicle 103 (e.g., occluded autonomous vehicle) can include a maneuver instructing the autonomous vehicle to travel around blind-corner 407. A computing system (e.g., of the vehicle 103, of the remote computing system(s) 104, of the additional vehicle(s) 105) can determine that the blind-corner 407 is an occlusion point because, when the vehicle 103 is turning the blind-corner 407, the region 409 is occluded to the sensor(s) 108 on-board the vehicle 103. The computing system can select and deploy additional vehicle 105 (e.g., designated autonomous vehicle) to observe the occluded region 409 to assist vehicle 103 to safely navigate past the blind-corner 407. The additional vehicle 105 can be deployed such that the additional vehicle 105 can obtain data indicative of the occluded region 409 (e.g., via one or more sensor(s) on-board additional vehicle 105). The additional vehicle 105 can provide the data indicative of occluded region 409 to the vehicle 103. The vehicle 103 can obtain the data indicative of the occluded region 409 from the additional vehicle 105, and determine that the occluded region 409 includes object(s) 411. The vehicle 103 can adjust its motion plan to avoid a collision with object(s) 411, and safely navigate past the blind-corner 407. For example, the vehicle 103 can slow down and/or nudge as the vehicle 103 turns the blind-corner 407.) (see also paragraph 38, where a lane change can be made, or the vehicle can stop, slow, or proceed)

Perko is silent, but Tatourian teaches "...generate a trajectory for the vehicle to change lanes to an outside lane of the bend, and transmit the trajectory to an actuation system of the vehicle that controls the vehicle in accordance with the trajectory so as to automatically maneuver the vehicle to change lanes to the outside lane of the bend" (see paragraph 68, which recites: at block 718, the vehicle assistance server 102 generates vehicle assistance data for the particular vehicle 108 based on the road condition data and the vehicle profile information. In some embodiments, the vehicle assistance server 102 first determines which tasks are to be included in the vehicle assistance data before generating the vehicle assistance data. Determining the tasks before generating the vehicle assistance data allows the vehicle assistance server 102 to deliver the vehicle assistance data to a particular vehicle 108 while conserving computing resources. In some embodiments, the vehicle assistance data may include cruise control data 720, refuel prediction data 722, driver assistance data 724, and/or notification data 726 as discussed above. ... In some embodiments, the refuel prediction data 722 is determined by the vehicle assistance server 102 only if the vehicle profile information indicates that the vehicle 108 actively outputs a distance-to-empty prediction to the driver. The driver assistance data 724 may include adjustments to any number of vehicle parameters controlled by an advanced driver assistance system. For example, an advanced driver vehicle assistance system can include a lane change assistant to assist the driver when changing lanes. In some embodiments, the driver assistance data 724 may include a vehicle control command to indicate to the driver that it is safe to change lanes based on the road condition data and the vehicle profile information of other vehicles 108. For example, when traveling down a two-lane road, visibility might be obstructed, so a driver may not know when it is safe to drive in the lane of on-coming traffic and pass a slower moving vehicle. The driver assistance data 724 may include a notification that informs a driver when it is safe to pass based on the location of other vehicles on the road, even if visibility of the driver is limited. Additionally, the vehicle assistance server 102 can send a notification to approaching vehicles 108 that another vehicle 108 is passing in their lane of traffic. The notification data 726 may indicate to the driver the contents of the vehicle assistance data received. The notification data 726 may include merely a recitation of vehicle parameters that the in-vehicle computing system 110 automatically adjusted, or the notification data 726 may include driving suggestions to the driver, such as to apply more throttle to the engine to prepare for an approaching uphill road grade, or other information.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of PERKO with the teachings of Tatourian with a reasonable expectation of success, since Tatourian teaches that a vehicle may encounter an obstruction on a two-lane road such that it is not known whether it is safe to pass a slower-moving vehicle. See paragraph 68: when traveling down a two-lane road, visibility might be obstructed, so a driver may not know when it is safe to drive in the lane of on-coming traffic and pass a slower moving vehicle. The driver assistance data 724 may include a notification that informs a driver when it is safe to pass based on the location of other vehicles on the road, even if visibility of the driver is limited. Additionally, the vehicle assistance server 102 can send a notification to approaching vehicles 108 that another vehicle 108 is passing in their lane of traffic. The user can then understand that it is now safe, move into the outside lane, pass the vehicle, and return without collision risk, based on the server information and the other-vehicle information. See paragraphs 68-70.

Claims 12-16 and 18-20 are rejected under 35 U.S.C. § 103 as being unpatentable as obvious in view of United States Patent Application Pub.
No.: US20190019416A1 to Perko et al., filed in 2017, in view of United States Patent App. Pub. No.: US 20160280224 A1 to Tatourian et al., filed in 2015, and SILVA.

Perko discloses 12. (New) The method of claim 11, wherein the threshold value varies as a function of a current driving speed of the vehicle. (see paragraphs 63 and 71-73, where the first vehicle 103 can be traveling at a speed and the second vehicle 105 can slow down the vehicle 103, nudge the vehicle 103, or instruct the vehicle 103 to move through the region of occlusion first, with the second vehicle 105 then moving past second)

Perko discloses 13. (New) The method of claim 11, further comprising: determining the outside lane from map data and/or image data from a camera of the vehicle. (see paragraphs 62 and 70-73, where the perception system of the vehicle 105 can warn the vehicle 103 of the occluded zone region)

Perko discloses 14. (New) The method of claim 11, wherein the lane change is performed as a function of a traffic density detected ahead of the vehicle. (see paragraphs 36 and 70-73, where the second vehicle can warn the first vehicle that there is another vehicle around the bend with which it could collide)

Perko discloses 15. (New) The method of claim 11, wherein a current speed of the vehicle is automatically adjusted based on the visibility of the detection unit. (see paragraph 49, where the vehicle is slowed by the second vehicle as it moves into the occluded zone)

Perko discloses 16. (New) The method of claim 11, wherein a required visibility of the detection unit is determined based on a predicted current braking distance of the vehicle. (see paragraphs 27 and 65-73, where the vehicle can warn the second vehicle that it is approaching an occluded zone and provide the braking force and when to apply it)

Claim 17 is cancelled.

Perko discloses 18. (New) The method of claim 11, wherein the outside lane is on a left-hand side of the vehicle in the case of a right-hand bend, and wherein the outside lane is on a right-hand side of the vehicle in the case of a left-hand bend. (see FIG. 4, where the lanes have both the same and an opposite direction of traffic)

Perko discloses "...19. (Currently Amended) A system for autonomously driving a vehicle on a road, the system comprising: a detection unit of the vehicle; a computing unit of the vehicle, wherein the computing unit is electronically coupled to the detection unit and is configured to: detect a bend in the road ahead of the vehicle based on data from the detection unit, calculate a visibility metric of the detection unit for the detected bend, determine that the visibility metric is below a threshold value, and in response to determining that the visibility metric is below the threshold value, automatically maneuver the vehicle to change lanes to an outside lane of the bend." (see abstract, where the vehicle can detect an occlusion in the road that affects the operation of the vehicle along the route and a second vehicle can then assist the first vehicle with traveling along the route) (See FIG. 1, where the first vehicle has a first sensor 108 and the second vehicle has a sensor 108, and each vehicle can communicate with the other, or with the cloud server, to avoid occlusions in the road) (see FIG. 4, where the first vehicle can detect a bend in the road 407 and then detect that there is a limited region where there is no line of sight, and the second vehicle 105 can then detect the object with a second sensor and communicate this obstacle 411 to the first vehicle; see also paragraphs 71-74) (see paragraph 37, as quoted in the rejection of claim 11 above) (see paragraph 70, which recites: FIG. 4 depicts a diagram 400 that illustrates an example of deploying a designated autonomous vehicle to assist an occluded autonomous vehicle to safely navigate past an occlusion point. As shown in FIG. 4, a motion plan of vehicle 103 (e.g., occluded autonomous vehicle) can include a maneuver instructing the autonomous vehicle to travel around blind-corner 407. A computing system (e.g., of the vehicle 103, of the remote computing system(s) 104, of the additional vehicle(s) 105) can determine that the blind-corner 407 is an occlusion point because, when the vehicle 103 is turning the blind-corner 407, the region 409 is occluded to the sensor(s) 108 on-board the vehicle 103. The computing system can select and deploy additional vehicle 105 (e.g., designated autonomous vehicle) to observe the occluded region 409 to assist vehicle 103 to safely navigate past the blind-corner 407.
The additional vehicle 105 can be deployed such that the additional vehicle 105 can obtain data indicative of the occluded region 409 (e.g., via one or more sensor(s) on-board additional vehicle 105). The additional vehicle 105 can provide the data indicative of occluded region 409 to the vehicle 103. The vehicle 103 can obtain the data indicative of the occluded region 409 from the additional vehicle 105, and determine that the occluded region 409 includes object(s) 411. The vehicle 103 can adjust its motion plan to avoid a collision with object(s) 411, and safely navigate past the blind-corner 407. For example, the vehicle 103 can slow down and/or nudge as the vehicle 103 turns the blind-corner 407.) (see also paragraph 38, where a lane change can be made, or the vehicle can stop, slow, or proceed)

The primary reference is silent, but SILVA teaches "...determining that the visibility metric is below a threshold value but can be increased to above the threshold value by changing lanes to an outside lane of the bend having the same direction of travel as the vehicle; and ... in response to determining that the visibility metric is below the threshold value but can be increased to above the threshold value by changing lanes to the outside lane of the bend:" (see paragraphs 71 and 116-120, operation 918, and FIG. 9, block 918, as quoted in the rejection of claim 11 above). It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the teachings of SILVA with the disclosure of PERKO with a reasonable expectation of success, for the reasons given in the rejection of claim 11 above.

Perko discloses 20.
(Currently Amended) An autonomous vehicle, comprising: a detection unit; a computing unit electronically coupled to the detection unit, wherein the computing unit is configured to: detect a bend in a road ahead of the vehicle based on data from the detection unit, calculate a visibility metric of the detection unit for the detected bend, determine that the visibility metric of is below a threshold value, [[when a bend 1s detected ahead of the vehicle]]; and in response to determining that the visibility metric is below the threshold value, automatically maneuver the vehicle to change lanes to an outside lane of the bend. PNG media_image4.png 796 1358 media_image4.png Greyscale (See FIG. 1 where the first vehicle has a first sensor 108 and the second vehicle has a sensor 108 and each vehicle can communicate or can communicate with the cloud server to avoid occlusions in the road) PNG media_image5.png 666 506 media_image5.png Greyscale (see FIG. 4 where the first vehicle can detect a bend in the road 407 and then detect that there is limited region where there is no line of sight and then the second vehicle 105 can detect the object with a second sensor and then can communicate this obstacle 411 to the first vehicle and see paragraph 71-74) (see paragraph 37 where a computing system can determine that an object occludes a sensor of an autonomous vehicle if the sensor is unable to perceive one or more region(s) in a surrounding environment of the autonomous vehicle because of the object (e.g., occluded region(s)). For example, a computing system can determine that a hill, blind-corner, and/or precipitation occludes a sensor of an autonomous vehicle because the hill, blind-corner, and/or precipitation occludes one or more region(s) in the surrounding environment from the sensor of the autonomous vehicle. 
In particular, when the autonomous vehicle is climbing one side of the hill, the hill can occlude the opposite side from one or more sensor(s) of the autonomous vehicle; when the autonomous vehicle is turning the blind-corner, the blind-corner can occlude the other side from one or more sensor(s) of the autonomous vehicle; and when the autonomous vehicle is travelling in precipitation, the precipitation can occlude other object(s) in the surrounding environment from one or more sensor(s) of the autonomous vehicle. As a result, the autonomous vehicle may be unable to safely execute one or more maneuver(s) to navigate over the hill, around the blind-corner, or through the precipitation.)
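The claim logic mapped above (detect a bend, compute a visibility metric for the detection unit, and change lanes to the outside lane of the bend when the metric falls below a threshold) can be sketched as follows. This is purely illustrative: the metric definition, the threshold value, and all names are hypothetical placeholders, not drawn from Perko, Silva, or the claims.

```python
# Illustrative sketch of the claimed decision logic: detect a bend,
# compute a visibility metric for it, and change lanes to the outside
# lane of the bend when the metric falls below a threshold.
# All names, the metric, and the threshold are hypothetical placeholders.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Bend:
    direction: str         # "left" or "right"
    sensor_range_m: float  # how far the detection unit can see into the bend

VISIBILITY_THRESHOLD = 0.5  # hypothetical normalized threshold

def visibility_metric(bend: Bend, required_range_m: float = 150.0) -> float:
    """Toy metric: fraction of a required sight distance actually covered."""
    return min(bend.sensor_range_m / required_range_m, 1.0)

def plan_maneuver(bend: Optional[Bend]) -> str:
    if bend is None:
        return "keep_lane"
    if visibility_metric(bend) < VISIBILITY_THRESHOLD:
        # The outside lane of a left-hand bend is the right lane, and vice versa.
        outside = "right" if bend.direction == "left" else "left"
        return f"change_lane_{outside}"
    return "keep_lane"

print(plan_maneuver(Bend("left", 60.0)))    # visibility 0.4 -> lane change
print(plan_maneuver(Bend("right", 200.0)))  # visibility capped at 1.0 -> keep lane
```

The point of the sketch is only to make the claimed condition concrete: the lane change is triggered by the metric-below-threshold determination, not by a detected obstacle.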
(See paragraph 70, which recites: "FIG. 4 depicts a diagram 400 that illustrates an example of deploying a designated autonomous vehicle to assist an occluded autonomous vehicle to safely navigate past an occlusion point. As shown in FIG. 4, a motion plan of vehicle 103 (e.g., occluded autonomous vehicle) can include a maneuver instructing the autonomous vehicle to travel around blind-corner 407. A computing system (e.g., of the vehicle 103, of the remote computing system(s) 104, of the additional vehicle(s) 105) can determine that the blind-corner 407 is an occlusion point because when the vehicle 103 is turning the blind-corner 407, the region 409 is occluded to the sensor(s) 108 on-board the vehicle 103. The computing system can select and deploy additional vehicle 105 (e.g., designated autonomous vehicle) to observe the occluded region 409 to assist vehicle 103 to safely navigate past the blind-corner 407. The additional vehicle 105 can be deployed such that the additional vehicle 105 can obtain data indicative of the occluded region 409 (e.g., via one or more sensor(s) on-board additional vehicle 105). The additional vehicle 105 can provide the data indicative of occluded region 409 to the vehicle 103. The vehicle 103 can obtain the data indicative of the occluded region 409 from the additional vehicle 105, and determine that the occluded region 409 includes object(s) 411. The vehicle 103 can adjust its motion plan to avoid a collision with object(s) 411, and safely navigate past the blind-corner 407. For example, the vehicle 103 can slow down and/or nudge as the vehicle 103 turns the blind-corner 407.")
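The assistance flow quoted above (a helper vehicle observes the region the occluded vehicle cannot see and reports any obstacles back, and the occluded vehicle adjusts its plan) can be sketched as follows. The message shape, class names, and region identifiers are invented for illustration and do not appear in Perko.

```python
# Illustrative sketch of the assistance flow quoted from Perko's FIG. 4:
# a helper vehicle observes a region the occluded vehicle cannot see and
# reports obstacles, and the occluded vehicle adjusts its motion plan.
# All class, field, and region names are hypothetical.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Observation:
    region_id: str
    obstacles: List[str] = field(default_factory=list)

class HelperVehicle:
    """Stands in for additional vehicle 105 observing occluded region 409."""
    def __init__(self, seen: Dict[str, List[str]]):
        self._seen = seen  # region_id -> obstacles visible to the helper

    def observe(self, region_id: str) -> Observation:
        return Observation(region_id, self._seen.get(region_id, []))

def plan_past_blind_corner(helper: HelperVehicle, region_id: str) -> str:
    """Occluded vehicle (vehicle 103) decides how to take the blind corner."""
    report = helper.observe(region_id)
    if report.obstacles:
        # Paragraph 70: slow down and/or nudge to avoid the reported object.
        return "slow_and_nudge"
    return "proceed"

helper = HelperVehicle(seen={"region_409": ["object_411"]})
print(plan_past_blind_corner(helper, "region_409"))  # obstacle 411 reported
print(plan_past_blind_corner(helper, "region_650"))  # nothing observed there
```

Note the contrast with the claim at issue: here the maneuver responds to a reported obstacle in the occluded region, whereas the claim's lane change responds to the visibility metric itself.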
(See paragraph 38, where a lane change can be made, or stopping, slowing, or proceeding.)

[Image: media_image1.png]

The primary reference is silent, but SILVA teaches "...determining that the visibility metric is below a threshold value but can be increased to above the threshold value by changing lanes to an outside lane of the bend having the same direction of travel as the vehicle; and" (see paragraphs 71 and 116-120, where in operation 918 the process can include controlling a vehicle (e.g., the vehicle 102) based at least in part on the occupancy. For example, by evaluating the occlusion grid 906 that represents a hill in the environment, the vehicle 102 can "see over" the crest of the hill despite the sensors not directly capturing ground data beyond the crest of the hill. Thus, the vehicle 102 can plan a trajectory and/or determine a velocity based on the occupancy of the occlusion grid 906 beyond the crest of the hill. As a non-limiting example, the vehicle 102 may adjust one or more of a velocity, orientation, or position such that information provided by the occlusion grid 906 is sufficient to safely traverse the hill.) "in response to determining that the visibility metric is below the threshold value but can be increased to above the threshold value by changing lanes to the outside lane of the bend:" (See fig. 9, where the vehicle can use a lidar device to provide 1. an un-occluded region and 2. an occluded region where there is a real risk of a collision with an object within the region, and in FIG. 9, block 918, the vehicles can be controlled to stay in the non-occluded region by changing lanes or moving to stay in areas 908 and not 910.)

[Image: media_image2.png]

It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the teachings of SILVA with the disclosure of PERKO with a reasonable expectation of success, since SILVA teaches that an autonomous vehicle can determine an occluded area and a non-occluded area. The vehicle can be controlled to the non-occluded area, where it is safe, and avoid the occluded area when possible. See block 918. In block 1018, the vehicle can be controlled based on the clearance of the occlusion. For example, by evaluating the occlusion grid 906 that represents a hill in the environment, the vehicle 102 can "see over" the crest of the hill despite the sensors not directly capturing ground data beyond the crest of the hill. Thus, the vehicle 102 can plan a trajectory and/or determine a velocity based on the occupancy of the occlusion grid 906 beyond the crest of the hill. As a non-limiting example, the vehicle 102 may adjust one or more of a velocity, orientation, or position such that information provided by the occlusion grid 906 is sufficient to safely traverse the hill.

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action, and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action.
In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEAN PAUL CASS, whose telephone number is (571) 270-1934. The examiner can normally be reached Monday to Friday, 7 am to 7 pm, and Saturday, 10 am to 12 noon.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Scott A. Browne, can be reached at 571-270-0151. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JEAN PAUL CASS/
Primary Examiner, Art Unit 3668

Prosecution Timeline

Nov 21, 2022
Application Filed
Sep 16, 2024
Non-Final Rejection — §103
Dec 11, 2024
Response Filed
Dec 23, 2024
Final Rejection — §103
Apr 03, 2025
Request for Continued Examination
Apr 08, 2025
Response after Non-Final Action
Apr 14, 2025
Non-Final Rejection — §103
Jul 16, 2025
Response Filed
Sep 23, 2025
Final Rejection — §103
Dec 19, 2025
Request for Continued Examination
Feb 12, 2026
Response after Non-Final Action
Feb 27, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593752
SYSTEM AND METHOD FOR CONTROLLING HARVESTING IMPLEMENT OPERATION OF AN AGRICULTURAL HARVESTER BASED ON TILT ACTUATOR FORCE
2y 5m to grant Granted Apr 07, 2026
Patent 12596986
GLOBAL ADDRESS SYSTEM AND METHOD
2y 5m to grant Granted Apr 07, 2026
Patent 12590801
REAL TIME DETERMINATION OF PEDESTRIAN DIRECTION OF TRAVEL
2y 5m to grant Granted Mar 31, 2026
Patent 12583572
MARINE VESSEL AND MARINE VESSEL PROPULSION CONTROL SYSTEM
2y 5m to grant Granted Mar 24, 2026
Patent 12571183
EXCAVATOR
2y 5m to grant Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
73%
Grant Probability
99%
With Interview (+25.9%)
3y 1m
Median Time to Grant
High
PTA Risk
Based on 984 resolved cases by this examiner. Grant probability derived from career allow rate.
