Prosecution Insights
Last updated: April 19, 2026
Application No. 18/896,849

DRIVING ASSISTANCE DEVICE AND DRIVING ASSISTANCE METHOD

Status: Non-Final OA (§103)
Filed: Sep 25, 2024
Examiner: CODUROGLU, JALAL C
Art Unit: 3665
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Honda Motor Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 86% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 6m
Grant Probability with Interview: 92%

Examiner Intelligence

Career Allow Rate: 86% (above average; 262 granted / 305 resolved; +33.9% vs TC avg)
Interview Lift: +6.3% (moderate lift, comparing resolved cases with and without an interview)
Typical Timeline: 2y 6m average prosecution; 21 applications currently pending
Career History: 326 total applications across all art units

Statute-Specific Performance

§101: 4.2% (-35.8% vs TC avg)
§103: 58.1% (+18.1% vs TC avg)
§102: 20.1% (-19.9% vs TC avg)
§112: 5.7% (-34.3% vs TC avg)

Deltas are relative to the Tech Center average estimate. Based on career data from 305 resolved cases.
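The per-statute figures above are internally consistent, which is worth checking before relying on them. A minimal sketch, assuming each "vs TC avg" delta is simply the examiner's rate minus the Tech Center average (an assumption about how the dashboard computes it; the variable names are illustrative):

```python
# Sanity-check the statute-specific figures shown above.
# Assumption: delta = examiner rate - Tech Center average, in percentage points.
rates = {  # statute: (examiner rate %, delta vs TC avg in pp)
    "§101": (4.2, -35.8),
    "§103": (58.1, +18.1),
    "§102": (20.1, -19.9),
    "§112": (5.7, -34.3),
}

for statute, (rate, delta) in rates.items():
    implied_tc_avg = rate - delta  # back out the Tech Center baseline
    print(f"{statute}: examiner {rate}% vs implied TC avg {implied_tc_avg:.1f}%")
```

Under that assumption, all four statutes back out to the same ~40.0% Tech Center baseline, suggesting the chart compared each statute against a single TC-wide reference rate rather than per-statute averages.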

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-4 and 8-9 are rejected under 35 U.S.C. 103 as being obvious over Takada et al., Pub. No. US 20180046193 A1, in view of Herman '308, Pub. No. US 20200257308 A1.

Regarding claims 1 and 8-9, Takada et al.
discloses a driving assistance device ([0045] "the driving assistance server 100 is an example of a driving assistance device") and a method and storage medium comprising: an acquisition unit ([0045] "a driving situation acquisition unit 110" and "a driving characteristics acquisition unit 120") configured to acquire, from a peripheral vehicle existing around a self-vehicle ([0045] "a plurality of vehicles (vehicles 1, 2, and 3)") on which the driving assistance device is mounted ([0089] "the vehicles that may collide with each other include a host vehicle (vehicle 1 in FIG. 20) on which the driving assistance ECU 400 is mounted.").

Takada et al. is not explicit on "vehicle-to-vehicle communication". However, Herman '308, US 20200257308 A1, teaches autonomous vehicle systems utilizing vehicle-to-vehicle communication and discloses peripheral vehicle information indicating a vehicle speed, a position, and a traveling track of the peripheral vehicle through vehicle-to-vehicle communication ([0024] "the host vehicle communicates with the adjacent vehicle via V2V communication. … the host vehicle performs autonomous maneuvers and/or emits an alert for its operator to manually take over control in response to (i) identifying a likelihood-of-collision of the secondary vehicle and (ii) determining that the likelihood-of-collision exceeds a threshold."; [0026] "The range-detection sensors 102 of the illustrated example are arranged on the vehicle 100 to monitor object(s) within a surrounding area of the vehicle 100."; [0028] "DSRC systems incorporating infrastructure information is known as a 'roadside' system. DSRC may be combined with other technologies, such as Global Position System (GPS), Visual Light Communications (VLC), Cellular Communications, and short range radar, facilitating the vehicles communicating their position, speed, heading, relative position to other objects and to exchange information with other vehicles or external computer systems."; [0030] "the communication module 104"). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to use these features disclosed by Herman '308 with the system disclosed by Takada et al. in order to provide autonomous vehicle systems utilizing vehicle-to-vehicle communication (see Abstract and para. [0001]).

Takada et al. further discloses: a prediction unit configured to predict a collision possibility between the self-vehicle and the peripheral vehicle based on self-vehicle information indicating a vehicle speed, a position, and a traveling track of the self-vehicle and the peripheral vehicle information ([0038] "the driving situation of each vehicle … includes, for example, the position information on the vehicle, the behavior information on the vehicle, the peripheral information on the vehicle, and so on. ... The behavior information includes, for example, the speed vector (traveling direction and vehicle speed) of the vehicle … and so on."; [0061] "The driving action instruction unit 140 instructs the vehicle 1 … the notification device mounted on the vehicle 1 ... The notification device mounted on the vehicle 2 can notify the driver 2a, via at least one of sound and display, that the driver should change the lane to the lane 22. These notifications cause the vehicles 1 and 2 to drive according to the indicated driving actions, thus avoiding collision at the intersection 51."; [0039]-[0040] and [0047] "the driving situation management server 200 … includes the future driving situation of each vehicle predicted from the current driving situation of the vehicle. For example, using the current driving situations (position information, behavior information, peripheral information, etc.) of the vehicles, the driving situation management server 200 detects the driving situations of the vehicles that may collide with each other and stores the detected driving situations in the driving situation storage unit 210."); and a notification unit configured to notify an occupant of the self-vehicle based on a prediction result by the prediction unit ([0061] "the notification device mounted on the vehicle 1 can notify the driver 1a, via at least one of sound and display, that the driver should watch for left and right. The notification device mounted on the vehicle 2 can notify the driver 2a, via at least one of sound and display, that the driver should change the lane to the lane 22. These notifications cause the vehicles 1 and 2 to drive according to the indicated driving actions, thus avoiding collision at the intersection 51.").

Further, Takada et al.
is not explicit on "a first range in front of the self-vehicle, set a first determination region" and "a second range on a side of the self-vehicle, set a second determination region". However, Herman '308, US 20200257308 A1, teaches autonomous vehicle systems utilizing vehicle-to-vehicle communication and discloses: wherein the prediction unit is configured to: in a case where the peripheral vehicle exists within a first range in front of the self-vehicle, set a first determination region using a position where the self-vehicle accelerates after decelerating to a threshold value or less or a position where the self-vehicle temporarily stops as a first reference position, and predict a collision possibility between the self-vehicle and the peripheral vehicle in the first determination region; and in a case where the peripheral vehicle exists within a second range on a side of the self-vehicle, set a second determination region using an intersection between a predicted course of the self-vehicle and a predicted course of the peripheral vehicle as a second reference position, and predict a collision possibility between the self-vehicle and the peripheral vehicle in the second determination region.

Herman '308 teaches at paras. [0026], [0028], [0034]-[0038], and [0040]: "The range-detection sensors … on the vehicle 100 to monitor object(s) within a surrounding area of the vehicle 100. ... The proximity sensors are configured to detect the presence, proximity, and/or location of object(s) near the vehicle 100. … to enable nearby object(s) to be identified and located. … the range-detection sensors 102 are located on each side of the vehicle 100 (e.g., front, rear, left, right) to enable the range-detection sensors 102 in monitoring each portion of the surrounding area of the vehicle 100." Paras. [0040], [0047]-[0048], and [0075] teach first/second thresholds corresponding to the claimed first/second range and determination region: "Upon determining the collision probability of the adjacent vehicle, the ambient controller 110 compares the collision probability to one or more thresholds. For example, the ambient controller 110 compares the collision probability to a first threshold and a second threshold that is less than the first threshold. In response to the ambient controller 110 determining that the collision probability is greater than the first threshold, the autonomy unit 108 autonomously performs (e.g., for the ADAS) a defensive driving maneuver to prevent the vehicle 100 from being involved in a collision caused by the adjacent vehicle. For example, the autonomous defensive driving maneuver includes deceleration, emergency braking, changing of lanes, changing of position within a current lane of travel, etc. In some examples, the autonomy unit 108 is configured to initiate the defensive driving maneuver before the takeover time of the adjacent vehicle has been completed. That is, the ambient controller 110 is configured to cause the autonomy unit 108 to perform the defensive driving maneuver before the operator of the adjacent vehicle manually takes over control of the adjacent vehicle. … to determining that the collision probability is greater than a threshold (e.g., the second threshold, a third threshold)."

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to use these features disclosed by Herman '308 with the system disclosed by Takada et al. in order to provide a controller that determines the collision probability of the adjacent vehicle further based on data collected from the remote server and is configured to emit an alert to request manual takeover responsive to determining that the collision probability is less than the first threshold and greater than a second threshold.
(see Abstract and paras. [0007]-[0008]).

Regarding claim 2, Takada et al. discloses the driving assistance device according to claim 1, wherein in a case where the peripheral vehicle exists within the first range, the prediction unit predicts that there is a possibility that the self-vehicle and the peripheral vehicle collide with each other based on an intersection between a predicted turning track preset for the self-vehicle and a predicted course of the peripheral vehicle being included in the first determination region ([0040] "FIG. 2 is a diagram showing an example of the driving situation of the vehicles that may collide with each other. The driving situation management server 200 uses the current driving situations of the vehicles to predict the vehicles that may collide with each other, based on a predetermined collision prediction method. For example, the driving situation management server 200 predicts a first arrival time at which a vehicle 1 will reach the intersection, where the predicted course of the vehicle 1 and the predicted course of a vehicle 2 intersect, and a second arrival time at which the vehicle 2 will reach the intersection. If the difference between the first arrival time and the second arrival time is equal to or less than a predetermined time difference, the driving situation management server 200 predicts that the vehicle 1 and the vehicle 2 are in the driving situation in which they may collide at the intersection. Note that the method for predicting the vehicles that may collide with each other is not limited to the method described above but a known method may be applied.").

Regarding claims 3-4, Takada et al. discloses the driving assistance device according to claim 2. Takada et al.
is not explicit on "change in a steering angle". However, Herman '308, US 20200257308 A1, teaches autonomous vehicle systems utilizing vehicle-to-vehicle communication and discloses (claim 3) wherein the prediction unit updates the predicted turning track based on a change in a steering angle of the self-vehicle, and (claim 4) wherein the prediction unit selects the predicted turning track from a plurality of candidates of the predicted turning track based on a steering angle of the self-vehicle at the first reference position ([0036] "the communication module 104 collects data from the adjacent vehicle that identifies … (vi) a (relative) position of the adjacent vehicle, … (viii) a steering angle rate-of-change of the adjacent vehicle … and/or any other information that facilitates the ambient controller 110 in monitoring the adjacent vehicle."; [0039] "the ambient controller 110 is configured to determine the time-to-collision of the adjacent vehicle based on a velocity, an acceleration, a direction-of-travel, a distance to the object, a required steering angle to avoid the object, a steering angle rate-of-change, and/or other measured characteristics of the adjacent vehicle that the communication module 104 collects from the adjacent vehicle via V2V communication. Further, the ambient controller 110 is configured to determine a collision probability for the vehicle 100 based on the collision probability of the adjacent vehicle."). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to use these features disclosed by Herman '308 with the system disclosed by Takada et al. in order to determine the time-to-collision based on a required steering angle to avoid the object and a steering angle rate-of-change of the adjacent vehicle (see Abstract and para. [0005]).

Claim 5 is rejected under 35 U.S.C. 103 as being obvious over Takada et al., Pub.
No. US 20180046193 A1, in view of Herman '308, Pub. No. US 20200257308 A1, further in view of MORITA et al., Pub. No. US 20230192192 A1.

Regarding claim 5, Takada et al. discloses the driving assistance device according to claim 1. Takada et al. is not explicit on "determination region to include the first reference position and to be offset". However, MORITA et al., US 20230192192 A1, teaches a driving assistance device and discloses wherein the prediction unit sets the first determination region to include the first reference position and to be offset to an opposite lane side with respect to the self-vehicle in a direction orthogonal to a predicted course of the self-vehicle ([0041]-[0047], [0060]-[0061], and [0070]: "The object recognition unit 11 acquires object detection information from the camera sensor 21 and the radar sensor 22, and uses the feature points obtained from the camera sensor 21 and the object position information obtained from the radar sensor 22 to recognize that an object exists at that position. The object recognition unit 11 also associates the position and speed of each object relative to the own vehicle, and calculates the lateral velocity of the object that is a relative velocity in a direction orthogonal to the traveling direction of the own vehicle and a longitudinal velocity that is a relative velocity in the traveling direction of the own vehicle, based on the associated relative position and relative speed. The object recognition unit 11 may be configured to recognize an object detected in a determination region set in a predetermined region around the own vehicle, as a target object with which a collision is to be avoided. The object recognition unit 11 is also capable of recognizing the positions and sizes of on-road structures and white lines."; [0072] "The secondary determination region setting unit 14 may be configured to set the region offset L5 based on the type of the primary target T1, or may be configured to set the region offset L5 based on the detection accuracy of objects around the own vehicle 50, or may be configured to set the region offset L5 based on a region where the primary target T1 may exist after a lapse of a time to collision TTC. For example, depending on the detection accuracy of objects, it may be detected that a plurality of objects exists for one primary target T1. In this case, an object detected at a shorter lateral distance from the primary target T1 than the region offset L5 can be treated as identical to the primary target T1 to compensate for the detection accuracy of objects, whereby it is possible to set the secondary determination region in a more appropriate manner."). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to use these features disclosed by MORITA et al. with the system disclosed by Takada et al. in order to provide a driving assistance device that executes a collision avoidance control of an own vehicle if it is determined that an object around the own vehicle may collide with the own vehicle (see Abstract and para. [0002]).

Claims 6-7 are rejected under 35 U.S.C. 103 as being obvious over Takada et al., Pub. No. US 20180046193 A1, in view of Herman '308, Pub. No. US 20200257308 A1, further in view of JEONG et al., Pub. No. US 20240149921 A1.

Regarding claims 6-7, Takada et al. discloses the driving assistance device according to claim 1. Takada et al.
is not explicit on "the second determination region having different shape … vehicle existing within the second range is on an opposite lane side and on an opposite side of an opposite lane side". However, JEONG et al., US 20240149921 A1, teaches a method and device with autonomous driving plan and discloses:

(claim 6) wherein the prediction unit sets the second determination region having a different shape between a case where the peripheral vehicle existing within the second range is on an opposite lane side with respect to the self-vehicle and a case where the peripheral vehicle existing within the second range is on an opposite side of an opposite lane side with respect to the self-vehicle, and a length of a part located closer to the self-vehicle than the second reference position in the second determination region in a case where the peripheral vehicle is on an opposite side of an opposite lane side with respect to the self-vehicle is longer than a length of a part located closer to the self-vehicle than the second reference position in the second determination region in a case where the peripheral vehicle is on an opposite lane side with respect to the self-vehicle (see paras. [0060]-[0064] and [0081]: "the electronic device may determine the ROI 320 based on the driving lane 381 of the moving object 310 (which is based on the driving plan) and/or a scheduled driving lane 382. ... based on the width of the lane or a shape following the road alignment of the lane … may include a rear region and/or a front region"; [0061] "the ROI 320 may be determined to be a region having a shape with a length following the road alignment of the driving lane 381 and a width of the driving lane 381. … the ROI 320 may be determined to be a region having a shape with a length following the road alignment and widths of the driving lane 381 and the scheduled driving lane 382."; [0062] "the length of the front region may vary depending on the velocity of the moving object 310 and the length of the rear region may be fixed."; [0064] "a region of a plane parallel with the ground and/or a region of a plane intersecting with the ground."; [0065] "When the driving plan includes a turn (e.g., a left turn or a right turn) of the moving object 310, the electronic device may determine the ROI 320 (e.g., a region having a width corresponding to a single lane) to include a portion of the road 380 or a path through which a vehicle passes when the vehicle turns. When the driving plan includes a lane change of the moving object 310, the electronic device may determine the ROI 320 (e.g., a region having a width corresponding to two lanes)"; [0081] "the electronic device may select a target position offset in the adjustable driving range 520 based on the predicted position of the moving object. A plurality of position offsets 521 may be in the adjustable driving range 520. … an offset spaced apart in the front direction based on the predicted position of the moving object, an offset spaced apart in the rear direction, an offset spaced apart in a first side direction (e.g., the left side based on the moving direction of the moving object 510), and an offset spaced apart in a second side direction (e.g., the right side based on the moving direction of the moving object 510) opposite to the first side direction."; [0128] "the electronic device may accurately determine a risk level by identifying a situation of the opposite lane through sufficiently securing the FOV. The electronic device may efficiently establish an autonomous driving plan including overtaking.");

(claim 7) wherein in a case where the peripheral vehicle exists within the second range, and a first predicted time until the peripheral vehicle reaches the second reference position is smaller than a first time threshold value, the prediction unit predicts that there is a possibility that the self-vehicle and the peripheral vehicle collide with each other if a difference between a second predicted time until the self-vehicle reaches the second reference position and the first predicted time is smaller than a second time threshold value, and the prediction unit predicts that there is no possibility that the self-vehicle and the peripheral vehicle collide with each other if the difference between the second predicted time and the first predicted time is larger than the second time threshold value (see paras. [0055], [0059], [0068], [0086], [0110], and [0135]: "The prediction position of the moving object may represent a position where the moving object is positioned when the moving object moves along the driving plan after a certain amount of time (e.g., an amount of time from the current time point). … the electronic device may predict the position 311 of the moving object 310 at a prediction time point based on a path along the driving plan of the moving object 310 and a speed (e.g., V) of the moving object 310. The prediction time point represents a time point where a time (e.g., time t.sub.sim) has elapsed from a reference time point (e.g., a current time point)."; [0135] "The predicted position of the moving object may be determined to be a position to which the moving object moves along the road alignment of a lane corresponding to the driving plan at a constant velocity from a reference time point (e.g., a current time point) to a predicted time point.").
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to use these features disclosed by JEONG et al. with the system disclosed by Takada et al. to provide a plurality of position offsets in the adjustable driving range, which may include an offset spaced apart from the sensor in a front direction, a rear direction, a first side direction, or a second side direction of the moving object that may be opposite to the first side direction, wherein the processor may be configured to select the target position offset from the plurality of position offsets based on the predicted position of the moving object (see Abstract and para. [0016]).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See Notice of References Cited.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jalal C CODUROGLU, whose telephone number is (408) 918-7527. The examiner can normally be reached Monday-Friday, 8-6 PT.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Hunter Lonsberry, can be reached at 571-272-7298. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/Jalal C CODUROGLU/
Examiner, Art Unit 3665

Prosecution Timeline

Sep 25, 2024
Application Filed
Jan 10, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600501 · MOBILE ROBOTIC ARM FOR MOMENTUM UNLOADING AND ORBIT CONTROL (granted Apr 14, 2026; 2y 5m to grant)
Patent 12600489 · ELECTRICAL MONITORING SYSTEM AND METHOD FOR VTOL AIRCRAFT (granted Apr 14, 2026; 2y 5m to grant)
Patent 12600466 · LANDING GEAR ASSEMBLIES, ROTORCRAFT AND ROTORCRAFT METHODS (granted Apr 14, 2026; 2y 5m to grant)
Patent 12595045 · SYSTEM AND METHOD TO MINIMIZE AN AIRCRAFT GROUND TURN RADIUS (granted Apr 07, 2026; 2y 5m to grant)
Patent 12589884 · AIRCRAFT CONTROL SYSTEM FAILURE EVENT SEARCH (granted Mar 31, 2026; 2y 5m to grant)
Study what changed in each case to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 86%
Grant Probability with Interview: 92% (+6.3%)
Median Time to Grant: 2y 6m
PTA Risk: Low

Based on 305 resolved cases by this examiner. Grant probability derived from career allow rate.
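The headline projections can be reproduced from the career statistics above. A minimal sketch, assuming the dashboard rounds the raw allow rate to a whole percentage and applies the +6.3-point interview lift additively (both are assumptions about its methodology; the variable names are illustrative):

```python
# Reproduce the headline projections from the examiner's career stats.
# Assumptions: grant probability = rounded career allow rate, and the
# with-interview figure adds the +6.3pp lift before rounding.
granted, resolved = 262, 305
allow_rate = 100 * granted / resolved  # ~85.9%
interview_lift = 6.3                   # percentage points

print(round(allow_rate))                   # grant probability -> 86
print(round(allow_rate + interview_lift))  # with interview    -> 92
```

Both rounded values match the displayed 86% and 92%, so the projections appear to be a direct restatement of the career allow rate rather than a case-specific model.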
