Prosecution Insights
Last updated: April 19, 2026
Application No. 18/330,334

AUTOMATED DRIVING CONTROL DEVICE AND STORAGE MEDIUM STORING AUTOMATED DRIVING CONTROL PROGRAM

Status: Non-Final OA (§103)
Filed: Jun 06, 2023
Examiner: UNDERWOOD, BAKARI
Art Unit: 3663
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: DENSO CORPORATION
OA Round: 3 (Non-Final)
Grant Probability: 70% (Favorable)
OA Rounds: 3-4
To Grant: 3y 3m
With Interview: 89%

Examiner Intelligence

Career Allow Rate: 70% (137 granted / 196 resolved; +17.9% vs TC avg; above average)
Interview Lift: +19.1% (strong; grant rate with vs. without interview, among resolved cases with interview)
Avg Prosecution: 3y 3m (typical timeline); 39 applications currently pending
Total Applications: 235 (career history, across all art units)
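The headline figures above are simple ratios and differences over the examiner's career data. A minimal sketch of how such dashboard numbers can be derived follows; the formulas are assumptions about how the dashboard computes its percentages, and the 52.0% Tech Center baseline is a hypothetical value implied by the displayed +17.9% delta, not a documented figure.

```python
# Sketch of the examiner metrics above. Counts come from this section;
# the formulas are assumed, and tc_avg_allow is a hypothetical baseline
# implied by the displayed "+17.9% vs TC avg" delta.

granted = 137           # career grants
resolved = 196          # career resolved cases
tc_avg_allow = 0.520    # assumed Tech Center average allow rate

allow_rate = granted / resolved             # ~0.699, shown rounded as 70%
vs_tc_avg = allow_rate - tc_avg_allow       # shown as "+17.9% vs TC avg"

with_interview = 0.891  # predicted grant probability with interview
baseline = 0.700        # overall predicted grant probability
interview_lift = with_interview - baseline  # shown as "+19.1%"

print(f"Career allow rate: {allow_rate:.1%}")
print(f"Vs TC average: {vs_tc_avg:+.1%}")
print(f"Interview lift: {interview_lift:+.1%}")
```

Note that 137/196 is 69.9%, which the dashboard rounds up to 70%.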

Statute-Specific Performance

§101: 14.0% (-26.0% vs TC avg)
§103: 57.6% (+17.6% vs TC avg)
§102: 9.7% (-30.3% vs TC avg)
§112: 14.8% (-25.2% vs TC avg)
Tech Center averages are estimates. Based on career data from 196 resolved cases.
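The per-statute deltas above are all consistent with a single baseline: each displayed rate minus its delta equals 40.0%. A small sketch under that assumption follows; the 40.0% figure is inferred from the displayed deltas, not stated anywhere in the data.

```python
# Reproduces the statute-specific deltas above. Rates are from the table;
# the single 40.0% Tech Center average is inferred from the displayed
# deltas (rate - delta = 40.0 for every row), not a documented value.

TC_AVG = 40.0  # assumed Tech Center average for each statute, in percent

rates = {"101": 14.0, "103": 57.6, "102": 9.7, "112": 14.8}

for statute, rate in rates.items():
    delta = rate - TC_AVG
    print(f"S{statute}: {rate:.1f}% ({delta:+.1f}% vs TC avg)")
```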

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11/24/2025 has been entered.

Status of Claims

This is a Non-Final Action for Request for Continued Examination (RCE) application Serial No. 18/330,334. Claims 1-13 and 15-20 have been examined and fully considered. Claims 1, 8, 10-13 and 17 have been amended. Claims 14-15 have been cancelled. Claim 20 is newly added. Claims 1-13 and 15-20 are pending in the instant application.

Response to Arguments/Rejections

Applicant's arguments with respect to claims 1, 8, 10-13 and 17 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-5, 8-13 and 16-20 are rejected under 35 U.S.C. 103 as being unpatentable over Kaji et al. (Pub. No.: US 2021/0146962; previously recorded), hereinafter referred to as “Kaji”, in view of Okajima et al. (Pub.
No.: US 2019/0354108; previously recorded), hereinafter referred to as “Okajima”, in view of Abe et al. (Pub. No.: US 2017/0337810 A1), hereinafter referred to as “Abe”, and in view of Oniwa et al. (Pub. No.: US 2019/0311207), hereinafter referred to as “Oniwa”.

Regarding [claim 1], Kaji discloses an automated driving control device (“an automated driving controller 300”) capable of performing eyes-off automated driving without periphery monitoring obligation by a driver (see at least Abstract; see at least Paragraph [0060]: “the vehicle control system is applied to an automated driving vehicle capable of automated driving (autonomous driving). In principle, the automated driving refers to causing a vehicle to travel in a state in which no operation of an occupant is required, and is considered to be a type of driving assistance. The automated driving vehicle can also be caused to travel through manual driving” and [0086]: “The monitoring determiner 140B determines whether the occupant in the driver's seat is monitoring the surroundings of the host vehicle M on the basis of the direction of the line of sight or the face detected by the image processor 140A.
In the following description, a state in which the occupant is monitoring the surroundings of the host vehicle M will be referred to as "eyes on", and a state in which the occupant is not monitoring the surroundings of the host vehicle M will be referred to as "eyes off".”), the device (“an automated driving controller 300”) comprising: a different vehicle grasping unit (“outside world recognizer 321” ***Interpreting as vehicle grasping unit***) configured to grasp at least existence of a front vehicle in a subject vehicle lane in which a subject vehicle is positioned and existence of a side vehicle that is adjacent to the subject vehicle and is positioned in an adjacent lane adjacent to the subject vehicle lane (see at least Paragraph [0090]: “The outside world recognizer 321 recognizes a state such as a position, a speed, and an acceleration of a nearby vehicle on the basis of information input from the camera 10, the radar device 12, and the finder 14 via the object recognition device 16. The position of the nearby vehicle may be represented by a representative point such as a centroid or a corner of the nearby vehicle or may be represented by an area represented by a contour of the nearby vehicle” and [0136]: “an image showing the nearby vehicle m recognized by the outside world recognizer 321 to be displayed in the surroundings detection information display area 600-1. The HMI controller 120 causes an image showing all nearby vehicles m recognized by the outside world recognizer 321 to be displayed on the first display 450.
Further, the HMI controller 120 causes only the nearby vehicle m affecting a future trajectory of the host vehicle M among all the nearby vehicles m recognized by the outside world recognizer 321 to be displayed on the first display 450.”); …recognizing a first traffic congestion state in which a vehicle speed of the subject vehicle is equal to or less than a predetermined speed and all of the front vehicle in the subject vehicle lane (see at least Paragraph [0199]: “In the example of FIG. 20, scene (4) is a scene in which the host vehicle M follows the nearby vehicle m in a traffic jam, and switching from the driving assistance at the second level to the driving assistance at the third level is performed.” and [0200]: “Scene (5) is a scene in which low-speed following traveling (TJP; Traffic Jam Pilot), which is an example of the driving assistance at the third level, is being executed. The low-speed following traveling is a control aspect in which the host vehicle follows a preceding vehicle at a predetermined speed or less. The predetermined speed is, for example, 60 [km/h], as described above. The low-speed following traveling is executed when a speed of the preceding vehicle m is equal to or lower than a predetermined speed and an inter-vehicle distance between the host vehicle and the preceding vehicle m is smaller than a predetermined distance (for example, about 50 [m]).”)… and a permission controller (“HMI controller 120”) that is configured to permit a start of the eyes-off automated driving in the first traffic congestion state and is configured not to permit the start of the eyes-off automated driving in the second traffic congestion state (see at least Paragraph [0137]: “the HMI controller 120 causes information indicating the level of the driving assistance (including automated driving) that can be executed by the host vehicle M to be displayed in the driving assistance state display area 620-1. In the example of FIG.
12, an image 621 showing three indicators “Assist”, “Hands Off”, and “Eyes Off” is shown as information indicating the level of the driving assistance. The level of the driving assistance is represented by each indicator alone or a combination of a plurality of indicators”; [0140]: “The indicator “Eyes Off” is an indicator indicating a state (ON state) in which the driving assistance at the third level is being executed, or a state (OFF state) in which transition to the driving assistance at the third level can be made” and [0164]: “When the level of the driving assistance is the second level, the occupant status monitor 140 of the master controller 100 determines whether the occupant is in the eyes-ON state or in the eyes-OFF state on the basis of the captured image of the in-vehicle camera 90 in order to check whether or not the occupant fulfills the surroundings monitoring obligation” and [0169]: “Further, when the level of the driving assistance is the third level and the occupant is not obligated to monitor the surroundings, but switching from the driving assistance at the third level to the driving assistance in which the occupant is obliged to monitor the surroundings has been performed, it is necessary for the occupant to rapidly monitor the surroundings. Therefore, the monitoring determiner 140B continues monitoring to check how awake the occupant is. For example, when the level of the driving assistance is the third level”), wherein the permission controller (“HMI controller 120”) permits continuation of the eyes-off automated driving when a periphery of the subject vehicle transitions to the second traffic congestion state after the eyes-off automated driving starts in the first traffic congestion state (see at least Paragraphs [0163]: “the HMI controller 120 causes an indication that the driving assistance ("automated traveling" in FIG.
14) at the second level is started, but the occupant is caused to continuously monitor a surrounding traffic situation to be displayed in the surroundings detection information display area 600-3”; [0167]: “when the monitoring determiner 140B has determined that the occupant is in the eyes-OFF state, that is, when the occupant does not fulfill the surroundings monitoring obligation, the HMI controller 120 causes an image to be displayed on the first display 450 or the third display 470 of the HMI 400 or a sound to be output from the speaker, thereby warning the occupant so that the occupant monitors the surroundings. When the eyes-off state continues during a predetermined time or more, the switching controller 110 may cause the driving assistance controller 200 to perform the driving assistance control by switching the level of the driving assistance from the second level to the first level” and [0168]: “Further, when the eyes-off state continues during a predetermined time or more after the monitoring determiner 140B determines that the occupant is in the eyes-off state, the switching controller 110 may cause the automated driving controller 300 to perform alternative control instead of performing the automated driving control according to the second level.
The alternative control is, for example, automated driving control for causing the host vehicle M to stop in an area in which the host vehicle M is allowed to stop, such as a road shoulder, while causing the host vehicle M to gradually decelerate”)…

Kaji does not expressly disclose … the traffic congestion recognition unit determines whether a traffic congestion state in the periphery of the subject vehicle transitions from the first traffic congestion state to the second traffic congestion state, and the permission controller permits the continuation of the eyes-off automated driving even when the traffic congestion recognition unit determines the traffic congestion in the periphery of the subject vehicle transitions to the second traffic congestion state, and the first traffic congestion state is a state where the subject vehicle is not possible to perform a lane change, and the second traffic congestion state is a state where the subject vehicle is possible to perform the lane change.

Additionally, Okajima teaches … a traffic congestion recognition unit (see at least Abstract: “an external environment recognizer 121”) configured to recognize a first traffic congestion state in which a vehicle speed of the subject vehicle is equal to or less than a predetermined speed and all of the front vehicle in the subject vehicle lane and the side vehicle in the adjacent lane exist (see at least Figures 6 and 9; Paragraphs [0045]: “The external environment recognizer 121 recognizes states of a nearby vehicle(s) such as the position, speed and acceleration thereof on the basis of information that is input from the camera 10, the radar device 12, and the finder 14 directly or via the object recognition device 16. The position of the nearby vehicle may be represented by a representative point such as a center of gravity or a corner of the nearby vehicle or may be represented by a region expressed by a contour of the nearby vehicle.
The “states” of the nearby vehicle may include an acceleration or jerk of the nearby vehicle or a “behavior state” (for example, whether or not the nearby vehicle is changing or is going to change lanes). The external environment recognizer 121 may also recognize the positions of guardrails or utility poles, parked vehicles, pedestrians, and other objects in addition to nearby vehicles” [0046] and [0055]: “Here, the automated driving which is performed mainly by the first controller 120 is executed in one of a plurality of automated driving modes. The automated driving modes include an automated driving mode which is executed at a second predetermined speed (for example, 60 km/h) or less. An example of this is a low speed following travel (traffic jam pilot: TJP) in which the own vehicle M follows a preceding vehicle at the time of congestion. In the low speed following travel, safe automated driving can be realized by following a preceding vehicle on a congested freeway. It is to be noted that the first predetermined speed and the second predetermined speed may be equal or the first predetermined speed may be slightly higher than the second predetermined speed”) and a second traffic congestion state in which the vehicle speed of the subject vehicle is equal to or less than the predetermined speed, the front vehicle exists in the subject vehicle lane, and the side vehicle does not exist in the adjacent lane (see at least Paragraph [0045]: “The external environment recognizer 121 recognizes states of a nearby vehicle(s) such as the position, speed and acceleration thereof on the basis of information that is input from the camera 10, the radar device 12, and the finder 14 directly or via the object recognition device 16. The position of the nearby vehicle may be represented by a representative point such as a center of gravity or a corner of the nearby vehicle or may be represented by a region expressed by a contour of the nearby vehicle.
The “states” of the nearby vehicle may include an acceleration or jerk of the nearby vehicle or a “behavior state” (for example, whether or not the nearby vehicle is changing or is going to change lanes). The external environment recognizer 121 may also recognize the positions of guardrails or utility poles, parked vehicles, pedestrians, and other objects in addition to nearby vehicles” [0046] and [0055]: “Here, the automated driving which is performed mainly by the first controller 120 is executed in one of a plurality of automated driving modes. The automated driving modes include an automated driving mode which is executed at a second predetermined speed (for example, 60 km/h) or less. An example of this is a low speed following travel (traffic jam pilot: TJP) in which the own vehicle M follows a preceding vehicle at the time of congestion. In the low speed following travel, safe automated driving can be realized by following a preceding vehicle on a congested freeway. It is to be noted that the first predetermined speed and the second predetermined speed may be equal or the first predetermined speed may be slightly higher than the second predetermined speed”); …

Accordingly, it would have been obvious to one of ordinary skill in the art before the filing of the invention to incorporate a traffic congestion recognition unit recognizing the first and second traffic congestion states, as taught by Okajima, into the automated driving control device taught by Kaji, performing the same function with a reasonable expectation of success. One would be motivated to make this modification in order to provide a vehicle control system, a vehicle control method, and a vehicle control program which can enhance the adaptability of automated driving (see Okajima Paragraph [0005]).
Neither Kaji nor Okajima teaches … the traffic congestion recognition unit determines whether a traffic congestion state in the periphery of the subject vehicle transitions from the first traffic congestion state to the second traffic congestion state, and the permission controller permits the continuation of the eyes-off automated driving even when the traffic congestion recognition unit determines the traffic congestion in the periphery of the subject vehicle transitions to the second traffic congestion state, and the first traffic congestion state is a state where the subject vehicle is not possible to perform a lane change, and the second traffic congestion state is a state where the subject vehicle is possible to perform the lane change.

Additionally, Abe teaches … the traffic congestion recognition unit determines whether a traffic congestion state in the periphery of the subject vehicle transitions from the first traffic congestion state to the second traffic congestion state, and the permission controller permits the continuation of the … automated driving even when the traffic congestion recognition unit determines the traffic congestion in the periphery of the subject vehicle transitions to the second traffic congestion state (see Paragraphs [0013]-[0023]: “the automated drive section of the vehicle control system controls the vehicle according to an automated driving plan which reflects the estimated future traffic condition. Accordingly, the automated drive section organizes a plan to control the vehicle in accordance with the future traffic condition” and [0113]: “The customize processing of the customize table 520 for the automatic driving control is executed by the CPU executing a non-illustrated routine similar to the routine shown in FIG. 3. The automatic driving control is performed by the CPU executing a non-illustrated routine similar to the routine shown in the FIG. 4. Briefly, the CPU executes following processing…”; [0149]; and [0151]-[0153]).
Kaji teaches the eyes-off state continuing in a driving state in which the nearby vehicle m is in a traffic jam, with switching of the driving assistance, and Okajima teaches adaptability of automated driving by recognizing states of nearby vehicles. Accordingly, it would have been obvious to one of ordinary skill in the art before the filing of the invention to incorporate the information concerning the presence or degree of traffic congestion based on the future traffic condition, thus providing information concerning traffic congestion as taught by Abe. One would be motivated to make this modification in order to implement automated travel safely even when the external situation has changed.

Additionally, Oniwa teaches … the first traffic congestion state is a state where the subject vehicle is not possible to perform a lane change, and the second traffic congestion state is a state where the subject vehicle is possible to perform the lane change (see Paragraph [0071]: “In the second control state, it is possible to perform driving control such as adaptive cruise control (ACC), lane keeping assistance system (LKAS), auto lane changing (ALC), driver lane changing (DLC) and the like. The ACC is, for example, driving control for causing the own vehicle M to follow a preceding vehicle. The LKAS is, for example, driving control for maintaining a lane in which the own vehicle M travels. The ALC is driving control for performing, for example, lane change without requiring the occupant's turn signal operation on the basis of route setting of the navigation device 50. The ALC includes, for example, lane change in automated overtaking control and lane change at a branch.”; and [0078]: “During execution of the third control state, lane change or the like is not performed.
If there is interruption of another vehicle during execution of the third control state, the control state changer 142 may allow the HMI controller 180 to cause the HMI 30 to output a request to monitor the surroundings of the own vehicle M (hereafter referred to as eyes-on). For example, if the speed limit of the speed sign of the lane in which the own vehicle M is traveling changes during the operation of one of the first to third control states, the control state changer 142 performs the processing described above on the basis of the changed speed limit and performs an operation of the corresponding control state.”). Oniwa teaches the travel situations of the own vehicle, where “travel situations of the own vehicle M include, for example, the position and speed of the own vehicle M, the positions and speeds of other vehicles, a congestion situation…” (see Paragraph [0061]), and the first control state, where “When the own vehicle M is traveling on a general road or the like and the speed limit is less than 40 km/h, the control state changer 142 changes the control state to the first control state (with control degree 1) and performs the driving control in the first control state. The first control state includes, for example, a state in which the steering and acceleration/deceleration of the own vehicle M are controlled by the occupant's driving operation to cause the own vehicle M to travel (so-called manual driving).” (see Paragraph [0069]). Accordingly, it would have been obvious to one of ordinary skill in the art before the filing of the invention to implement determining a traffic congestion state in which it is possible to perform the lane change, as taught by Oniwa. One would be motivated to make this modification in order to provide a vehicle control device, a vehicle control method, and a storage medium with which it is possible to perform driving control that matches traffic flow (see Paragraph [0005]).
As to [claim 2], the combination of Kaji, Okajima, Abe and Oniwa teaches the automated driving control device according to claim 1. Kaji discloses further comprising a time measurement unit (“monitoring determiner 140B”) configured to measure an elapsed time from the start of the eyes-off automated driving (see at least Paragraph [0168]: “Further, when the eyes-off state continues during a predetermined time or more after the monitoring determiner 140B determines that the occupant is in the eyes-off state, the switching controller 110 may cause the automated driving controller 300 to perform alternative control instead of performing the automated driving control”), wherein the permission controller (“HMI controller 120”) permits the continuation of the eyes-off automated driving in the second traffic congestion state when the elapsed time is within a predetermined time (see [0167]: “When the eyes-off state continues during a predetermined time or more, the switching controller 110 may cause the driving assistance controller 200 to perform the driving assistance control by switching the level of the driving assistance from the second level to the first level.”), and does not permit the continuation of the eyes-off automated driving in the second traffic congestion state when the elapsed time exceeds the predetermined time (see at least Figure 12; and Paragraph [0229]: “In this case, the action plan generator 323 of the automated driving controller 300 determines a target speed when the driving assistance at the third level is continued, to be a speed equal to or lower than a reference speed (that is, 80 [km/h] or 100 [km/h]) or a speed equal to or lower than a current speed of the host vehicle M (that is, a speed equal to or lower than 60 [km/h]).
Accordingly, when the occupant is not in the eyes-on state, the host vehicle M can be caused to accelerate relatively gently without causing the host vehicle M to accelerate to an original speed determined as an upper limit speed of the driving assistance at the third level or the vehicle can be caused to travel so that a current vehicle speed is kept. A state in which the driving assistance at the third level (automated driving) is continued at the target speed equal to or lower than the reference speed or equal to or lower than the current speed of the host vehicle M is an example of a "third automated driving mode"”). As to [claim 3], the combination of Kaji, Okajima, Abe and Oniwa teaches the automated driving control device according to claim 1. Kaji discloses further comprising a posture grasping unit configured to grasp a driving posture of the driver (see at least Paragraph [0027]: “host vehicle position recognizer 322 recognizes a relative position and posture of a host vehicle M”; [0092]: “The host vehicle position recognizer 322 recognizes, for example, a lane (traveling lane) in which the host vehicle M is traveling, and a relative position and posture of the host vehicle M with respect to the traveling lane. 
The host vehicle position recognizer 322, for example, compares a pattern (for example, an arrangement of solid lines and broken lines) of road demarcation lines obtained from the second map information 62 with a pattern of road demarcation lines around the host vehicle M recognized from an image captured by the camera 10 to recognize a traveling lane”), wherein the permission controller (“HMI controller 120”) is configured to determine the continuation of the eyes-off automated driving in the second traffic congestion state according to the driving posture of the driver (see at least Paragraph [0163]: “the image showing the host vehicle M recognized by the host vehicle position recognizer 322, the image showing the nearby vehicle m recognized by the outside world recognizer 321, and a future trajectory image 602 showing a future trajectory of the host vehicle M generated by the action plan generator 323 to be displayed in a surroundings detection information display area 600-3. Further, the HMI controller 120 causes an indication that the driving assistance ("automated traveling" in FIG. 14) at the second level is started, but the occupant is caused to continuously monitor a surrounding traffic situation to be displayed in the surroundings detection information display area 600-3”). As to [claim 4], the combination of Kaji, Okajima, Abe and Oniwa teaches the automated driving control device according to claim 1. 
Kaji discloses further comprising a road type grasping unit (“a recommended lane determiner 61”) configured to grasp a road type of a road on which the subject vehicle is traveling (see at least Paragraphs [0071]-[0072]), wherein the permission controller is configured to determine the continuation of the eyes-off automated driving in the second traffic congestion state according to the road type (see at least Paragraph [0163]: “Further, the HMI controller 120 causes, for example, an image showing a shape of a road in front of the host vehicle M, which has been acquired from the second map information 62, the image showing the host vehicle M recognized by the host vehicle position recognizer 322, the image showing the nearby vehicle m recognized by the outside world recognizer 321, and a future trajectory image 602 showing a future trajectory of the host vehicle M generated by the action plan generator 323 to be displayed in a surroundings detection information display area 600-3. Further, the HMI controller 120 causes an indication that the driving assistance ("automated traveling" in FIG. 14) at the second level is started, but the occupant is caused to continuously monitor a surrounding traffic situation to be displayed in the surroundings detection information display area 600-3”). As to [claim 5], the combination of Kaji, Okajima, Abe and Oniwa teaches the automated driving control device according to claim 1. 
Kaji discloses further comprising a task grasping unit configured to grasp a content of a task other than driving performed by the driver during the eyes-off automated driving (see at least Paragraph [0269]: “the image processor 140A that detects the direction of the face or line of sight of the occupant of the host vehicle M from the captured image of the in-vehicle camera 90, the automated driving controller 300 that executes automated driving control, a switching controller 110 that switches the driving assistance executed by the automated driving controller 300 from the driving assistance at the second level in which a predetermined task is required to any one of a plurality of driving assistances including the driving assistance at the third level in which a level of a task required of the occupant is lower than the driving assistance at the second level, in which when the condition including the direction of the face or line of sight of the occupant detected by the image processor 140A being within the first angle range 401 is satisfied” and [0270]: “since a current speed of the host vehicle M is caused to be kept or decelerate or the host vehicle M is not caused to accelerate to an original upper limit speed of the driving assistance at the second level in a case in which the direction of the face or line of sight of the occupant is not in the first angle range 401 or the direction of the face or line of sight of the occupant is not detected, and a condition for switching the level of the driving assistance from the third level to the second level is not satisfied, it is possible to lower a level of difficulty of the automated driving control as compared with the driving assistance at the second level. 
As a result, it is possible to continue to perform the automated driving while decreasing a level of the task required of the occupant”; and [0273]: “Therefore, when a task of the occupant increases due to switching of the level of the driving assistance, it is possible to inform the occupant of the fact in advance”), wherein the permission controller (“HMI controller 120”) is configured to determine the continuation of the eyes-off automated driving in the second traffic congestion state according to the content of the task (see at least Paragraphs [0168]: “Further, when the eyes-off state continues during a predetermined time or more after the monitoring determiner 140B determines that the occupant is in the eyes-off state, the switching controller 110 may cause the automated driving controller 300 to perform alternative control instead of performing the automated driving control according to the second level. The alternative control is, for example, automated driving control for causing the host vehicle M to stop in an area in which the host vehicle M is allowed to stop, such as a road shoulder, while causing the host vehicle M to gradually decelerate” and [0273]: “the task of the occupant increases due to switching of the level of the driving assistance, the level of the driving assistance is switched in a case in which approval has been received from the occupant. Therefore, it is possible to cause the occupant to execute the task more reliably.”).

Regarding [claim 8], claim 8 recites limitations analogous to those of claim 1; therefore, claim 8 is rejected on the same/similar basis set forth above. Kaji discloses an automated driving control device (“an automated driving controller 300”) capable of performing eyes-off automated driving without periphery monitoring obligation by a driver (see at least Abstract; see at least Paragraph [0060]: “the vehicle control system is applied to an automated driving vehicle capable of automated driving (autonomous driving).
In principle, the automated driving refers to causing a vehicle to travel in a state in which no operation of an occupant is required, and is considered to be a type of driving assistance. The automated driving vehicle can also be caused to travel through manual driving”), the device (“an automated driving controller 300”) comprising: … and does not permit the start of the eyes-off automated driving when the subject vehicle is traveling in the passing lane and the traffic congestion recognition unit has recognized the second traffic congestion state (see at least Paragraph [0259]: “ON/OFF of the driving assistance at the third level, and (I) monitoring of driving of the occupant required/not required with respect to the passage of time as switching relevant to the driving assistance are shown”)… As to [claim 9], the combination of Kaji, Okajima, Abe and Oniwa teaches the automated driving control device according to claim 8. Kaji further teaches a notification execution unit (“HMI 400”) configured to perform a notification (see at least Paragraph [0084]: “The HMI controller 120 causes the HMI 400 to output, for example, a notification relevant to switching of the level of the driving assistance. Further, the HMI controller 120 may cause information on determination results of one or both of the operator status determiner 130”)… however, Kaji does not explicitly disclose a notification execution unit configured to perform a notification for encouraging the driver to perform lane change to the traveling lane when the permission controller permits the start of the eyes-off automated driving based on the recognition of the traffic congestion state due to a lane change from the passing lane to the traveling lane.
However, Okajima teaches a notification execution unit (“a notification controller 130”) configured to perform a notification for encouraging the driver to perform lane change to the traveling lane when the permission controller permits the start of the eyes-off automated driving based on the recognition of the traffic congestion state due to a lane change from the passing lane to the traveling lane (see at least Paragraph [0058]: “The notification controller 130 causes an output unit (such as various display devices or a speaker) included in the HMI 30 or the navigation HMI 52 to output predetermined information, for example, when the own vehicle M is expected to reach a low speed section within a predetermined time or there is a low speed section within a predetermined distance in the travel direction of the own vehicle M. The predetermined information is, for example, information indicating that a low speed section will be reached in a predetermined time” and [0079]: “Further, the notification controller 130 may cause the HMI 30 to output the notification information only when the own vehicle M needs to change lanes. Here, the own vehicle M needs to change lanes when the own vehicle M is not traveling in the lane L2 connected to the branch road L3 (i.e. the own vehicle M is traveling in the lane L1) if there is a branch point for entering the branch road L3 from the lane L2 of a main line when the own vehicle M is to travel along a predetermined route as shown in FIG. 8”; [0080] and [0097]).
Accordingly, it would have been obvious to one of ordinary skill in the art before the filing of the invention to incorporate a notification execution unit configured to perform a notification for encouraging the driver to perform lane change to the traveling lane as taught by Okajima into the automated driving control device taught by Kaji. One would be motivated to make this modification in order to provide a vehicle control system, a vehicle control method, and a vehicle control program which can enhance the adaptability of automated driving (see Okajima Paragraph [0005]). Regarding [claim 10], it recites limitations analogous to those of claim 1; therefore, claim 10 is rejected on the same/similar basis as set forth above. Kaji discloses an automated driving control device (“an automated driving controller 300”) capable of performing eyes-off automated driving without periphery monitoring obligation by a driver (see at least Abstract; Paragraph [0060]: “the vehicle control system is applied to an automated driving vehicle capable of automated driving (autonomous driving). In principle, the automated driving refers to causing a vehicle to travel in a state in which no operation of an occupant is required, and is considered to be a type of driving assistance. The automated driving vehicle can also be caused to travel through manual driving” and [0086]: “The monitoring determiner 140B determines whether the occupant in the driver's seat is monitoring the surroundings of the host vehicle M on the basis of the direction of the line of sight or the face detected by the image processor 140A.
In the following description, a state in which the occupant is monitoring the surroundings of the host vehicle M will be referred to as "eyes on", and a state in which the occupant is not monitoring the surroundings of the host vehicle M will be referred to as "eyes off".”), the device (“an automated driving controller 300”) comprising: a lane determination unit (“a recommended lane determiner 61”) configured to determine whether a subject vehicle is traveling in a passing lane (see at least Paragraph [0071]: “The recommended lane determiner 61 divides the route provided from the navigation device 50 into a plurality of blocks (for example, divides the route every 100 [m] in a progressing direction of the vehicle) and determines a recommended lane for each block by referring to the second map information 62. The recommended lane determiner 61 determines in which lane from the left the host vehicle M travels. The recommended lane determiner 61 determines the recommended lane so that the host vehicle M can travel on a reasonable route for progression to a branch destination when there is a branch place, a merging place, or the like in the route”); …recognizing a traffic congestion state at a periphery of the subject vehicle (see at least Paragraph [0163]: “the HMI controller 120 causes an indication that the driving assistance ("automated traveling" in FIG. 14) at the second level is started, but the occupant is caused to continuously monitor a surrounding traffic situation to be displayed in the surroundings detection information display area 600-3.”); a permission controller (“HMI controller 120”) configured to set a first permission condition to be stricter than a second permission condition (see at least Paragraph [0061]: “a level of the driving assistance include a first level, a second level with a higher level of control (automation rate) than the first level, and a third level with a higher level of control than the second level.
In the driving assistance at the first level, for example, driving assistance control is executed by operating a driving assistance device such as an adaptive cruise control system (ACC) or a lane keeping assistance system (LKAS). In the driving assistance at the second level and the third level, for example, automated driving for automatically controlling both acceleration/deceleration and steering of the vehicle without requiring an operation of the occupant with respect to the driving operator is executed. With the execution of the driving assistance, the occupant is assigned a task (obligation) according to the level of the driving assistance. For example, in the driving assistance at the first level and the second level, the occupant is obliged to monitor the surroundings, whereas in the driving assistance at the third level, the occupant is not obligated to monitor the surroundings ( or a level of surroundings monitoring obligation is low).”), wherein the first permission condition is a condition that permits a start of the eyes-off automated driving based on the traffic congestion state when the subject vehicle travels in the passing lane (see at least Paragraph [0114]: “The main switch 412 is a switch for setting a state in which the driving assistance can be started (a standby state). 
In other words, the main switch 412 is a switch for starting a process (an internal process) in a preparatory stage before the driving assistance is executed or a switch enabling a determination whether or not the driving assistance can be started”), and the second permission condition is a condition that permits the eyes-off automated driving based on the traffic congestion state when the subject vehicle travels in a traveling lane different from the passing lane (see at least Paragraph [0152]: “the HMI controller 120 causes the requested motion notification image 622 schematically showing an operation content of the occupant for setting a state in which the hand of the occupant has been released from the steering wheel 82 to be displayed in the driving assistance state display area 620-2 of the third screen IM3-2, as information on a method in which the occupant operates to perform switching to the driving assistance at the second level when the driving assistance at the first level is being executed and the driving assistance at the second level can be executed”); and a different vehicle grasping unit (“outside world recognizer 321”) configured to grasp existence of a different vehicle at the periphery of the subject vehicle (see at least Paragraph [0090]: “The outside world recognizer 321 recognizes a state such as a position, a speed, and an acceleration of a nearby vehicle on the basis of information input from the camera 10, the radar device 12, and the finder 14 via the object recognition device 16. The position of the nearby vehicle may be represented by a representative point such as a centroid or a corner of the nearby vehicle or may be represented by an area represented by a contour of the nearby vehicle” and [0136]: “an image showing the nearby vehicle m recognized by the outside world recognizer 321 to be displayed in the surroundings detection information display area 600-1. 
The HMI controller 120 causes an image showing all nearby vehicles m recognized by the outside world recognizer 321 to be displayed on the first display 450. Further, the HMI controller 120 causes only the nearby vehicle m affecting a future trajectory of the host vehicle M among all the nearby vehicles m recognized by the outside world recognizer 321 to be displayed on the first display 450.”), Kaji does not explicitly disclose … a traffic congestion recognition unit…; … wherein the first permission condition to permit the start of the eyes-off automated driving includes a state where the subject vehicle has recognized a rear vehicle, the rear vehicle being the different vehicle traveling in a lane where the subject vehicle travels and follows the subject vehicle. However, Okajima teaches a traffic congestion recognition unit configured to recognize a traffic congestion state at a periphery of the subject vehicle (see at least Paragraph [0045]: “The external environment recognizer 121 recognizes states of a nearby vehicle(s) such as the position, speed and acceleration thereof on the basis of information that is input from the camera 10, the radar device 12, and the finder 14 directly or via the object recognition device 16. The position of the nearby vehicle may be represented by a representative point such as a center of gravity or a corner of the nearby vehicle or may be represented by a region expressed by a contour of the nearby vehicle. The “states” of the nearby vehicle may include an acceleration or jerk of the nearby vehicle or a “behavior state” (for example, whether or not the nearby vehicle is changing or is going to change lanes).
The external environment recognizer 121 may also recognize the positions of guardrails or utility poles, parked vehicles, pedestrians, and other objects in addition to nearby vehicles” [0046] and [0055]: “Here, the automated driving which is performed mainly by the first controller 120 is executed in one of a plurality of automated driving modes. The automated driving modes include an automated driving mode which is executed at a second predetermined speed (for example, 60 km/h) or less. An example of this is a low speed following travel (traffic jam pilot: TJP) in which the own vehicle M follows a preceding vehicle at the time of congestion. In the low speed following travel, safe automated driving can be realized by following a preceding vehicle on a congested freeway. It is to be noted that the first predetermined speed and the second predetermined speed may be equal or the first predetermined speed may be slightly higher than the second predetermined speed”); … Accordingly, it would have been obvious to one of ordinary skill in the art before the filing of the invention to incorporate a traffic congestion recognition unit teaching the first and second states as taught by Okajima into the automated driving control device taught by Kaji. One would be motivated to make this modification in order to provide a vehicle control system, a vehicle control method, and a vehicle control program which can enhance the adaptability of automated driving (see Okajima Paragraph [0005]). While Kaji and Okajima teach recognition of surrounding vehicles, neither Kaji, Okajima, nor Abe teaches wherein the first permission condition to permit the start of the eyes-off automated driving includes a state where the subject vehicle has recognized a rear vehicle, the rear vehicle being the different vehicle traveling in a lane where the subject vehicle travels and follows the subject vehicle.
However, Oniwa teaches that the first permission condition to permit the start of the eyes-off automated driving includes a state where the subject vehicle has recognized a rear vehicle, the rear vehicle being the different vehicle traveling in a lane where the subject vehicle travels and follows the subject vehicle (see, Paragraphs [0074]-[0075]: “When the travel situation of the own vehicle M is a predetermined situation, the control state changer 142 executes the third control state (with control degree 4) on the basis of the speed limit of the traffic sign. The third control state (with control degree 4) is, for example, driving control in which the occupant is permitted to be hands-off and eyes-off. The third control state includes, for example, a state of driving control using a traffic jam pilot (hereinafter referred to as a TJP). The TJP is, for example, a control mode in which the own vehicle M follows a preceding vehicle at a predetermined speed (for example, 60 km/h) or less or a control mode in which the own vehicle M follows a preceding vehicle while traveling on an expressway.”). Accordingly, it would have been obvious to one of ordinary skill in the art before the filing of the invention to implement determining, based on the traffic congestion state, when it is possible to perform the lane change, as taught by Oniwa. One would be motivated to make this modification in order to provide a vehicle control device, a vehicle control method, and a storage medium with which it is possible to perform driving control that matches traffic flow (see, Paragraph [0005]). Regarding [claim 11], it recites limitations analogous to those of claims 1 and 8; therefore, claim 11 is rejected on the same/similar basis as set forth above.
Kaji discloses an automated driving control device (“an automated driving controller 300”) capable of performing eyes-off automated driving without periphery monitoring obligation by a driver (see at least Abstract; Paragraph [0060]: “the vehicle control system is applied to an automated driving vehicle capable of automated driving (autonomous driving). In principle, the automated driving refers to causing a vehicle to travel in a state in which no operation of an occupant is required, and is considered to be a type of driving assistance. The automated driving vehicle can also be caused to travel through manual driving” and [0086]: “The monitoring determiner 140B determines whether the occupant in the driver's seat is monitoring the surroundings of the host vehicle M on the basis of the direction of the line of sight or the face detected by the image processor 140A. In the following description, a state in which the occupant is monitoring the surroundings of the host vehicle M will be referred to as "eyes on", and a state in which the occupant is not monitoring the surroundings of the host vehicle M will be referred to as "eyes off".”), the device (“an automated driving controller 300”) comprising: … Regarding [claim 12], it recites limitations analogous to those of claims 8 and 11; therefore, claim 12 is rejected on the same/similar basis as set forth above. Kaji discloses an automated driving control device (“an automated driving controller 300”) capable of, by using information of an autonomous sensor (see at least Paragraph [0068]: “The object recognition device 16 performs a sensor fusion process on detection results of some or all of the camera 10, the radar device 12, and the finder 14 to recognize a position, type, speed, and the like of the object.
The object recognition device 16 outputs a recognition result to the automated driving controller 300.”) which monitors a periphery of a subject vehicle (see, Paragraph [0276]: “The predictor 150 predicts a future status of the host vehicle M or another vehicle on the basis of a recognition result of the object recognition device 16, a recognition result of the outside world recognizer 321 of the automated driving controller 300,”), performing eyes-off automated driving without periphery monitoring obligation by a driver (see at least Abstract; Paragraph [0060]: “the vehicle control system is applied to an automated driving vehicle capable of automated driving (autonomous driving). In principle, the automated driving refers to causing a vehicle to travel in a state in which no operation of an occupant is required, and is considered to be a type of driving assistance. The automated driving vehicle can also be caused to travel through manual driving” and [0086]: “The monitoring determiner 140B determines whether the occupant in the driver's seat is monitoring the surroundings of the host vehicle M on the basis of the direction of the line of sight or the face detected by the image processor 140A. In the following description, a state in which the occupant is monitoring the surroundings of the host vehicle M will be referred to as "eyes on", and a state in which the occupant is not monitoring the surroundings of the host vehicle M will be referred to as "eyes off".”), the device (“an automated driving controller 300”) comprising: … Regarding [claim 13], it recites limitations analogous to those of claims 8 and 11-12; therefore, claim 13 is rejected on the same/similar basis as set forth above. As to [claim 14], the combination of Kaji, Okajima and Abe teaches the automated driving control device according to claim 1.
Kaji discloses wherein the first traffic congestion state is a state where the subject vehicle is possible to perform a lane change, and the second traffic congestion state is a state where the subject vehicle is not possible to perform the lane change (see, Paragraph [0090]: “The outside world recognizer 321 recognizes a state such as a position, a speed, and an acceleration of a nearby vehicle on the basis of information input from the camera 10, the radar device 12, and the finder 14 via the object recognition device 16. The position of the nearby vehicle may be represented by a representative point such as a centroid or a corner of the nearby vehicle or may be represented by an area represented by a contour of the nearby vehicle. The “state” of the nearby vehicle may include an acceleration, a jerk, or an “action state” (for example, whether or not the nearby vehicle is changing lanes or is about to change lanes) of the nearby vehicle”). Additionally, Abe also teaches (see, Paragraphs [0088]: “[Mode C] Mode C is a mode in which the automated drive degree is the next highest to Mode B. When Mode C is in execution, the vehicle occupant needs to perform confirmation operation for the HMI 70 in accordance with the situation. In Mode C, automatic lane change is conducted when the vehicle occupant is notified of the time to change lanes and performs operation to change lanes for the HMI 70, for example. It is therefore necessary for the vehicle occupant to keep watch on the circumstances around the vehicle M and the state of the vehicle M”; and [0093]: “The position of each surrounding vehicle is represented by a representative point thereof, such as the center of gravity or corners of the vehicle or may be represented by a region expressed in the vehicle's outline. The conditions of each surrounding vehicle may include information on the acceleration of the same and whether the vehicle of interest is changing lanes or is going to change lanes. 
Such information is known based on the information from the above described various devices. In addition to the surrounding vehicles, the outside recognizing section 142 may recognize the positions of guardrails, telephone poles, parked vehicles, pedestrians, and other objects.”). Accordingly, it would have been obvious to one of ordinary skill in the art before the filing of the invention to incorporate the information concerning the presence or degree of traffic congestion based on the future traffic condition, thus providing information concerning traffic congestion as taught by Abe. One would be motivated to make this modification in order to implement automated travel safely even when the external situation has changed. As to [claim 16], the combination of Kaji, Okajima and Abe teaches the automated driving control device according to claim 10. Abe teaches wherein the first permission condition includes a condition that a vehicle speed of the subject vehicle is equal to or less than a predetermined speed and all of: a front vehicle in a subject vehicle lane where the subject vehicle travels; a side vehicle that is adjacent to the subject vehicle and is in an adjacent lane with respect to the subject vehicle lane; and the rear vehicle exist, and the second permission condition includes a condition that the vehicle speed of the subject vehicle is equal to or less than the predetermined speed and all of the front vehicle and the side vehicle exist (see, Figure 9; and Paragraph [0102]). Accordingly, it would have been obvious to one of ordinary skill in the art before the filing of the invention to incorporate the information concerning the presence or degree of traffic congestion based on the future traffic condition, thus providing information concerning traffic congestion as taught by Abe. One would be motivated to make this modification in order to implement automated travel safely even when the external situation has changed.
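The condition structure recited in claim 16 — a first (passing-lane) permission condition requiring a vehicle speed at or below a predetermined threshold together with front, side, and rear vehicles, and a second (traveling-lane) condition requiring only the front and side vehicles — can be sketched as follows. This is an illustrative sketch only: the names, the dataclass, and the 60 km/h figure (borrowed from the TJP speed quoted from Okajima and Oniwa) are hypothetical and do not appear in the claims or the cited references.

```python
from dataclasses import dataclass

@dataclass
class Periphery:
    front_vehicle: bool   # vehicle ahead in the subject vehicle's lane
    side_vehicle: bool    # adjacent vehicle in the neighboring lane
    rear_vehicle: bool    # following vehicle in the subject vehicle's lane

def second_permission(speed_kmh: float, p: Periphery, limit_kmh: float = 60.0) -> bool:
    # Second condition (traveling lane): speed at or below the threshold,
    # with front and side vehicles present.
    return speed_kmh <= limit_kmh and p.front_vehicle and p.side_vehicle

def first_permission(speed_kmh: float, p: Periphery, limit_kmh: float = 60.0) -> bool:
    # First (stricter) condition (passing lane): the second condition plus
    # a recognized rear vehicle, as mapped to Oniwa for claim 10.
    return second_permission(speed_kmh, p, limit_kmh) and p.rear_vehicle
```

The stricter/looser relationship claimed is visible directly in the code: every periphery that satisfies the first condition necessarily satisfies the second, but not conversely.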
As to [claim 17], the combination of Kaji, Okajima and Abe teaches the automated driving control device according to claim 11. While Kaji teaches eyes-off automated driving, Abe further teaches wherein the permission controller counts a number of occurrences of a state where the traffic congestion state has occurred again, and ends the … automated driving when the number of the occurrences exceeds a predetermined number (see, Paragraph [0148]: “The collected information 322B includes the identification information and position information of vehicles m and the information of the destination set in each vehicle m and further includes information concerning the behavior of the vehicle m traveling on the road (speed, time at which the vehicle m passes a certain point, and the like). For example, the collected information 322B includes the automated driving plan, the section where the vehicle m performs manual drive, the average speed during manual drive, and other information. The information concerning the behavior is another example of the auxiliary information. The automated driving plan includes the segments where the vehicle m is expected to perform automated drive, an action plan of automated drive, and the like. The average speed during manual drive may be acquired from the vehicle m or may be acquired from a device other than the vehicle m, for example. The traffic condition estimation apparatus 300 acquires the average vehicle speed in each segment through which the vehicle m is expected to pass, for example. Herein, the average vehicle speed is transmitted from the information providing server 250”; and [0149]-[0150]: “The tally section 308 tallies the number of vehicles m which are expected to pass through each of at least one road segment during each time period for each automated driving mode based on the time at which the vehicles m are expected to pass through the segment by the estimating section 306. FIG.
24 is a diagram illustrating an example of the tally result 328A of the tally section 308 of Modification 2”). Accordingly, it would have been obvious to one of ordinary skill in the art before the filing of the invention to incorporate the information concerning the presence or degree of traffic congestion based on the future traffic condition, thus providing information concerning traffic congestion as taught by Abe. One would be motivated to make this modification in order to implement automated travel safely even when the external situation has changed. As to [claim 18], the combination of Kaji, Okajima and Abe teaches the automated driving control device according to claim 12. Abe teaches wherein information of the autonomous sensor is information different from the traffic congestion information, and the traffic congestion information is acquired by wireless communication with a roadside device (See, Figure 12; and Paragraph [0112]: “FIG. 12 is a diagram illustrating an example of the configuration of a traffic condition estimation system 1. The traffic condition estimation system 1 includes plural vehicles m-1 to m-k (k is an arbitrary natural number), a traffic information providing server 250, and a traffic condition estimation apparatus 300. The vehicles m-1 to m-k are referred to as vehicles m if not distinguished in particular. Some or all of the vehicles m are provided with some or all of the configurations of the vehicle control system 100 and the other devices illustrated in FIG. 2”). Accordingly, it would have been obvious to one of ordinary skill in the art before the filing of the invention to incorporate the information concerning the presence or degree of traffic congestion based on the future traffic condition, thus providing information concerning traffic congestion as taught by Abe. One would be motivated to make this modification in order to implement automated travel safely even when the external situation has changed. As to [claim 19],
the combination of Kaji, Okajima and Abe teaches the automated driving control device according to claim 13. Abe teaches wherein the information of the autonomous sensor is information different from the traffic congestion information (See, Figure 12). As to [claim 20], the combination of Kaji, Okajima, Abe and Oniwa teaches the automated driving control device according to claim 1. Oniwa teaches wherein in the state where the subject vehicle is not possible to perform the lane change, in all of the adjacent lane, the side vehicle adjacent to the subject vehicle exists, the front vehicle in front of the subject vehicle exists, and also an adjacent front vehicle exists in front of the adjacent vehicle, and in the state where the subject vehicle is possible to perform the lane change, the front vehicle and the adjacent front vehicle exist and the adjacent vehicle does not exist (see, Paragraph [0071]: “In the second control state, it is possible to perform driving control such as adaptive cruise control (ACC), lane keeping assistance system (LKAS), auto lane changing (ALC), driver lane changing (DLC) and the like. The ACC is, for example, driving control for causing the own vehicle M to follow a preceding vehicle. The LKAS is, for example, driving control for maintaining a lane in which the own vehicle M travels. The ALC is driving control for performing, for example, lane change without requiring the occupant's turn signal operation on the basis of route setting of the navigation device 50. The ALC includes, for example, lane change in automated overtaking control and lane change at a branch.”; and [0078]: “During execution of the third control state, lane change or the like is not performed. If there is interruption of another vehicle during execution of the third control state, the control state changer 142 may allow the HMI controller 180 to cause the HMI 30 to output a request to monitor the surroundings of the own vehicle M (hereafter referred to as eyes-on).
For example, if the speed limit of the speed sign of the lane in which the own vehicle M is traveling changes during the operation of one of the first to third control states, the control state changer 142 performs the processing described above on the basis of the changed speed limit and performs an operation of the corresponding control state.”). Oniwa further teaches the travel situations of the own vehicle, where “travel situations of the own vehicle M include, for example, the position and speed of the own vehicle M, the positions and speeds of other vehicles, a congestion situation…” (see Paragraph [0061]), and the first control state, where “When the own vehicle M is traveling on a general road or the like and the speed limit is less than 40 km/h, the control state changer 142 changes the control state to the first control state (with control degree 1) and performs the driving control in the first control state. The first control state includes, for example, a state in which the steering and acceleration/deceleration of the own vehicle M are controlled by the occupant's driving operation to cause the own vehicle M to travel (so-called manual driving).” (see Paragraph [0069]). Accordingly, it would have been obvious to one of ordinary skill in the art before the filing of the invention to implement determining, based on the traffic congestion state, when it is possible to perform the lane change, as taught by Oniwa. One would be motivated to make this modification in order to provide a vehicle control device, a vehicle control method, and a storage medium with which it is possible to perform driving control that matches traffic flow (see, Paragraph [0005]). Claim(s) 6-7 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kaji in view of Okajima, and in view of Kazutaka Yoshikawa (JP2005324661A; the NPL citations are based on the provided English Translation), hereinafter referred to as “Yoshikawa”.
As to [claim 6], the combination of Kaji, Okajima, Abe and Oniwa teaches the automated driving control device according to claim 1. Kaji does not explicitly disclose wherein the traffic congestion recognition unit includes a re-traffic congestion counter configured to predict that a traffic congestion is solved, and count reoccurrence of the traffic congestion after predicting that the traffic congestion is solved, and the permission controller does not permit the continuation of the eyes-off automated driving in the second traffic congestion state during the reoccurrence of the traffic congestion when a count of the reoccurrence of the traffic congestion is equal to or less than a predetermined value, and permits the continuation of the eyes-off automated driving in the second traffic congestion state during the reoccurrence of the traffic congestion when the count of the reoccurrence of the traffic congestion exceeds the predetermined value. However, Yoshikawa teaches wherein the traffic congestion recognition unit (“a traffic congestion determination unit”) includes a re-traffic congestion counter configured to predict that a traffic congestion is solved (see at least Paragraph [0035]: “The traffic congestion determination unit determines that traffic congestion has occurred, and when traffic congestion has occurred, the automatic driving start determination unit starts automatic driving if the length of the section where traffic congestion has occurred is equal to or greater than a predetermined value, and the automatic driving stop determination unit stops automatic driving if an automatic driving stop condition is satisfied after automatic driving has been started.
In this case, the automatic driving determination processing unit 35 corresponds to the traffic congestion determination unit, the automatic driving start determination unit, and the automatic driving stop determination unit”), and count reoccurrence of the traffic congestion after predicting that the traffic congestion is solved (see at least Paragraph [0038]: “First, the vehicle control device repeatedly determines whether or not there is a traffic jam based on the traffic jam determination by the camera 31 at predetermined time intervals (for example, 16 msec). In other words, if the autonomous driving judgment processing unit 35 determines that a traffic jam has occurred based on the results of the image processing performed by the image processing unit 34, it determines that a traffic jam section exists ahead of the vehicle, and if the autonomous driving judgment processing unit 35 determines that a traffic jam has not occurred based on the results of the image processing performed by the image processing unit 34, it determines that no traffic jam section exists ahead of the vehicle” and [0040]: “When it is determined that there is a traffic jam, the vehicle control device determines whether or not there is traffic jam information at that location. That is, it is determined whether or not the traffic information acquired by the navigation device 11 includes congestion information regarding a congestion section that exists ahead of the vehicle. If there is congestion information at that point, the vehicle control device obtains the congestion length from the congestion information. That is, the length of the congestion section included in the congestion information is obtained. Moreover, if there is no congestion information at that point, the vehicle control device judges whether or not there is driving pattern data. If there is driving pattern data, the vehicle control device calculates the congestion length from the driving pattern data. 
That is, the length of the congestion section is calculated based on the driving pattern data”), and the permission controller does not permit the continuation of the eyes-off automated driving in the second traffic congestion state during the reoccurrence of the traffic congestion when a count of the reoccurrence of the traffic congestion is equal to or less than a predetermined value (see at least Paragraph [0053]: “First, the vehicle control device judges whether or not there is congestion information at that location. That is, it is determined whether or not the traffic information acquired by the navigation device 11 includes congestion information regarding a congestion section that exists ahead of the vehicle. Generally, it can be considered that traffic information provided by systems such as VICS® is more accurate for expressways than for general roads. Therefore, when the vehicle is traveling on a highway, the vehicle control device does not determine whether or not there is a traffic jam by using the camera 31 to determine whether or not there is a traffic jam.”), and permits the continuation of the eyes-off automated driving in the second traffic congestion state during the reoccurrence of the traffic congestion when the count of the reoccurrence of the traffic congestion exceeds the predetermined value (see at least Paragraphs [0057]-[0058]; [0060]: “Step S28: Automatic operation is started. In step S29, it is determined whether the automatic driving stop condition is satisfied. If the automatic driving stop condition is not satisfied, the determination is repeated at a predetermined time interval (for example, 16 msec), and if the automatic driving stop condition is satisfied, the process ends” and [0061]: “the vehicle control device determines whether or not a traffic jam has occurred, and if a traffic jam has occurred, starts automatic driving according to the situation of the traffic jam.
After the automatic driving starts, the automatic driving is stopped when the automatic driving stop condition is satisfied. This allows automatic starting and stopping of automated driving in response to traffic congestion appropriately, without causing inconvenience to the driver”).

Accordingly, it would have been obvious to one of ordinary skill in the art before the filing of the invention to further implement a re-traffic congestion counter configured to predict that a traffic congestion is solved, as taught by Yoshikawa, in combination with the automated driving control device as taught by Kaji in view of Okajima. One would be motivated to make this modification in order to convey that the control device can reduce the inconvenience to the driver by performing automatic driving while the vehicle is traveling through a congested section. In addition, since vehicle speeds are low when traveling through congested sections, automated driving can easily be achieved even when traveling on public roads, where automated driving is generally considered difficult (see at least Paragraph [0003]).

As to [claim 7], the combination of Kaji, Okajima, Abe, Oniwa, and Yoshikawa teaches the automated driving control device according to claim 6. Kaji discloses wherein in a case where the adjacent lane exists only on one side of the subject vehicle lane, the permission controller sets the predetermined value to be lower than a value in a case where the adjacent lane exists on both sides of the subject vehicle lane (see at least Figure 4; [0090]: “The outside world recognizer 321 recognizes a state such as a position, a speed, and an acceleration of a nearby vehicle on the basis of information input from the camera 10, the radar device 12, and the finder 14 via the object recognition device 16.
The position of the nearby vehicle may be represented by a representative point such as a centroid or a corner of the nearby vehicle or may be represented by an area represented by a contour of the nearby vehicle”; [0097]-[0099]: “In the example of FIG. 4, a state in which the action plan generator 323 has set the lane change target position TAs is shown. In FIG. 4, mA denotes a preceding vehicle, mB denotes the front reference vehicle, and mC denotes the rear reference vehicle. Further, an arrow d indicates the progressing (traveling) direction of the host vehicle M. In the example of FIG. 4, the action plan generator 323 sets the lane change target position TAs between the front reference vehicle mB and the rear reference vehicle mC on the adjacent lane L2.”).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BAKARI UNDERWOOD, whose telephone number is (571) 272-8462. The examiner can normally be reached M-F, 8:00 to 4:30. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Abby Flynn, can be reached at (571) 272-9855. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/B.U./
Examiner, Art Unit 3663

/JAMES M MCPHERSON/
Examiner, Art Unit 3663

Prosecution Timeline

Jun 06, 2023
Application Filed
Mar 13, 2025
Non-Final Rejection — §103
Jun 06, 2025
Applicant Interview (Telephonic)
Jun 13, 2025
Examiner Interview Summary
Jun 26, 2025
Response Filed
Sep 12, 2025
Final Rejection — §103
Nov 24, 2025
Request for Continued Examination
Dec 06, 2025
Response after Non-Final Action
Feb 12, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594987
ELECTRONIC POWER STEERING SYSTEM RACK FORCE OBSERVER VEHICLE DIAGNOSTICS
2y 5m to grant Granted Apr 07, 2026
Patent 12576690
REEFER POWER CONTROL
2y 5m to grant Granted Mar 17, 2026
Patent 12575493
SYSTEM AND METHOD FOR CONTROLLING MACHINE BASED ON COST OF HARVEST
2y 5m to grant Granted Mar 17, 2026
Patent 12576876
Method for Implementing Autonomous Driving, Medium, Vehicle-Mounted Computer, and Control System
2y 5m to grant Granted Mar 17, 2026
Patent 12546626
METHOD, APPARATUS, AND COMPUTER PROGRAM PRODUCT FOR PROBE DATA-BASED GEOMETRY GENERATION
2y 5m to grant Granted Feb 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
70%
Grant Probability
89%
With Interview (+19.1%)
3y 3m
Median Time to Grant
High
PTA Risk
Based on 196 resolved cases by this examiner. Grant probability derived from career allow rate.
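The headline figures above are simple arithmetic on the examiner's career record. A minimal sketch of how they appear to be derived, assuming the dashboard rounds the career allow rate and adds the observed interview lift (the additive-lift model and variable names are assumptions for illustration, not the tool's documented methodology):

```python
# Hypothetical reconstruction of the projection figures from the
# career data shown on this page (137 granted / 196 resolved,
# +19.1 percentage-point interview lift).

granted = 137          # career grants by this examiner
resolved = 196         # total resolved cases
interview_lift = 19.1  # percentage-point lift with an interview

career_allow_rate = 100 * granted / resolved        # ~69.9%
grant_probability = round(career_allow_rate)        # displayed as 70%
with_interview = round(career_allow_rate + interview_lift)  # displayed as 89%

print(grant_probability, with_interview)  # 70 89
```

Under this assumed model the displayed 70% and 89% both fall out of the 137/196 career record, which matches the note that grant probability is "derived from career allow rate."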
