Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This Office action is responsive to an amendment filed on 10/7/25. Claims 1-10 are pending.
Continued Examination under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant’s submission filed on 10/7/25 has been entered.
Response to Amendment
Amendments filed on 6/17/25 are under consideration. Claims 1, 8, and 9 are amended. The claim interpretation under 35 U.S.C. 112(f) has been upheld in response to applicant’s statement. The objections to claims 1 and 8 are withdrawn in view of the corrections.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitations are: a position information acquiring unit configured to acquire position information (claim 1); a vehicle information acquiring unit configured to execute a communication (claim 1); a storage unit that stores a high-precision map (claim 1); a drawing processing unit configured to draw a region (claims 1, 2, 3, 4, 6, and 7); a driving control mode determining unit configured to determine (claim 1); a notification unit configured to notify (claim 1); a congestion time-zone information acquiring unit configured to acquire (claims 3 and 6); a route searching unit configured to search (claims 4 and 7); a traffic information acquiring unit configured to acquire (claim 5); a driving control unit configured to start the automated driving control (claim 1); and a display unit configured to display (claims 4, 5, and 7).
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. See at least [0023] – “The position information acquiring unit 110 includes, for example, a GPS receiver, an azimuth sensor, a distance sensor, etc., and detects position information (longitude information and latitude information) of the own vehicle at a predetermined timing to acquire the position information of the own vehicle”
[0025] – “the notification unit 130 may be mounted not only on the in-vehicle device 100, but also on a mobile terminal such as a smartphone, a portable car navigation device, etc., for example”
[0053] – “The display unit 180 is configured by, for example, a liquid crystal panel, etc., and so displays information including the route searched by the route searching unit 171 as to be visible to the occupant.”
[0035] – “The server-side control unit 250 controls the server 200 as a whole, on the basis of a control program stored in unillustrated ROM, etc.… It should be noted that, in the present embodiment, the server-side control unit 250 controls an operation and a process of the drawing processing unit 220, the another vehicle information acquiring unit 230, …”
[0056] – “the server 200 includes the high-precision map storage unit 210, a drawing processing unit 220A, the another vehicle information acquiring unit 230, the server-side communication unit 240, a database 251, the congestion time-zone information acquiring unit 260, and a server-side control unit 250A.”
[0029] – “The vehicle-side control unit 170 controls the in-vehicle device 100 as a whole, on the basis of a control program stored in unillustrated ROM (Read Only Memory), etc.”
[0022] – “in-vehicle device 100 includes a position information acquiring unit 110, a driving control mode determining unit 120, a notification unit 130, a driving control unit 140,”
[0051] – “in-vehicle device 100A includes the position information acquiring unit 110, the driving control mode determining unit 120, the notification unit 130, the driving control unit 140, the high-precision map storage unit 150, the vehicle-side communication unit 160, a route searching unit 171, a display unit 180, a traffic information acquiring unit 190, and a vehicle-side control unit 170A.”
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1 and 3-10 are rejected under 35 U.S.C. 103 as being unpatentable over Sato et al. (JP2015141476A) in view of Kum et al. (US 2021/0261167 A1), in view of Ma et al. (US 2020/0324794 A1), in view of Nanri et al. (US 2020/0111366 A1), and in further view of Bauer et al. (US 2018/0181123 A1).
Regarding claim 1, Sato teaches A driving control system comprising: (Title: Automatic driving support system, automatic driving support method, and computer program) a position information acquiring unit configured to acquire position information of an own vehicle; (Pg. 6 – [0013] – “As illustrated in FIG. 1, a navigation device 1 according to the present embodiment includes a current position detection unit 11 that detects a current position of a car in which the navigation device 1 is mounted” & See Also Pg. 7 – [0014] – “The current position detection unit 11 includes a GPS 22, a vehicle speed sensor 23, a steering sensor 24, a gyro sensor 25, and the like, and can detect the current vehicle position, direction, vehicle traveling speed, current time, and the like.”) an another vehicle information acquiring unit configured to acquire position information of another vehicle (Pg. 4 – [0003] – “positions of other vehicles around the vehicle are detected as needed,”) a storage unit that stores a high-precision map; (Pg. 7 – [0016] – “storage means stores map display data”) own vehicle has traveled in an automated driving mode or a driving assist mode (Pg. 5 – [0009] – “control content acquisition unit (41) configured to, when the vehicle travels in an autonomous driving section in which autonomous driving control of the vehicle is performed, acquire control content of the autonomous driving control of the vehicle in the autonomous driving section;”) the position information of the own vehicle, (Pg. 3 – [0002] – “Here, the navigation device is a device capable of detecting a current position of the own vehicle by a GPS receiver or the like”) and information indicating a driving control mode of the own vehicle; (Pg.
5 – [0009] – “control content acquisition unit (41) configured to, when the vehicle travels in an autonomous driving section in which autonomous driving control of the vehicle is performed, acquire control content of the autonomous driving control of the vehicle in the autonomous driving section” ) a driving control mode determining unit configured to determine based on the position information of the own vehicle (Pg. 22 – [0086] – “CPU41 determines whether or not the car is within the autonomous driving section based on the current position of the car detected by the current position detection unit 11 and the map information” ) whether the own vehicle is inside or outside the region (Pg. 22 – [0086] – “CPU41 determines whether or not the car is within the autonomous driving section based on the current position of the car detected by the current position detection unit 11 and the map information” (equates to whether the own vehicle is inside or outside the region as the quote shows the cpu is determining whether or not the vehicle is inside an autonomous driving region based on a position.)) and the position information of the own vehicle; (Pg. 9 – [0021] – “the autonomous driving control is basically performed only while the vehicle travels in the autonomous driving section.” & See Also Pg. 9 – [0021] – “expressway, a freeway, a toll road, and an ordinary road may be set as the autonomous driving section.” & See Also Pg. 
10 – [0023] – “For example, when the vehicle travels in an autonomous driving section in which the autonomous driving control of the vehicle is performed, the control content acquisition unit acquires the control content of the autonomous driving control of the vehicle in the autonomous driving section” (equates to a driving control mode determining unit configured to determine a driving control mode based on information including the position information of the own vehicle, as the autonomous driving sections are described in the art as only being on certain types of roadways, and when the location of the vehicle is identified and the roadway type is identified then the vehicle can travel autonomously, with that state being noted by the control content acquisition unit of the third quote.)) a notification unit configured to notify an occupant of the own vehicle of start of automated driving control or that driving assist control is possible when the own vehicle is determined to have entered the region, (Pg. 10 – [0026] – “In addition, the speaker 16 outputs voice guidance for guiding traveling along the guide route based on an instruction from the navigation ECU13, and guidance of traffic information. In particular, in the present embodiment, while the vehicle travels in the autonomous driving section by the autonomous driving control, the control content of the autonomous driving control to be performed on the vehicle next time and thereafter is repeatedly output” & See Also Pg.
22 – [0086] – “CPU41 determines whether or not the car is within the autonomous driving section based on the current position of the car detected by the current position detection unit 11 and the map information” (equates to a notification unit configured to notify an occupant of the own vehicle of start of automated driving control or that driving assist control is possible, as when the own vehicle is determined to have entered the region and is in autonomous driving control the speaker outputs information related to vehicle control, which can include the starting of the autonomous driving. And the second quote shows how the CPU determines whether or not the vehicle is within the region.)) and notify the occupant of the own vehicle of an end of the automated driving control or the driving assist control when the own vehicle is determined to have exited the region; ((Pg. 10 – [0026] – “In addition, the speaker 16 outputs voice guidance for guiding traveling along the guide route based on an instruction from the navigation ECU13, and guidance of traffic information. In particular, in the present embodiment, while the vehicle travels in the autonomous driving section by the autonomous driving control, the control content of the autonomous driving control to be performed on the vehicle next time and thereafter is repeatedly output” & See Also Pg. 2 – [PROBLEM TO BE SOLVED] – “when the vehicle drives within the automatic driving section under the automatic driving control of the vehicle;” & See Also Pg.
22 – [0086] – “CPU41 determines whether or not the car is within the autonomous driving section based on the current position of the car detected by the current position detection unit 11 and the map information” (equates to notify the occupant of the own vehicle of an end of the automated driving control or the driving assist control when the own vehicle is determined to have exited the region, as when the vehicle exits the autonomous driving section and is in autonomous driving control the speaker outputs information related to vehicle control, which can include the end of the autonomous driving. And the last quote shows how the determination of whether or not the vehicle is within the region, and thus whether or not the vehicle has exited the region, is made.))) and a driving control unit configured to start the automated driving control or the driving assist control in response to determining that the own vehicle has entered the region, (Pg. 12 – [0035] – “On the other hand, in a case where it is determined that the autonomous driving switch is turned ON, the user desires to perform the autonomous driving control in the autonomous driving section. Therefore, as will be described later, after the S5, the CPU41 On the assumption that the vehicle basically performs the autonomous driving control in the autonomous driving section, the control content of the autonomous driving control is set, and the travel guidance of the vehicle for the user who travels by the autonomous driving control is performed” & See Also Pg.
18 – [0067] – “In S25, the CPU41 provides guidance on the control content corresponding to the current section into which the car has newly entered” (equates to and a driving control unit configured to start the automated driving control or the driving assist control in response to determining that the own vehicle has entered the region, as the switch has been flipped to initiate autonomous driving within the vehicle, and when the vehicle encounters an autonomous driving section the CPU would start the autonomous driving when the vehicle enters the region, as seen in the second quote by the detection of the current section the vehicle is travelling within.)) and terminate the automated driving or the driving assist control in response to determining that the own vehicle has exited the region (Pg. 9 – [0021] – “…autonomous driving control is basically performed only while the vehicle travels in the autonomous driving section” & See Also Pg. 18 – [0067] – “In S25, the CPU41 provides guidance on the control content corresponding to the current section into which the car has newly entered” (equates to and terminate the automated driving or the driving assist control in response to determining that the own vehicle has exited the region, as the autonomous driving control would only be performed when the vehicle is within the autonomous driving section, thus ending automated control when not in the region, wherein the second quote shows how the guidance is provided based on the detected region and thus, when it is detected that the vehicle exits the autonomous driving section, the control in regard to that detected section would be terminated.)).
Yet Sato fails to teach and the region drawn by the drawing processing unit, a drawing processing unit configured to plot on the high precision map a traveling trajectory in which the own vehicle has traveled and a traveling trajectory in which the another vehicle has traveled in the automated driving mode or the driving assist mode based on the acquired information including the position information of the another vehicle and the driving control mode of the another vehicle, and draw, based on the plotted trajectories, a region defining an area in which used for controlling automated driving or driving assist control is permitted. As well as, based on information including at least information on the region drawn by the drawing processing unit.
Kum teaches a similar driving support system (abstract). Kum teaches and the region drawn by the drawing processing unit (Pg. 7 – Fig. 6A & See Also Pg. 14 – [0035] – “For example, the graph model may be represented as illustrated in FIG. 6(b), and may include a plurality of nodes and a plurality of edges that connect the plurality of nodes. The nodes may indicate the electronic device 100 and surrounding objects, respectively. In this case, the nodes may represent at least any one of a position, speed or a heading angle, for example, based on state information on the electronic device 100 or the surrounding objects” (equates to and the region drawn by the drawing processing unit as a graphical representation representing a region is drawn based on the position speed or heading of the ego and surrounding vehicles via the graphical model.)) a drawing processing unit configured to plot on the high precision map (Pg. 14 – [0035] – “For example, the graph model may be represented as illustrated in FIG. 6(b), and may include a plurality of nodes and a plurality of edges that connect the plurality of nodes. The nodes may indicate the electronic device 100 and surrounding objects, respectively. In this case, the nodes may represent at least any one of a position, speed or a heading angle, for example, based on state information on the electronic device 100 or the surrounding objects” (equates to a drawing processing unit configured to plot on the high precision map as the quote shows how a graphical representation of the ego vehicle and the surrounding vehicles are plotted on a map based on position, speed, or a heading angle.)) and a traveling trajectory in which the another vehicle has traveled are plotted on the high-precision map. (Pg. 7 – Fig. 6A & See Also Pg. 14 – [0034] – “Referring to FIG. 5, at operation 510, the electronic device 100 may configure a surrounding situation of the electronic device 100 as a graph model.” & See Also Pg. 14 – [0033] – “FIG. 
6 is a diagram for describing an operation of configuring a graph model illustrated in FIG. 5.” & See Also Pg. 14 – [0032] – “Accordingly, the processor 180 may recognize historical trajectories of the surrounding objects based on the positions of the surrounding objects on the moving coordinate system based on a constant velocity model” (equates to a traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode are plotted on the high-precision map as the second quote shows an image map that generates past trajectories and the third quote shows how the other vehicles mentioned in quote two can be in an autonomous driving mode.)) based on information including at least information on the region drawn by the drawing processing unit (Pg. 16 – [0040] – “In one embodiment, perception module 302 may generate an image map that shows the current positions, current headings, and past trajectories of other vehicles or pedestrians in the environment of autonomous vehicle 101.”).
Yet both fail to teach: based on the acquired information including the position information of the another vehicle and the driving control mode of the another vehicle; a traveling trajectory in which the own vehicle has traveled; other vehicle has traveled in the automated driving mode or the driving assist mode; and draw, based on the plotted trajectories, a region defining an area in which used for controlling automated driving or driving assist control is permitted.
Ma teaches a traveling trajectory in which the own vehicle has traveled; (Pg. 12 – [0020] – “providing trajectory histories pertaining to the ego vehicle along with other vehicles and objects” (equates to a traveling trajectory in which the own vehicle has traveled, as the quote shows historical or previously travelled-upon trajectories by the own or ego vehicle.))
Yet Sato-Kum-Ma fail to teach: based on the acquired information including the position information of the another vehicle and the driving control mode of the another vehicle; a traveling trajectory in which the own vehicle has traveled; other vehicle has traveled in the automated driving mode or the driving assist mode; and draw, based on the plotted trajectories, a region defining an area in which used for controlling automated driving or driving assist control is permitted.
Nanri teaches a similar driving support system (abstract). Nanri teaches based on the acquired information including the position information of the another vehicle and the driving control mode of the another vehicle (Pg. 1 – Abstract – “determines whether the other vehicle is in an autonomous driving mode depending on the driving characteristics of the other vehicle, and detects an action of the other vehicle in accordance with the determination result of whether the other vehicle is in the autonomous driving mode.” & See Also Pg. 12 – [0038] – “The object tracking unit 2b tracks each object detected by the object detection device 1. In particular, the object tracking unit 2b determines the sameness of the object (mapping) detected at intervals in accordance with the behavior of the object output at different times” (equates to based on the acquired information including the position information of the another vehicle and the driving control mode of the another vehicle, as the first quote shows the driving mode of the other vehicle being captured and the position being kept by an object tracking unit, wherein mapping data is collected about objects including surrounding detected vehicles.)) other vehicle has traveled in the automated driving mode or the driving assist mode; (Pg. 1 – Abstract – “determines whether the other vehicle is in an autonomous driving mode depending on the driving characteristics of the other vehicle, and detects an action of the other vehicle in accordance with the determination result of whether the other vehicle is in the autonomous driving mode.” (equates to other vehicle has traveled in the automated driving mode or the driving assist mode, as the art shows the other vehicles around the host vehicle being detected to be in an autonomous driving mode and thus traveling in an autonomous driving mode if detected as such.))
Yet all fail to specifically teach and draw, based on the plotted trajectories, a region defining an area in which used for controlling automated driving or driving assist control is permitted.
Bauer teaches and draw, based on the plotted trajectories, a region defining an area in which used for controlling automated driving or driving assist control is permitted (Pg. 1 – Abstract – “A method and an apparatus for transitioning a motor vehicle from a manual operating mode to an automated or assisting operating mode for driving along a saved trajectory including a memory, in which the trajectory and a tolerance region of the trajectory are saved;… a transition trajectory from a current instantaneous position to the saved trajectory is calculated and a steering torque is generated that steers the motor vehicle in the direction of the calculated transition trajectory,” & See Also Pg. 4 – [0004] – “FIG. 1 shows an example of a situation for assisted driving onto a saved trajectory;” & See Also Pg. 4 – [0007] – “The region within which the trained trajectory is permitted to be followed in automated state is limited for safety reasons. This limitation represents a tolerance region within which the vehicle must remain for driving to be effected in automated state.” (equates to draw, based on the plotted trajectories, a region defining an area in which used for controlling automated driving or driving assist control is permitted, as the first quote shows the autonomous driving being initiated along a trajectory, switching the vehicle from manual control, and the second and third quotes show that the autonomous driving is only permitted in a region drawn by allowing a tolerance from the trajectory within which the autonomous control is permitted.
)) It would have been an advantageous addition to the system disclosed by Sato-Kum-Ma-Nanri to include and draw, based on the plotted trajectories, a region defining an area in which used for controlling automated driving or driving assist control is permitted, as this allows for a region to be defined that is not limited to strictly defined roadways and instead allows any saved trajectory over any terrain to be included in the autonomous driving section for the user’s convenience.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to include and draw, based on the plotted trajectories, a region defining an area in which used for controlling automated driving or driving assist control is permitted, as this allows for predefined trajectories to encompass a region in which the vehicle can travel autonomously, allowing an area, rather than a single line of control, within which the vehicle can be autonomously actuated.
Regarding claim 3, Sato-Kum-Ma-Nanri-Bauer teaches The driving control system according to claim 1, further comprising: (Sato teaches the following limitations:) a database configured to store information on the traveling trajectory in which the own vehicle has traveled in the automated driving mode or the driving assist mode, (Pg. 11 – [0030] – “…the planned travel route and the autonomous driving control to be performed on the planned travel route are set as described above, the navigation ECU20 transmits the planned travel route and the control table 32 to the vehicular control center via the CAN”) and the traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode; (Pg. 23 – [0098] – “…the CPU41 determines whether or not it is connected to a traffic information server such as an external server or a VICS center in which the interruption history of the autonomous driving control of other vehicles is stored” (equates to a database configured to store information on the traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode, as the VICS connection shows that the automated driving control of other vehicles was active and thus a trajectory was travelled in automated driving mode for the CPU of this art to use as deemed fit for the host vehicle.)) and a congestion time-zone information acquiring unit configured to acquire, based on the database, (Pg. 10 – [0028] – “Further, the communication module 18 is a communication device for receiving traffic information, probe information, weather information, and the like transmitted from a traffic information center, for example, a VICS center…” & See Also Pg.
23 – [0098] – “CPU41 determines whether or not it is connected to a traffic information server such as an external server or a VICS center in which the interruption history of the autonomous driving control of other vehicles is stored.” (equates to a congestion time-zone information acquiring unit configured to acquire, based on the database, as the communication module of this art is connected to VICS (being the database), which gives congestion-related information which can occur at specific time zones.)) the traveling trajectory in which the own vehicle has traveled in the automated driving mode or the driving assist mode in a time zone of congestion (Pg. 15 – [0051] – “CPU41 determines whether or not the autonomous driving control is interrupted. Here, in the present embodiment, a section in which it is difficult to cause the vehicle to travel by the autonomous driving control is set as an interruption section in which the autonomous driving control of the vehicle” & See Also Pg. 10 – [0028] – “Further, the communication module 18 is a communication device for receiving traffic information, probe information, weather information, and the like transmitted from a traffic information center, for example, a VICS center…” (equates to the traveling trajectory in which the own vehicle has traveled in the automated driving mode or the driving assist mode in a time zone of congestion, as the automatic driving support system of this art tracks the autonomous driving section within which the vehicle is traveling and is connected to the VICS to tell the system whether or not its trajectory is occurring within the congestion time zone.)) and the traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode in the time zone of the congestion, (Pg.
23 – [0098] – “…CPU41 determines whether or not it is connected to a traffic information server such as an external server or a VICS center in which the interruption history of the autonomous driving control of other vehicles is stored.” (equates to the traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode in the time zone of the congestion as the automated driving of other vehicles is stored as seen above and the interruption history including congestion time zone information provided by the VICS would be given to the CPU of this art.)) wherein the database is configured to store information including passage time information (Pg. 25 – [0104] – “When traveling in an autonomous driving section in which autonomous driving control of a car is performed…” (equates to database is configured to store information including passage time information as the autonomous driving section has designated start and stop points within which the autonomous driving can occur.)) and information on a road type, (Pg. 8 – [0017] – “ Data representing a curvature radius, an intersection, a T-junction, an entrance and an exit of a corner, etc., data representing a downhill road, an uphill road, etc. with respect to a road attribute, and data representing general roads such as a national road, a prefectural road, a narrow street, etc. and toll roads such as a national expressway, an urban expressway, a motorway, a general toll road, a toll bridge, etc. with respect to a road type are respectively recorded.” (equates to database is configured to store road type)) a start point of automated driving, (Pg. 15 – [0050] – “After that, in S14, the CPU41 determines… start position of the autonomous driving control performed”) an end point of the automated driving, (Pg. 
2 – [PROBLEM TO BE SOLVED] – “when the vehicle drives within the automatic driving section under the automatic driving control of the vehicle;” (equates to an end point of automated driving as the art describes automated driving sections and sections have start and end points in which the automated driving occurs within)) and an interruption point of the automated driving that are associated with each other, (Pg. 9 – [0022] - “…a section in which a situation in which it is difficult to cause the vehicle to travel by such automatic driving control occurs in the automatic driving section is set as an interruption section in which the automatic driving control of the vehicle is interrupted and the vehicle is caused to travel by manual driving”) and in which the own vehicle has traveled in the automated driving mode or the driving assist mode in the time zone of the congestion (Pg. 15 – [0051] – “CPU41 determines whether or not the autonomous driving control is interrupted. Here, in the present embodiment, a section in which it is difficult to cause the vehicle to travel by the autonomous driving control is set as an interruption section in which the autonomous driving control of the vehicle” & See Also Pg. 10 – [0028] – “Further, the communication module 18 is a communication device for receiving traffic information, probe information, weather information, and the like transmitted from a traffic information center, for example, a VICS center…” (equates to in which the own vehicle has traveled in the automated driving mode or the driving assist mode in the time zone of the congestion as the first quote shows the autonomous driving taking place until being interrupted and the CPU of this art is tracking the vehicle being in the autonomous driving mode and the second quote showing the VICS center which would give the system of this art the time zone congestion information. 
)) and in which the other vehicle has traveled in the automated driving mode or the driving assist mode in the time zone of the congestion. (Pg. 23 – [0098] – “…CPU41 determines whether or not it is connected to a traffic information server such as an external server or a VICS center in which the interruption history of the autonomous driving control of other vehicles is stored.” (equates to in which the other vehicle has traveled in the automated driving mode or the driving assist mode in the time zone of the congestion as the VICS center provides time zone congestion information as well as the autonomous driving state of the other vehicles in the congestion period.)).
Yet Sato fails to teach that the drawing processing unit is configured to draw the region corresponding to the traveling trajectory in which the own vehicle has traveled and the traveling trajectory in which the other vehicle has traveled.
Ma teaches the traveling trajectory in which the own vehicle has traveled (Pg. 12 – [0020] – “providing trajectory histories pertaining to the ego vehicle along with other vehicles and objects” (equates to a traveling trajectory in which the own vehicle has traveled as the quote shows historical or previously travelled upon trajectories by the own or ego vehicle.))
Yet all fail to teach that the drawing processing unit is configured to draw the region, and the traveling trajectory in which the other vehicle has traveled.
Kum teaches a similar driving support system (abstract). Kum teaches the drawing processing unit is configured to draw the region (Pg. 14 – [0035] – “For example, the graph model may be represented as illustrated in FIG. 6(b), and may include a plurality of nodes and a plurality of edges that connect the plurality of nodes. The nodes may indicate the electronic device 100 and surrounding objects, respectively. In this case, the nodes may represent at least any one of a position, speed or a heading angle, for example, based on state information on the electronic device 100 or the surrounding objects” (equates to the drawing processing unit is configured to draw the region as the quote shows how a graphical representation of the ego vehicle and the surrounding vehicles are plotted on a map based on position, speed, or a heading angle.)) and the traveling trajectory in which the other vehicle has traveled. (Pg. 7 – Fig. 6A & See Also Pg. 14 – [0034] – “Referring to FIG. 5, at operation 510, the electronic device 100 may configure a surrounding situation of the electronic device 100 as a graph model.” & See Also Pg. 14 – [0033] – “FIG. 6 is a diagram for describing an operation of configuring a graph model illustrated in FIG. 5.” & See Also Pg. 14 – [0032] – “Accordingly, the processor 180 may recognize historical trajectories of the surrounding objects based on the positions of the surrounding objects on the moving coordinate system based on a constant velocity model” (equates to the traveling trajectory in which the other vehicle has traveled as the second quote shows an image map that generates past trajectories and the third quote shows how the other vehicles mentioned in quote two can be in an autonomous driving mode.)).
It would have been an advantageous addition to the system disclosed by Sato-Ma-Nanri to include the drawing processing unit configured to draw the region corresponding to the traveling trajectory in which the own vehicle has traveled and the traveling trajectory in which the other vehicle has traveled, to ensure the occupants can see the trajectory of the vehicle they are within and the past paths other autonomously driven vehicles have taken, ensuring a degree of reliability so that more trust can be put into the vehicle when it is not piloted directly by a human.
Therefore, it would have been obvious to one of ordinary skill in the art to configure the driving control system of Sato to include the drawing processing unit configured to draw the region corresponding to the traveling trajectory in which the own vehicle has traveled and the traveling trajectory in which the other vehicle has traveled, as seeing the trajectories of the own vehicle and of other vehicles allows a deeper sense of trust to be developed between the occupants and the vehicle they are relying upon to move them around.
Regarding claim 4 Sato-Kum-Ma-Nanri-Bauer teaches The driving control system according to claim 3, further comprising: (Sato teaches the following limitations:) a route searching unit configured to search for a route from a current position of the own vehicle to a destination, (Pg. 3 – [0002] – “Further, when a desired destination is input to the navigation device This navigation device is provided with a route search function for searching a recommended route from the position of one's own vehicle to a destination”) based on the high-precision map; (Pg. 3 – [0002] – “…navigation device is a device capable of detecting…by a GPS receiver or the like, acquiring map data… navigation device is provided with a route search function for searching a recommended route…”) and a display unit configured to display information including the searched route, (Pg. 3 – [0002] – “displays the guide route on a display screen”)
Yet Sato-Ma-Nanri fails to teach wherein the display unit is configured to display the region drawn by the drawing processing unit in a superimposed manner on the route.
Kum teaches a similar driving support system (abstract). Kum teaches wherein the display unit is configured to display the region drawn by the drawing processing unit in a superimposed manner on the route. (Pg. 13 – [0023] – “The output module 150 may provide information to the outside of the electronic device 100. In this case, the output module 150 may include at least any one of a display module or an audio module. The display module may visually output information. For example, the display module may include at least any one of a display, a hologram device, or a projector. In an embodiment, the display module may be assembled with at least any one of the touch circuitry or sensor circuitry of the input module 140, and may be implemented as a touch screen” & See Also Pg. 13 – [0028] – “As illustrated in FIG. 3, the processor 180 may predict future trajectories of surrounding objects in an integrated way based on the recognized historical trajectories of the surrounding objects. In this case, the processor 180 may predict the future trajectories by integrating and estimating interactions between the surrounding objects and the electronic device 100, and. In this case, the processor 180 may estimate the interactions in an integrated way based on characteristics of the interactions” & See Also Pg. 15 – [0042] – “Thereafter, the electronic device 100 may return to FIG. 5 and may perform operation 530. Referring back to FIG. 5, at operation 530, the electronic device 100 may plan the driving trajectory of the electronic device 100 based on the predicted future trajectories of the surrounding objects. 
The processor 180 may plan an optimal driving trajectory which may correspond to the predicted future trajectories of the surrounding objects.” (equates to wherein the display unit is configured to display the region drawn by the drawing processing unit in a superimposed manner on the route as the first quote shows the display unit being configured to display the surroundings of the host vehicle and the following quotes show the trajectories being calculated and used for guiding the own or host vehicle based on the surrounding objects and their associated trajectories.)). It would have been an advantageous addition to the system disclosed by Sato-Ma-Nanri to include wherein the display unit is configured to display the region drawn by the drawing processing unit in a superimposed manner on the route, as this limitation allows the trajectories of the host vehicle and the other vehicles to be included with the route that was originally generated for the host vehicle, allowing a wide variety of options to be seen by the occupants.
Therefore, it would have been obvious to one of ordinary skill in the art to configure the driving control system of Sato to include wherein the display unit is configured to display the region drawn by the drawing processing unit in a superimposed manner on the route, as this allows a plethora of options to be seen in terms of what routes have been and will be taken by the host vehicle based on others travelling a similar path.
Regarding claim 5 Sato-Kum-Ma-Nanri-Bauer teaches The driving control system according to claim 4, further comprising (Sato teaches the following limitations:) a traffic information acquiring unit configured to acquire road construction section information (Pg. 15 & 16 – [0051] – “Next, in the S15, the CPU41 determines whether or not the autonomous driving control is interrupted… a section in which it is difficult to cause the vehicle to travel by the autonomous driving control…A section in which a lane restriction has occurred due to…construction…”) and information on a traffic accident frequent occurrence point (Pg. 15 & 16 – [0051] – “Next, in the S15, the CPU41 determines whether or not the autonomous driving control is interrupted… a section in which it is difficult to cause the vehicle to travel by the autonomous driving control…A section in which a lane restriction has occurred due to an accident…”) and a congestion prediction section,( Pg. 23- [0099] – “…connection with the traffic information server such as the VICS center…” & See Also Pg. 23- [0099] – “The latest interruption history of the autonomous driving control and the latest traffic information in a section (hereinafter, referred to as a detection target section) in front of the traveling direction of the car of which the route information is acquired in the S52 from the server are acquired (S54).” (equates to traffic information acquiring unit configured to acquire a congestion prediction section as the automatic driving support system of this art is connected to the VICS center and is able to get traffic information based on the desired traveling section alerting the user to congestion possibilities.)) wherein the display unit is configured to further display the information acquired by the traffic information acquiring unit in the superimposed manner on the route. (Pg. 10 – [0025] – “On the liquid crystal display 15, a map image including roads, traffic information… are displayed” & See Also Pg. 
3 – [0002] – “…displays the guide route on a display screen…”)
Regarding claim 8 Sato teaches A driving control system comprising circuitry (Title: Automatic driving support system, automatic driving support method, and computer program) storing a high-precision map, (Pg. 3 – [0002] – “…navigation device is a device capable of detecting…by a GPS receiver or the like, acquiring map data…”) the circuitry being configured to acquire position information of an own vehicle,( Pg. 6 – [0013] - “As illustrated in FIG. 1, a navigation device 1 according to the present embodiment includes a current position detection unit 11 that detects a current position of a car in which the navigation device 1 is mounted” & See Also Pg. 7 – [0014] – “The current position detection unit 11 includes a GPS 22, a vehicle speed sensor 23, a steering sensor 24, a gyro sensor 25, and the like, and can detect the current vehicle position, direction, vehicle traveling speed, current time, and the like.” ) and acquire information including position information of the another vehicle at that position ( Pg. 4 – [0003] - “In the automatic driving control, for example, a current position of the vehicle, a lane in which the vehicle travels, and positions of other vehicles around the vehicle are detected as needed,…”) own vehicle has traveled in an automated driving mode or a driving assist mode (Pg. 5 – [0009] – “control content acquisition unit (41) configured to, when the vehicle travels in an autonomous driving section in which autonomous driving control of the vehicle is performed, acquire control content of the autonomous driving control of the vehicle in the autonomous driving section;” ) the position information of the own vehicle, (Pg. 3 – [0002] – “Here, the navigation device is a device capable of detecting a current position of the own vehicle by a GPS receiver or the like”) and information indicating a driving control mode of the own vehicle; (Pg. 
5 – [0009] – “control content acquisition unit (41) configured to, when the vehicle travels in an autonomous driving section in which autonomous driving control of the vehicle is performed, acquire control content of the autonomous driving control of the vehicle in the autonomous driving section”) determine, based on the position information of the own vehicle and the region, whether the own vehicle is inside or outside the region, (Pg. 22 – [0086] – “CPU41 determines whether or not the car is within the autonomous driving section based on the current position of the car detected by the current position detection unit 11 and the map information”) notify an occupant of the own vehicle of start of automated driving control or that driving assist control is possible when the own vehicle is determined to have entered the region, (Pg. 10 – [0026] – “In addition, the speaker 16 outputs voice guidance for guiding traveling along the guide route based on an instruction from the navigation ECU13, and guidance of traffic information. In particular, in the present embodiment, while the vehicle travels in the autonomous driving section by the autonomous driving control, the control content of the autonomous driving control to be performed on the vehicle next time and thereafter is repeatedly output” & See Also Pg. 22 – [0086] – “CPU41 determines whether or not the car is within the autonomous driving section based on the current position of the car detected by the current position detection unit 11 and the map information” (equates to a notification unit configured to notify an occupant of the own vehicle of start of automated driving control or that driving assist control is possible as when the own vehicle is determined to have entered the region and is in autonomous driving control the speaker outputs information related to vehicle control which can include the starting of the autonomous driving.
And the second quote shows how the CPU determines whether or not the vehicle is within the region.)) and notify the occupant of the own vehicle of end of the automated driving control or the driving assist control when the own vehicle is determined to have exited the region; ((Pg. 10 – [0026] – “In addition, the speaker 16 outputs voice guidance for guiding traveling along the guide route based on an instruction from the navigation ECU13, and guidance of traffic information. In particular, in the present embodiment, while the vehicle travels in the autonomous driving section by the autonomous driving control, the control content of the autonomous driving control to be performed on the vehicle next time and thereafter is repeatedly output” & See Also Pg. 2 – [PROBLEM TO BE SOLVED] – “when the vehicle drives within the automatic driving section under the automatic driving control of the vehicle;” & See Also Pg. 22 – [0086] – “CPU41 determines whether or not the car is within the autonomous driving section based on the current position of the car detected by the current position detection unit 11 and the map information” (equates notify the occupant of the own vehicle of end of the automated driving control or the driving assist control when the own vehicle is determined to have exited the region as when the vehicle exits the autonomous driving section and is in autonomous driving control the speaker outputs information related to vehicle control and can include the end of the autonomous driving. And the last quote shows how the determination of whether or not the vehicle is within the region and thus whether or not the vehicle has exited the region is determined.))) and start the automated driving control or the driving assist control in response to determining that the own vehicle has entered the region, (Pg. 
12 – [0035] – “On the other hand, in a case where it is determined that the autonomous driving switch is turned ON, the user desires to perform the autonomous driving control in the autonomous driving section. Therefore, as will be described later, after the S5, the CPU41 On the assumption that the vehicle basically performs the autonomous driving control in the autonomous driving section, the control content of the autonomous driving control is set, and the travel guidance of the vehicle for the user who travels by the autonomous driving control is performed” & See Also Pg. 18 – [0067] – “In S25, the CPU41 provides guidance on the control content corresponding to the current section into which the car has newly entered” (equates to and a driving control unit configured to start the automated driving control or the driving assist control in response to determining that the own vehicle has entered the region, as the switch has been flipped to initiate autonomous driving within the vehicle and when the vehicle encounters an autonomous driving section the CPU would start the autonomous driving when the vehicle enters the region as seen by the detection of the current section the vehicle is travelling within in the second quote. ))and terminate the automated driving or the driving assist control in response to determining that the own vehicle has exited the region (Pg. 9 – [0021] – “…autonomous driving control is basically performed only while the vehicle travels in the autonomous driving section” & See Also Pg. 
18 – [0067] – “In S25, the CPU41 provides guidance on the control content corresponding to the current section into which the car has newly entered” (equates to and terminate the automated driving or the driving assist control in response to determining that the own vehicle has exited the region as the autonomous driving control would only be performed when the vehicle is within the autonomous driving section thus ending automated control when not in the region, wherein the second quote shows how the guidance is provided based on the detected region and thus when it detected the vehicle exits the autonomous driving section the control in regards to that detected section would be terminated.)).
Yet Sato fails to teach a driving control mode of the another vehicle, and a driving control mode of the another vehicle at that position; being configured to plot, on the high-precision map, a traveling trajectory in which the own vehicle has traveled in an automated driving mode or a driving assist mode and a traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode, based on the acquired information including the position information of the another vehicle and the driving control mode of the another vehicle; and draw, based on the plotted trajectories, a region defining an area in which automated driving or driving assist control is permitted; as well as operation based on information including at least information on the region drawn by the drawing processing unit.
Kum teaches a similar driving support system (abstract). Kum teaches a drawing processing unit configured to draw a region (Pg. 7 – Fig. 6A & See Also Pg. 14 – [0035] – “For example, the graph model may be represented as illustrated in FIG. 6(b), and may include a plurality of nodes and a plurality of edges that connect the plurality of nodes. The nodes may indicate the electronic device 100 and surrounding objects, respectively. In this case, the nodes may represent at least any one of a position, speed or a heading angle, for example, based on state information on the electronic device 100 or the surrounding objects” (equates to the region drawn by the drawing processing unit as a graphical representation representing a region is drawn based on the position, speed, or heading of the ego and surrounding vehicles via the graphical model.)) plot, on the high-precision map (Pg. 7 – Fig. 6A & See Also Pg. 14 – [0035] – “For example, the graph model may be represented as illustrated in FIG. 6(b), and may include a plurality of nodes and a plurality of edges that connect the plurality of nodes. The nodes may indicate the electronic device 100 and surrounding objects, respectively. In this case, the nodes may represent at least any one of a position, speed or a heading angle, for example, based on state information on the electronic device 100 or the surrounding objects” (equates to plot, on the high-precision map as the quote shows the graph being formed to show trajectories of the host vehicle and its surroundings, each having representations based on position, etc.)) formed when a traveling trajectory in which the own vehicle (Pg. 7 – Fig. 6A & See Also Pg. 14 – [0034] – “Referring to FIG. 5, at operation 510, the electronic device 100 may configure a surrounding situation of the electronic device 100 as a graph model.” & See Also Pg. 14 – [0033] – “FIG. 6 is a diagram for describing an operation of configuring a graph model illustrated in FIG. 5.” & See Also Pg.
14 – [0032] – “Accordingly, the processor 180 may recognize historical trajectories of the surrounding objects based on the positions of the surrounding objects on the moving coordinate system based on a constant velocity model” (equates to a traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode are plotted on the high-precision map as the second quote shows an image map that generates past trajectories and the third quote shows how the other vehicles mentioned in quote two can be in an autonomous driving mode.)) and a traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode are plotted on the high-precision map. (Pg. 15 – [0030] – “Some or all of the functions of autonomous vehicle 101 may be controlled or managed by perception and planning system 110, especially when operating in an autonomous driving mode.” & See Also Pg. 16 – [0040] – “In one embodiment, perception module 302 may generate an image map that shows the current positions, current headings, and past trajectories of other vehicles or pedestrians in the environment of autonomous vehicle 101.” & See Also Pg. 13 – [0001] – “More particularly, embodiments of the disclosure relate to methods for evaluating the accuracy of simulation model used to emulate the behavior of autonomous driving vehicles (ADVs).” (equates to a traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode are plotted on the high-precision map as the second quote shows an image map that generates past trajectories and the third quote shows how the other vehicles mentioned in quote two can be in an autonomous driving mode.)) based on information including at least information on the region drawn by the drawing processing unit (Pg. 
16 – [0040] – “In one embodiment, perception module 302 may generate an image map that shows the current positions, current headings, and past trajectories of other vehicles or pedestrians in the environment of autonomous vehicle 101.”).
Yet Sato-Kum fail to teach: based on the acquired information including the position information of the another vehicle and the driving control mode of the another vehicle; a traveling trajectory in which the own vehicle has traveled; the other vehicle has traveled in the automated driving mode or the driving assist mode; and draw, based on the plotted trajectories, a region defining an area in which automated driving or driving assist control is permitted.
Ma teaches a traveling trajectory in which the own vehicle has traveled; (Pg. 12 – [0020] – “providing trajectory histories pertaining to the ego vehicle along with other vehicles and objects” (equates to a traveling trajectory in which the own vehicle has traveled as the quote shows historical or previously travelled upon trajectories by the own or ego vehicle.))
Yet all fail to teach: based on the acquired information including the position information of the another vehicle and the driving control mode of the another vehicle; a traveling trajectory in which the own vehicle has traveled; the other vehicle has traveled in the automated driving mode or the driving assist mode; and draw, based on the plotted trajectories, a region defining an area in which automated driving or driving assist control is permitted.
Nanri teaches a similar driving support system (abstract). Nanri teaches based on the acquired information including the position information of the another vehicle and the driving control mode of the another vehicle (Pg. 1 – Abstract – “determines whether the other vehicle is in an autonomous driving mode depending on the driving characteristics of the other vehicle, and detects an action of the other vehicle in accordance with the determination result of whether the other vehicle is in the autonomous driving mode.” & See Also Pg. 12 – [0038] – “The object tracking unit 2b tracks each object detected by the object detection device 1. In particular, the object tracking unit 2b determines the sameness of the object (mapping) detected at intervals in accordance with the behavior of the object output at different times” (equates based on the acquired information including the position information of the another vehicle and the driving control mode of the another vehicle as the first quote shows the driving mode being captured of the other vehicle and the position being kept by an object tracking unit wherein mapping data is collected about objects including surrounding detected vehicles. )) other vehicle has traveled in the automated driving mode or the driving assist mode; (Pg. 1 – Abstract – “determines whether the other vehicle is in an autonomous driving mode depending on the driving characteristics of the other vehicle, and detects an action of the other vehicle in accordance with the determination result of whether the other vehicle is in the autonomous driving mode.” (equates to other vehicle has traveled in the automated driving mode or the driving assist mode as the art shows the other vehicles around the host vehicle being detected to be in an autonomous driving mode and thus travel in an autonomous driving mode if detected as doing as such. ))
Yet all fail to specifically teach and draw, based on the plotted trajectories, a region defining an area in which automated driving or driving assist control is permitted.
Bauer teaches and draw, based on the plotted trajectories, a region defining an area in which automated driving or driving assist control is permitted (Pg. 1 – Abstract – “A method and an apparatus for transitioning a motor vehicle from a manual operating mode to an automated or assisting operating mode for driving along a saved trajectory including a memory, in which the trajectory and a tolerance region of the trajectory are saved;… a transition trajectory from a current instantaneous position to the saved trajectory is calculated and a steering torque is generated that steers the motor vehicle in the direction of the calculated transition trajectory,” & See Also Pg. 4 – [0004] – “FIG. 1 shows an example of a situation for assisted driving onto a saved trajectory;” & See Also Pg. 4 – [0007] – “The region within which the trained trajectory is permitted to be followed in automated state is limited for safety reasons. This limitation represents a tolerance region within which the vehicle must remain for driving to be effected in automated state.” (equates to draw, based on the plotted trajectories, a region defining an area in which automated driving or driving assist control is permitted as the first quote shows the autonomous driving being initiated along a trajectory, switching the vehicle from manual control, and the second and third quotes show that the autonomous driving is only permitted in a region drawn by allowing a tolerance from the trajectory within which the autonomous control is permitted.
)) It would have been an advantageous addition to the system disclosed by Sato-Kum-Ma-Nanri to include and draw, based on the plotted trajectories, a region defining an area in which used for controlling automated driving or driving assist control is permitted as this allows for a region to be defined that isn’t permitted to strictly defined roadways and instead allows any saved trajectory over any terrain to be included in the autonomous driving section for the user’s convenience.
Therefore it would have been obvious to one of ordinary skill in the art before the effective filing date to include and draw, based on the plotted trajectories, a region defining an area in which automated driving or driving assist control is permitted, as this allows predefined trajectories to encompass a region in which the vehicle can travel autonomously, providing an area, rather than a single line of control, within which the vehicle can be autonomously actuated.
Regarding Claim 9, Sato-Kum-Ma-Nanri-Bauer teaches The driving control system according to claim 1, (Sato discloses the following limitations:) in which the own vehicle has traveled in the automated driving mode or the driving assist mode (Pg. 5 – [0009] – “control content acquisition unit (41) configured to, when the vehicle travels in an autonomous driving section in which autonomous driving control of the vehicle is performed, acquire control content of the autonomous driving control of the vehicle in the autonomous driving section;”)
Sato-Nanri fails to teach wherein the drawing processing unit is configured to: plot, on the high-precision map, only the traveling trajectory in which the own vehicle has traveled; and the traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode and draw, based on the plotted trajectories, the region used for controlling automated driving or driving assist.
Kum teaches wherein the drawing processing unit is configured to: plot, on the high-precision map, (Pg. 14 – [0035] – “For example, the graph model may be represented as illustrated in FIG. 6(b), and may include a plurality of nodes and a plurality of edges that connect the plurality of nodes. The nodes may indicate the electronic device 100 and surrounding objects, respectively. In this case, the nodes may represent at least any one of a position, speed or a heading angle, for example, based on state information on the electronic device 100 or the surrounding objects” (equates to a drawing processing unit configured to plot on the high precision map as the quote shows how a graphical representation of the ego vehicle and the surrounding vehicles are plotted on a map based on position, speed, or a heading angle.)) and the traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode (Pg. 7 – Fig. 6A & See Also Pg. 14 – [0034] – “Referring to FIG. 5, at operation 510, the electronic device 100 may configure a surrounding situation of the electronic device 100 as a graph model.” & See Also Pg. 14 – [0033] – “FIG. 6 is a diagram for describing an operation of configuring a graph model illustrated in FIG. 5.” & See Also Pg. 14 – [0032] – “Accordingly, the processor 180 may recognize historical trajectories of the surrounding objects based on the positions of the surrounding objects on the moving coordinate system based on a constant velocity model” (equates to a traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode are plotted on the high-precision map as the second quote shows an image map that generates past trajectories and the third quote shows how the other vehicles mentioned in quote two can be in an autonomous driving mode.)) and draw, based on the plotted trajectories, the region used for controlling automated driving or driving assist. (Pg. 
6 – Fig. 5 & See Also Pg. 14 – [0034] – “the electronic device 100 may configure a surrounding situation of the electronic device 100 as a graph model.” & See Also Pg. 15 – [0042] – “Thereafter, the electronic device 100 may return to FIG. 5 and may perform operation 530. Referring back to FIG. 5, at operation 530, the electronic device 100 may plan the driving trajectory of the electronic device 100 based on the predicted future trajectories of the surrounding objects.” & See Also Pg. 14 – [0034] – “For example, as illustrated in FIG. 6(a), the processor 180 may configure, as the graph model, the electronic device 100 and the surrounding objects and interactions between the electronic device 100 and the surrounding objects.” (equates to and draw, based on the plotted trajectories, a region used for controlling automated driving or driving assist, as the first and second quotes show how the gathered trajectories form a graphical model used to control the vehicle by planning a driving trajectory, and Fig. 6(a) shows the region being formed by the surrounding vehicles and the trajectories gathered therein, per the last quote, with the interactions between the host vehicle’s device and the surrounding vehicles.))
Yet all fail to teach, based on only the traveling trajectory in which the own vehicle has traveled.
Ma teaches based on only the traveling trajectory in which the own vehicle has traveled (Pg. 12 – [0020] – “providing trajectory histories pertaining to the ego vehicle along with other vehicles and objects” (equates to a traveling trajectory in which the own vehicle has traveled, as the quote shows historical, or previously traveled, trajectories of the own or ego vehicle.)) It would have been an advantageous addition to the system disclosed by Sato-Kum-Nanri-Bauer to include based on only the traveling trajectory in which the own vehicle has traveled, as this allows the region for autonomous or assisted driving control to be based upon the own vehicle’s past trajectories rather than only on other vehicles’ data.
Therefore it would have been obvious to one of ordinary skill in the art before the effective filing date to include based on only the traveling trajectory in which the own vehicle has traveled, as the addition of the own vehicle’s driving history allows more data than just external vehicle information to be considered for the control of the own vehicle within the designated region.
Regarding Claim 10, Sato-Kum-Ma-Nanri-Bauer teaches (Sato discloses the following limitations:) The driving control system according to claim 8, wherein the circuitry (Title: Automatic driving support system, automatic driving support method, and computer program) in which the own vehicle has traveled in the automated driving mode or the driving assist mode (Pg. 5 – [0009] – “control content acquisition unit (41) configured to, when the vehicle travels in an autonomous driving section in which autonomous driving control of the vehicle is performed, acquire control content of the autonomous driving control of the vehicle in the autonomous driving section;”)
Yet Sato-Nanri fails to teach is configured to: plot, on the high-precision map, only the traveling trajectory in which the own vehicle has traveled in and the traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode; and draw, based on the plotted trajectories, the region used for controlling automated driving or driving assist.
Kum teaches wherein the drawing processing unit is configured to: plot, on the high-precision map, (Pg. 14 – [0035] – “For example, the graph model may be represented as illustrated in FIG. 6(b), and may include a plurality of nodes and a plurality of edges that connect the plurality of nodes. The nodes may indicate the electronic device 100 and surrounding objects, respectively. In this case, the nodes may represent at least any one of a position, speed or a heading angle, for example, based on state information on the electronic device 100 or the surrounding objects” (equates to a drawing processing unit configured to plot on the high precision map as the quote shows how a graphical representation of the ego vehicle and the surrounding vehicles are plotted on a map based on position, speed, or a heading angle.)) and the traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode (Pg. 7 – Fig. 6A & See Also Pg. 14 – [0034] – “Referring to FIG. 5, at operation 510, the electronic device 100 may configure a surrounding situation of the electronic device 100 as a graph model.” & See Also Pg. 14 – [0033] – “FIG. 6 is a diagram for describing an operation of configuring a graph model illustrated in FIG. 5.” & See Also Pg. 14 – [0032] – “Accordingly, the processor 180 may recognize historical trajectories of the surrounding objects based on the positions of the surrounding objects on the moving coordinate system based on a constant velocity model” (equates to a traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode are plotted on the high-precision map as the second quote shows an image map that generates past trajectories and the third quote shows how the other vehicles mentioned in quote two can be in an autonomous driving mode.)) and draw, based on the plotted trajectories, the region used for controlling automated driving or driving assist. (Pg. 
6 – Fig. 5 & See Also Pg. 14 – [0034] – “the electronic device 100 may configure a surrounding situation of the electronic device 100 as a graph model.” & See Also Pg. 15 – [0042] – “Thereafter, the electronic device 100 may return to FIG. 5 and may perform operation 530. Referring back to FIG. 5, at operation 530, the electronic device 100 may plan the driving trajectory of the electronic device 100 based on the predicted future trajectories of the surrounding objects.” & See Also Pg. 14 – [0034] – “For example, as illustrated in FIG. 6(a), the processor 180 may configure, as the graph model, the electronic device 100 and the surrounding objects and interactions between the electronic device 100 and the surrounding objects.” (equates to and draw, based on the plotted trajectories, a region used for controlling automated driving or driving assist, as the first and second quotes show how the gathered trajectories form a graphical model used to control the vehicle by planning a driving trajectory, and Fig. 6(a) shows the region being formed by the surrounding vehicles and the trajectories gathered therein, per the last quote, with the interactions between the host vehicle’s device and the surrounding vehicles.))
Yet all fail to teach, based on only the traveling trajectory in which the own vehicle has traveled.
Ma teaches based on only the traveling trajectory in which the own vehicle has traveled (Pg. 12 – [0020] – “providing trajectory histories pertaining to the ego vehicle along with other vehicles and objects” (equates to a traveling trajectory in which the own vehicle has traveled, as the quote shows historical, or previously traveled, trajectories of the own or ego vehicle.)) It would have been an advantageous addition to the system disclosed by Sato-Kum-Nanri-Bauer to include based on only the traveling trajectory in which the own vehicle has traveled, as this allows the region for autonomous or assisted driving control to be based upon the own vehicle’s past trajectories rather than only on other vehicles’ data.
Therefore it would have been obvious to one of ordinary skill in the art before the effective filing date to include based on only the traveling trajectory in which the own vehicle has traveled, as the addition of the own vehicle’s driving history allows more data than just external vehicle information to be considered for the control of the own vehicle within the designated region.
Claims 2, 6, and 7 are rejected under 35 U.S.C. 103 as being unpatentable over Sato-Kum-Ma-Nanri-Bauer as applied above, and further in view of Eriksson (US 20200003907 A1).
Regarding claim 2, Sato-Kum-Ma-Nanri-Bauer teaches The driving control system according to claim 1, (Sato discloses the following limitations) (Title: Automatic driving support system, automatic driving support method, and computer program) (Jiang discloses the following limitation) wherein the drawing processing unit is configured to draw, on the high-precision map, the region formed upon the plotting on the high-precision map (Pg. 16 – [0040] – “In one embodiment, perception module 302 may generate an image map that shows the current positions, current headings, and past trajectories of other vehicles or pedestrians in the environment of autonomous vehicle 101.” (equates to wherein the drawing processing unit is configured to draw, on the high-precision map, the region formed upon the plotting on the high-precision map, as the art shows an image map being generated that includes trajectories.))
Yet all fail to teach wherein the region formed upon the plotting on the high-precision map is approximated to a rectangle.
Eriksson teaches a similar driving control system (abstract). Eriksson teaches wherein the region formed upon the plotting on the high-precision map is approximated to a rectangle (Pg. 1 – Abstract – “The transceiver may be configured to send/receive data messages to/from a plurality of vehicles. The processor may be configured to (i) determine a plurality of selected vehicles from the plurality of vehicles based on a selection criterion and (ii) calculate relative coordinates of the plurality of vehicles based on the data messages from the selected vehicles. The selection criteria may comprise determining (i) a target vehicle and (ii) at least two complementary vehicles. A predicted trajectory of the target vehicle may cross paths with a predicted trajectory of the apparatus” & See Also Pg. 25 – [0137] – “If one of the vehicles 30a-30n forms a corner of a rectangle, the method 600…” & See Also Pg. 6 – Fig. 6 (equates to wherein the region formed upon the plotting on the high-precision map is approximated to a rectangle, as the trajectories of the host vehicle and the other vehicles are considered and a region forming a rectangle based on those trajectories is created)). It would have been an advantageous addition to the system disclosed by Sato-Kum-Ma-Nanri-Bauer to include wherein the region formed upon the plotting on the high-precision map is approximated to a rectangle, as this gives the driving control system a defined area for autonomous driving and allows autonomous driving to occur within a wider area rather than only on the lines of the trajectories, permitting more robust control within the environment when the host vehicle determines routes it can take within the region.
Therefore it would have been obvious to one of ordinary skill in the art before the effective filing date to configure the system disclosed by Sato-Kum-Ma-Nanri-Bauer to include wherein the region formed upon the plotting on the high-precision map is approximated to a rectangle, as the autonomous vehicle then has a wider area through which to be guided rather than having to adhere to a specific trajectory within a region; this provides more drivable area in case the trajectory cannot be precisely followed due to potential obstacles in the path of transit.
Regarding claim 6, Sato-Kum-Ma-Nanri-Bauer-Eriksson teaches The driving control system according to claim 2, further comprising: (Sato teaches the following limitations:) a database configured to store information on the traveling trajectory in which the own vehicle has traveled in the automated driving mode or the driving assist mode, (Pg. 11 – [0030] – “…the planned travel route and the autonomous driving control to be performed on the planned travel route are set as described above, the navigation ECU20 transmits the planned travel route and the control table 32 to the vehicular control center via the CAN” ) and the traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode; (Pg. 23 – [0098] – “…CPU41 determines whether or not it is connected to a traffic information server such as an external server or a VICS center in which the interruption history of the autonomous driving control of other vehicles is stored.” (equates to the traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode in the time zone of the congestion as the automated driving of other vehicles is stored as seen above and the interruption history including congestion time zone information provided by the VICS would be given to the CPU of this art.)) and a congestion time-zone information acquiring unit configured to acquire, based on the database, the traveling trajectory in which the own vehicle has traveled in the automated driving mode or the driving assist mode in a time zone of congestion (Pg.
23 – [0098] – “…CPU41 determines whether or not it is connected to a traffic information server such as an external server or a VICS center in which the interruption history of the autonomous driving control of other vehicles is stored.” (equates to the traveling trajectory in which the own vehicle has traveled in the automated driving mode or the driving assist mode in a time zone of congestion as the automated driving of other vehicles is stored as seen above and the interruption history including congestion time zone information provided by the VICS would be given to the CPU of this art.)) and the traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode in the time zone of the congestion, (Pg. 23 – [0098] – “…CPU41 determines whether or not it is connected to a traffic information server such as an external server or a VICS center in which the interruption history of the autonomous driving control of other vehicles is stored.” (equates to the traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode in the time zone of the congestion as the automated driving of other vehicles is stored as seen above and the interruption history including congestion time zone information provided by the VICS would be given to the CPU of this art.)) wherein the database is configured to store information including passage time information (pg. 25 – [0104] – “When traveling in an autonomous driving section in which autonomous driving control of a car is performed…” (equates to database is configured to store information including passage time information as the autonomous driving section has designated start and stop points within which the autonomous driving can occur.)) and information on a road type, (Pg. 
8 – [0017] – “ Data representing a curvature radius, an intersection, a T-junction, an entrance and an exit of a corner, etc., data representing a downhill road, an uphill road, etc. with respect to a road attribute, and data representing general roads such as a national road, a prefectural road, a narrow street, etc. and toll roads such as a national expressway, an urban expressway, a motorway, a general toll road, a toll bridge, etc. with respect to a road type are respectively recorded.” (equates to database is configured to store road type)) a start point of automated driving, (Pg. 15 – [0050] – “After that, in S14, the CPU41 determines… start position of the autonomous driving control performed”) an end point of the automated driving, (Pg. 2 – [PROBLEM TO BE SOLVED] – “when the vehicle drives within the automatic driving section under the automatic driving control of the vehicle;” (equates to an end point of automated driving as the art describes automated driving sections and the sections have start and end points in which the automated driving occurs within)) and an interruption point of the automated driving that are associated with each other, (Pg. 9 – [0022] - “…a section in which a situation in which it is difficult to cause the vehicle to travel by such automatic driving control occurs in the automatic driving section is set as an interruption section in which the automatic driving control of the vehicle is interrupted and the vehicle is caused to travel by manual driving”) in which the own vehicle has traveled in the automated driving mode or the driving assist mode in the time zone of the congestion (Pg. 15 – [0051] – “CPU41 determines whether or not the autonomous driving control is interrupted. Here, in the present embodiment, a section in which it is difficult to cause the vehicle to travel by the autonomous driving control is set as an interruption section in which the autonomous driving control of the vehicle” & See Also Pg. 
10 – [0028] – “Further, the communication module 18 is a communication device for receiving traffic information, probe information, weather information, and the like transmitted from a traffic information center, for example, a VICS center…” (equates to in which the own vehicle has traveled in the automated driving mode or the driving assist mode in the time zone of the congestion as the first quote shows the autonomous driving taking place until being interrupted and the CPU of this art is tracking the vehicle being in the autonomous driving mode and the second quote showing the VICS center which would give the system of this art the time zone congestion information. )) in which the other vehicle has traveled in the automated driving mode or the driving assist mode in the time zone of the congestion. (Pg. 23 – [0098] – “…CPU41 determines whether or not it is connected to a traffic information server such as an external server or a VICS center in which the interruption history of the autonomous driving control of other vehicles is stored.” (equates to in which the other vehicle has traveled in the automated driving mode or the driving assist mode in the time zone of the congestion as the VICS center provides time zone congestion information as well as the autonomous driving state of the other vehicles in the congestion period.)).
Ma teaches the traveling trajectory in which the own vehicle has traveled (Pg. 12 – [0020] – “providing trajectory histories pertaining to the ego vehicle along with other vehicles and objects” (equates to a traveling trajectory in which the own vehicle has traveled, as the quote shows historical, or previously traveled, trajectories of the own or ego vehicle.))
Yet all fail to teach the drawing processing unit is configured to draw the region corresponding to the traveling trajectory in which the own vehicle has traveled and the traveling trajectory in which the other vehicle has traveled.
Kum teaches a similar driving support system (abstract). Kum teaches the drawing processing unit is configured to draw the region (Pg. 14 – [0035] – “For example, the graph model may be represented as illustrated in FIG. 6(b), and may include a plurality of nodes and a plurality of edges that connect the plurality of nodes. The nodes may indicate the electronic device 100 and surrounding objects, respectively. In this case, the nodes may represent at least any one of a position, speed or a heading angle, for example, based on state information on the electronic device 100 or the surrounding objects” (equates to the drawing processing unit is configured to draw the region, as the quote shows how a graphical representation of the ego vehicle and the surrounding vehicles is plotted on a map based on position, speed, or a heading angle.)) and the traveling trajectory in which the other vehicle has traveled (Pg. 7 – Fig. 6A & See Also Pg. 14 – [0034] – “Referring to FIG. 5, at operation 510, the electronic device 100 may configure a surrounding situation of the electronic device 100 as a graph model.” & See Also Pg. 14 – [0033] – “FIG. 6 is a diagram for describing an operation of configuring a graph model illustrated in FIG. 5.” & See Also Pg. 14 – [0032] – “Accordingly, the processor 180 may recognize historical trajectories of the surrounding objects based on the positions of the surrounding objects on the moving coordinate system based on a constant velocity model” (equates to a traveling trajectory in which the other vehicle has traveled in the automated driving mode or the driving assist mode being plotted on the high-precision map, as the second quote shows an image map that generates past trajectories and the third quote shows how the other vehicles mentioned in quote two can be in an autonomous driving mode.))
It would have been an advantageous addition to the system disclosed by Sato to include the drawing processing unit is configured to draw the region corresponding to the traveling trajectory in which the own vehicle has traveled and the traveling trajectory in which the other vehicle has traveled, to ensure the occupants can see the trajectory of the vehicle they are within and see the past paths other autonomously driven vehicles have taken, ensuring a degree of reliability so that more trust can be put in the vehicle when it is not piloted directly by a human.
Therefore it would have been obvious to one of ordinary skill in the art to configure the driving control system of Sato to include the drawing processing unit is configured to draw the region corresponding to the traveling trajectory in which the own vehicle has traveled and the traveling trajectory in which the other vehicle has traveled, as seeing the trajectories of the vehicle and of other vehicles allows a deeper sense of trust to develop between the occupants and the vehicle they rely upon to move them around.
Regarding claim 7, Sato-Kum-Ma-Nanri-Bauer-Eriksson teaches The driving control system according to claim 6, further comprising: (Sato discloses the following limitations:) a route searching unit configured to search for a route from a current position of the own vehicle to a destination, (Pg. 3 – [0002] – “Further, when a desired destination is input to the navigation device This navigation device is provided with a route search function for searching a recommended route from the position of one's own vehicle to a destination”) based on the high-precision map; (Pg. 3 – [0002] – “…navigation device is a device capable of detecting…by a GPS receiver or the like, acquiring map data… navigation device is provided with a route search function for searching a recommended route…”) and a display unit configured to display information including the searched route, (Pg. 3 – [0002] – “displays the guide route on a display screen”)
Yet Sato-Ma-Nanri-Bauer-Eriksson fails to teach wherein the display unit is configured to display the region drawn by the drawing processing unit in a superimposed manner on the route.
Kum teaches a similar driving support system (abstract). Kum teaches wherein the display unit is configured to display the region drawn by the drawing processing unit in a superimposed manner on the route. (Pg. 13 – [0023] – “The output module 150 may provide information to the outside of the electronic device 100. In this case, the output module 150 may include at least any one of a display module or an audio module. The display module may visually output information. For example, the display module may include at least any one of a display, a hologram device, or a projector. In an embodiment, the display module may be assembled with at least any one of the touch circuitry or sensor circuitry of the input module 140, and may be implemented as a touch screen” & See Also Pg. 13 – [0028] – “As illustrated in FIG. 3, the processor 180 may predict future trajectories of surrounding objects in an integrated way based on the recognized historical trajectories of the surrounding objects. In this case, the processor 180 may predict the future trajectories by integrating and estimating interactions between the surrounding objects and the electronic device 100, and. In this case, the processor 180 may estimate the interactions in an integrated way based on characteristics of the interactions” & See Also Pg. 15 – [0042] – “Thereafter, the electronic device 100 may return to FIG. 5 and may perform operation 530. Referring back to FIG. 5, at operation 530, the electronic device 100 may plan the driving trajectory of the electronic device 100 based on the predicted future trajectories of the surrounding objects.
The processor 180 may plan an optimal driving trajectory which may correspond to the predicted future trajectories of the surrounding objects.” (equates to wherein the display unit is configured to display the region drawn by the drawing processing unit in a superimposed manner on the route, as the first quote shows the display unit being configured to display the surroundings of the host vehicle and the following quotes show the trajectories being calculated and used for guiding the own or host vehicle based on the surrounding objects and their associated trajectories.)). It would have been an advantageous addition to the system disclosed by Sato-Ma-Nanri-Eriksson to include wherein the display unit is configured to display the region drawn by the drawing processing unit in a superimposed manner on the route, as this limitation allows the trajectories of the host vehicle and the other vehicles to be included with the route originally generated for the host vehicle, letting a wide variety of options be seen by the occupants.
Therefore it would have been obvious to one of ordinary skill in the art to configure the driving control system of Sato to include wherein the display unit is configured to display the region drawn by the drawing processing unit in a superimposed manner on the route, as this allows a plethora of options to be seen in terms of what routes have been and will be taken by the host vehicle based on others travelling a similar path.
Response to Argument
Regarding the 35 U.S.C. § 103 rejection of claims 1-10: applicant’s amendments to the claims change their scope. Applicant’s arguments have been considered but are not persuasive.
Applicant argues on pages 2-3, “Absence of the claimed feature in Sato
Sato does not teach the above limitations of claim 1.
Sato discloses predefined "autonomous driving sections" stored in a map database. For example, Sato teaches that "an expressway, a freeway, a toll road, and an ordinary road may be set as the autonomous driving section" (Sato ¶ [0021]) and that "CPU41 determines whether or not the car is within the autonomous driving section based on the current position of the car detected by the current position detection unit 11 and the map information" (Sato ¶ [0086]).
These passages make clear that Sato's "region" is a fixed roadway section, predetermined and stored in the map. Sato contains no disclosure of plotting trajectories of the own vehicle or other vehicles, nor of drawing a region based on such plotted trajectories segmented by driving control mode, as recited in claim 1. Indeed, the Office acknowledges this deficiency, stating that Sato "fails to teach...a drawing processing unit configured to plot...trajectories...and draw, based on the plotted trajectories, a region" (the Office Action, p. 10). ” – In response to point (a) the argument is made moot in light of new grounds of rejection as supplied above.
Applicant argues on page 3, “None of Kum, Ma, and Nanri teaches the deficiencies of Sato
The additional references do not cure the above deficiency.
Kum discloses constructing a graph model of the ego vehicle and surrounding objects: "the graph model ... may include a plurality of nodes and a plurality of edges... the nodes may represent at least any one of a position, speed or a heading angle" (Kum ¶ [0035]). While this paragraph describes an abstract relational structure, it does not disclose drawing a control-use region derived from trajectories.
Ma states that "trajectory histories pertaining to the ego vehicle along with other vehicles and objects" may be provided (Ma 1[0020]). This describes storing past trajectories, but there is no teaching of drawing a region, nor of filtering trajectory segments based on automated or assist driving modes.
Nanri teaches detecting whether another vehicle is in an autonomous driving mode: "determines whether the other vehicle is in an autonomous driving mode depending on the driving characteristics of the other vehicle, and detects an action of the other vehicle" (Nanri Abstract; see also 1[0038]). However, Nanri contains no disclosure of incorporating such mode determinations into a map-based region generation process.
Accordingly, none of Kum, Ma, or Nanri discloses or suggests the missing element of Sato, namely: a drawing processing unit is configured to: plot, on the high-precision map, a traveling trajectory in which the own vehicle has traveled in an automated driving mode or a driving assist mode and a traveling trajectory in which the another vehicle has traveled in the automated driving mode or the driving assist mode, based on the acquired information including the position information of the another vehicle and the driving control mode of the another vehicle, the position information of the own vehicle, and information indicating a driving control mode of the own vehicle; and draw, based on the plotted trajectories, a region defining an area in which automated driving or driving assist control is permitted; [and] a driving control mode determining unit configured to determine, based on the position information of the own vehicle and a region drawn by the drawing processing unit, whether the own vehicle is inside or outside the region. ” – As to point (a), the examiner respectfully disagrees. Applicant asserts that none of Sato, Kum, Ma, and Nanri teaches “a drawing processing unit is configured to: plot, on the high-precision map, a traveling trajectory in which the own vehicle has traveled in an automated driving mode or a driving assist mode and a traveling trajectory in which the another vehicle has traveled in the automated driving mode or the driving assist mode, based on the acquired information including the position information of the another vehicle and the driving control mode of the another vehicle, the position information of the own vehicle, and information indicating a driving control mode of the own vehicle;”. During patent examination, pending claims must be given their broadest reasonable interpretation consistent with the specification (see MPEP 2111).
The broadest reasonable interpretation of the aforementioned claim limitation is plotting on a map a path of travel in which a user vehicle and another vehicle have traveled in a driver-assisted mode of transit. Sato teaches acquiring the trajectory of the user vehicle when it is determined to be traveling in an autonomous driving mode. Nanri teaches acquiring other vehicles around the user vehicle that travel in an autonomous mode and tracking the positions of the other vehicles (as mapped above in claim 1). Therefore, the examiner respectfully disagrees with applicant’s arguments and asserts that Sato-Nanri teaches “a drawing processing unit is configured to: plot, on the high-precision map, a traveling trajectory in which the own vehicle has traveled in an automated driving mode or a driving assist mode and a traveling trajectory in which the another vehicle has traveled in the automated driving mode or the driving assist mode, based on the acquired information including the position information of the another vehicle and the driving control mode of the another vehicle, the position information of the own vehicle, and information indicating a driving control mode of the own vehicle;”.
Sato teaches: own vehicle has traveled in an automated driving mode or a driving assist mode (Pg. 5 – [0009] – “control content acquisition unit (41) configured to, when the vehicle travels in an autonomous driving section in which autonomous driving control of the vehicle is performed, acquire control content of the autonomous driving control of the vehicle in the autonomous driving section;”). Nanri teaches: based on the acquired information including the position information of the another vehicle and the driving control mode of the another vehicle (Pg. 1 – Abstract – “determines whether the other vehicle is in an autonomous driving mode depending on the driving characteristics of the other vehicle, and detects an action of the other vehicle in accordance with the determination result of whether the other vehicle is in the autonomous driving mode.” & see also Pg. 12 – [0038] – “The object tracking unit 2b tracks each object detected by the object detection device 1. In particular, the object tracking unit 2b determines the sameness of the object (mapping) detected at intervals in accordance with the behavior of the object output at different times” (this equates to “based on the acquired information including the position information of the another vehicle and the driving control mode of the another vehicle,” as the first quote shows the driving mode of the other vehicle being captured and the position being kept by an object tracking unit wherein mapping data is collected about objects, including surrounding detected vehicles)); and other vehicle has traveled in the automated driving mode or the driving assist mode (Pg. 1 – Abstract – “determines whether the other vehicle is in an autonomous driving mode depending on the driving characteristics of the other vehicle, and detects an action of the other vehicle in accordance with the determination result of whether the other vehicle is in the autonomous driving mode.” (this equates to “other vehicle has traveled in the automated driving mode or the driving assist mode,” as the art shows the other vehicles around the host vehicle being detected to be in an autonomous driving mode, and thus traveling in an autonomous driving mode if detected as such)).
Applicant argues on page 7, “Lack of motivation to modify Sato
Even assuming arguendo that Kum, Ma, or Nanri disclose elements related to trajectories or vehicle modes, there is no teaching or suggestion that would motivate a skilled artisan to modify Sato's system.
Sato's use of autonomous driving sections is complete and self-sufficient. The system determines entry and exit based on map data (Sato ¶ [0086]) and controls automated driving accordingly. The Office has not identified any problem in Sato's approach that would lead a person skilled in the art to discard these predefined sections in favor of dynamically drawn regions based on trajectory data to arrive at the claimed system.
As established in KSR Int'l v. Teleflex, 550 U.S. 398 (2007), an obviousness determination rests on "articulated reasoning with some rational underpinning." Here, no such reasoning has been provided. Simply assembling unrelated disclosures from Kum (graph models), Ma (trajectory histories), and Nanri (mode detection) does not amount to a teaching or suggestion to replace Sato's autonomous sections with the claimed trajectory-based regions.
” – As to point (c), see the response to point (a) above.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Kolbe (JP2023519617A) relates to “a method for planning a target trajectory (TSoll) to be traveled in an automated manner by a vehicle (1),” in which a discrete set of trajectories (T) is determined as candidate target trajectories (TSoll), each of the trajectories (T) is composed of a plurality of arranged track segments (TR, TR(0,0)(1,1)), and the planning selects one of the determined trajectories (T), said selection comprising evaluating the trajectories (T) with a predefined cost function (K) and identifying the trajectory (T) evaluated as most cost-effective. According to the invention, each track segment (TR, TR(0,0)(1,1)) is associated with a plurality of predefined sub-trajectories, each having the same position specification and different dynamics specifications.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to REECE ANTHONY WAKELY whose telephone number is (571)272-3783. The examiner can normally be reached Monday - Friday 8:30am-6:00pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Hitesh Patel can be reached on (571) 270-5442. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/R.A.W./ Examiner, Art Unit 3667C
/Hitesh Patel/ Supervisory Patent Examiner, Art Unit 3667
2/5/26