DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/15/2026 has been entered.
Status of Claims
Claims 1-2 and 4-8 of U.S. Application No. 18/228299 filed on 01/02/2026 have been examined.
This Office Action is in response to the Applicant's amendments and remarks filed 01/02/2026. Claims 1 and 7 are presently amended, and claim 3 is cancelled. Claims 1-2 and 4-8 are presently pending and are presented for examination.
Response to Arguments
Regarding the previous rejection under 35 U.S.C. § 103: Applicant’s arguments with respect to the independent claim(s) have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. A new ground of rejection is made in view of US 2020/0377116A1 (“Abrashov”).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-2, 4-5, and 7 are rejected under 35 U.S.C. 103 as being unpatentable over US 2018/0141570A1 (“Kimura”) in view of US 2021/0064030A1 (“Jiang”), further in view of US 2021/0016771A1 (“Ginther”), and further in view of US 2020/0377116A1 (“Abrashov”).
As per claim 1, Kimura discloses
A computation apparatus for computing a travel route of a vehicle (see at least Kimura, para. [0051]: The autonomous driving ECU 3 generates a travel plan for allowing the vehicle 2 to travel along the target route.), the computation apparatus comprising:
at least one processor circuit with a memory comprising instructions that cause, when executed by the processor circuit, the at least one processor circuit to at least (see at least Kimura, para. [0048]: The autonomous driving ECU 3 is an electronic control unit having a computing unit such as a Central Processing Unit (CPU), storage units such as a Read Only Memory (ROM) and a Random Access Memory (RAM), and a Controller Area Network (CAN) communication circuit.):
acquire vehicle state information (i1) indicating a state of the vehicle (see at least Kimura, para. [0049]: The autonomous driving ECU 3 is connected to a map database (not shown) that stores map information, a positioning unit (not shown) that determines the position of the vehicle 2 on the map by the Global Positioning System (GPS), various sensors (not shown) that detect the traveling state of the vehicle 2, and various actuators (not shown) that cause the vehicle 2 to travel.);
acquire vehicle periphery information (i2) indicating a state around the vehicle (see at least Kimura, para. [0053]: The radar sensor 5, provided, for example, at the front end of the vehicle 2, uses radio waves (or light) to detect an obstacle ahead of the vehicle 2 (including an obstacle obliquely in front of the vehicle 2). The radar sensor 5 detects an obstacle by sending radio waves in the forward direction of the vehicle 2 then receiving radio waves reflected by the obstacle such as another vehicle. The radar sensor 5 sends the obstacle information on the detected obstacle to the ECU 10.);
acquire driver state information (i3) indicating a state of a driver of the vehicle (see at least Kimura, para. [0054]: The driver monitor camera 6, provided, for example, on the cover of the steering column of the vehicle 2 and in the position in front of the driver, captures the image of the driver's face (see FIG. 2A). FIG. 2A shows the capturing range Dp of the driver monitor camera 6. The driver monitor camera 6 sends the captured information on the driver Dr to the ECU 10.);
calculate an actual route (RTa) of the vehicle based on the vehicle state information and the vehicle periphery information (see at least Kimura, para. [0051]: The autonomous driving ECU 3 generates a travel plan for allowing the vehicle 2 to travel along the target route. The travel plan includes, for example, a steering target value and a vehicle speed target value that are set for each predetermined distance on the target route. The autonomous driving ECU 3 uses a known method to generate a travel plan. The autonomous driving ECU 3 autonomously drives the vehicle 2 according to the travel plan based on the position information on the vehicle 2 on the map positioned by the positioning unit. The autonomous driving ECU 3 sends control signals to various actuators to control the vehicle 2 for autonomously driving the vehicle.);
the signal includes: a first notification signal for notifying the driver (see at least Kimura, para. [0116]: When performing the first warning, the warning control unit 106B of the vehicle system 1B causes the warning unit 20 to perform the first warning that is either a vibration generated by the vibration generation unit 203 or a sound output by the sound output unit 202,), and
a second notification signal for notifying the driver, having a higher notification level than the first notification signal (see at least Kimura, para. [0117]: The second warning is a warning that includes both a vibration, generated by the vibration generation unit 203, and a sound output by the sound output unit 202…On the other hand, the warning control unit 106B determines a driver who does not turn the face toward the front in response to a light stimulus and is not holding the steering wheel as being a driver whose wakefulness level is extremely low or as a driver whose driving awareness level is extremely low. For such a driver, the warning control unit 106B outputs the second warning stronger than the first warning. The second warning is a warning that has a warning level higher than that of the first warning or a warning that excites more senses than the first warning.),
which is output in a case where the confirmation motion is not made for a predetermined period (see at least Kimura, para. [0117]: The warning control unit 106B causes the warning unit 20 to perform a second warning while a light stimulus is being presented if it is determined by the first determination unit 107 that the driver is not facing forward and if it is determined by the second determination unit 108 that the driver does not hold the steering wheel before a predetermined time elapses after the first warning is started.).
However Kimura does not explicitly disclose
obtain a predicted route (RTb) by correcting the actual route based on the driver state information; and
output a signal (SIG1) to a display apparatus (7) in a case where the predicted route satisfies a predetermined condition;
wherein the driver state information includes confirmation motion information (i33) indicating whether or not a confirmation motion is made by the driver,
the output of the signal is suppressed based on the confirmation motion information after outputting the signal,
the predetermined condition includes:
an object around the vehicle indicated by the vehicle periphery information is located on the predicted route, and/or
in a case where the object is another vehicle, its travel route calculated based on the vehicle periphery information intersects the predicted route, and
the actual route and the predicted route are individually displayed on the display apparatus.
Jiang teaches
obtain a predicted route (RTb) by correcting the actual route based on the driver state information (see at least Jiang, para. [0051]: For example, during traveling along the planned route, the driver's intention had shifted to another one, the processing unit determines whether to update/change the en-route goal to the second target according to, e.g., whether the original target is closer than the second target, whether it is feasible/safe to change to the second target, whether it is urgent to change the goal, whether it is quicker to move to the original target or the second target, or the combination of the above….Alternatively, the en-route goal could be changed/updated to the second target immediately, and therefore a new route is planned, and the original route is abandoned.); and
output a signal (SIG1) to a display apparatus (7) in a case where the predicted route satisfies a predetermined condition (see at least Jiang, para. [0022]: The processing unit 130 is coupled to the driver interface 110, and the sensing unit 120. The processing unit 130 may process the input signals, data and instructions. para. [0030]: In some other embodiments, the driver assistance system 100 further includes an audible unit configured to warn, notify or acknowledge the driver regarding the creation or update of the en-route goal. para. [0053]: Taking FIG. 9 for example, the vehicle 900 is traveling on the road 990. The en-route goal is determined to be the shop 960, and the route 972 is planned. At the time of the planning, it is feasible and safe to switch lanes from lane L1 to lane L2. However, during the vehicle is moving, the nearby vehicle 940 is approaching such that it is not safe for the driver to switch lanes. As such, the en-route goal is updated to the next shop 962; and thus the updated route 974 is planned. It is noted that these scenarios are for illustration purpose only, the en-route goal determination and the route planning process are not limited thereto.);
the predetermined condition includes: an object around the vehicle indicated by the vehicle periphery information is located on the predicted route, and/or in a case where the object is another vehicle, its travel route calculated based on the vehicle periphery information intersects the predicted route (see at least Jiang, para. [0053]: Taking FIG. 9 for example, the vehicle 900 is traveling on the road 990. The en-route goal is determined to be the shop 960, and the route 972 is planned. At the time of the planning, it is feasible and safe to switch lanes from lane L1 to lane L2. However, during the vehicle is moving, the nearby vehicle 940 is approaching such that it is not safe for the driver to switch lanes. As such, the en-route goal is updated to the next shop 962; and thus the updated route 974 is planned. It is noted that these scenarios are for illustration purpose only, the en-route goal determination and the route planning process are not limited thereto.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kimura to incorporate the teaching of obtain a predicted route (RTb) by correcting the actual route based on the driver state information; and output a signal (SIG1) to a display apparatus (7) in a case where the predicted route satisfies a predetermined condition, the predetermined condition includes an object around the vehicle indicated by the vehicle periphery information is located on the predicted route, and/or in a case where the object is another vehicle, its travel route calculated based on the vehicle periphery information intersects the predicted route, of Jiang, with a reasonable expectation of success, in order to estimate the driver's intention and provide the updated route such that the operation could be smoothly executed, and thus enable a more efficient communication between the driver and the vehicle (see at least Jiang, para. [0039]).
Ginther teaches
output a signal in a case where the predicted route satisfies a predetermined condition (see at least Ginther, para. [0015]: The one or more forward travel sensors 68 are operable to detect a detrimental riding situation in the motorcycle's travel path (e.g., a vehicle, animal, or other object, or various road-based hazards such as potholes or bridge grates within a predetermined range of the motorcycle's forward travel path) and output a corresponding signal to the controller 64.);
wherein the driver state information includes confirmation motion information (i33) indicating whether or not a confirmation motion is made by the driver (see at least Ginther, para. [0036-0037]: Once the controller 64 determines that the rider is positively physically engaged with the motorcycle 10 at step 216, the method proceeds to step 218 where it must be determined whether the rider is alert and paying attention. In other words, the controller 64 performs or acts upon a cognitive analysis of the rider.),
the output of the signal is suppressed based on the confirmation motion information after outputting the signal (see at least Ginther, para. [0037]: If at step 218, the rider is found to have sufficient cognitive engagement in the riding activity, i.e., the rider is determined to be vigilant, the method proceeds to step 220 where the controller 64 sends a signal to the throttle actuator 72 to automatically close the engine throttle (e.g., an override of a rider-input throttle position) and to automatically engage a brake actuator (e.g., brake actuator 56 or other without any required input from the rider) to actuate the brake 48 to achieve maximum deceleration (e.g., to engage ABS).).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kimura to incorporate the teaching of output a signal in a case where the predicted route satisfies a predetermined condition, wherein the driver state information includes confirmation motion information (i33) indicating whether or not a confirmation motion is made by the driver, the output of the signal is suppressed based on the confirmation motion information after outputting the signal, of Ginther, with a reasonable expectation of success, in order to limit the actual implementation of the autonomous braking to times when the rider is judged to be capable of managing the consequences and maintaining control of the motorcycle throughout the actual autonomous braking event (see at least Ginther, para. [0041]).
Abrashov teaches
the actual route and the predicted route are individually displayed on the display apparatus (see at least Abrashov, para. [0058-0060]: Then, the optimum trajectory TO and actual current trajectory TE are materialized on a medium EA with an aspect that depends on the determined value of the parameter. This materialization is triggered by the calculation means MC and performed in the sub-step 30 of the algorithm of FIG. 2… Thus, it can be done by displaying different lines on a medium EA which is selected from a display screen equipping the vehicle VA and a windshield of the vehicle VA. In the non-limiting example shown in FIGS. 1 and 3, the materialization is accomplished by displaying of images IA on a screen EA of the central screen CC located in (or on) the dashboard PB of the vehicle VA. Alternatively or in addition, it could be accomplished by displaying images IA on a screen of the vehicle dashboard VA or on the windshield when the vehicle VA has a “head up” display device.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kimura to incorporate the teaching of the actual route and the predicted route are individually displayed on the display apparatus, of Abrashov, with a reasonable expectation of success, in order to inform the driver of the influence of their manual action on the actual trajectory (see at least Abrashov, para. [0060]).
As per claim 2, Kimura discloses
driver line-of-sight information indicating a direction of a line of sight of the driver (see at least Kimura, para. [0058]: FIG. 2A shows the driver monitor camera 6, the display projection unit 201, a driver Dr, a ground line G corresponding to the ground, a height Eh of a driver's eye point Ep, a straight line Hp extending in the longitudinal direction of the vehicle 2 via the driver's eye point Ep, a light stimulus P, a straight line Hu joining the driver's eye point Ep and the upper end of the light stimulus P, an angle θe formed by the straight line Hp and the straight line Hu, and a distance Lp from the driver's eye point Ep to the front end of the vehicle 2. The driver's eye point Ep is, for example, a virtual point (one point) representing the position of the eyes of the driver Dr in the normal driving state.).
However Kimura does not explicitly disclose
wherein the driver state information further includes driver posture information indicating a posture of the driver.
Ginther teaches
wherein the driver state information includes driver posture information indicating a posture of the driver (see at least Ginther, para. [0035]: Optionally, the check at step 216 can also require detection of the rider being seated on the seat 28. This can include determining whether a weight above a minimum threshold is exerted upon the seat 28 to result in a positive check.), and
driver line-of-sight information indicating a direction of a line of sight of the driver (see at least Ginther, para. [0018]: The rider cognition sensor 76 can be a camera operable with the controller 64 to perform facial recognition and interpretation, and/or identifying and tracking the rider's eyes. Thus, the rider cognition sensor 76 can collect data indicative of where the rider is looking and/or whether the rider's eyes are open and looking up at the forward travel path.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kimura to incorporate the teaching of wherein the driver state information further includes driver posture information indicating a posture of the driver of Ginther, with a reasonable expectation of success, in order to limit the actual implementation of the autonomous braking to times when the rider is judged to be capable of managing the consequences and maintaining control of the motorcycle throughout the actual autonomous braking event (see at least Ginther, para. [0041]).
As per claim 4, Kimura does not explicitly disclose
wherein the confirmation motion includes a motion of directing the line of sight of the driver to the object around the vehicle indicated by the vehicle periphery information.
Ginther teaches
wherein the confirmation motion includes a motion of directing the line of sight of the driver to the object around the vehicle indicated by the vehicle periphery information (see at least Ginther, para. [0036]: Once the controller 64 determines that the rider is positively physically engaged with the motorcycle 10 at step 216, the method proceeds to step 218 where it must be determined whether the rider is alert and paying attention. In other words, the controller 64 performs or acts upon a cognitive analysis of the rider. This can involve interpreting signals output from sensors including, for example, the rider cognition sensor 76 and/or the helmet-based rider cognition sensor 84.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kimura to incorporate the teaching of wherein the confirmation motion includes a motion of directing the line of sight of the driver to the object around the vehicle indicated by the vehicle periphery information of Ginther, with a reasonable expectation of success, in order to limit the actual implementation of the autonomous braking to times when the rider is judged to be capable of managing the consequences and maintaining control of the motorcycle throughout the actual autonomous braking event (see at least Ginther, para. [0041]).
As per claim 5, Kimura does not explicitly disclose
wherein the actual route is updated at a predetermined cycle based on the vehicle state information and the vehicle periphery information, and thereby the predicted route is updated based on driver state information, and the output of the signal is suppressed in a case where the updated predicted route no longer satisfies the predetermined condition after outputting the signal.
Ginther teaches
wherein the actual route is updated at a predetermined cycle based on the vehicle state information and the vehicle periphery information, and the output of the signal is suppressed in a case where the updated predicted route no longer satisfies the predetermined condition after outputting the signal (see at least Ginther, para. [0014-0015] & para. [0034]: After start-up, the method proceeds to box 204 where inputs from vehicle travel sensors (e.g., forward travel sensor(s) 68 and wheel speed sensor 60) are monitored by the controller 64…If no front end collision is imminent, the method returns to step 204 in a cycle of continuous or periodic monitoring. If a collision is imminent as determined at step 206, this provides a trigger for an autonomous braking event, and the method proceeds to step 208 where it is determined whether the rider is already applying the brakes…If maximum deceleration is already being achieved through rider-applied braking, the controller 64 disregards the autonomous braking trigger—the method ends at box 212 and no further intervention is made.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kimura to incorporate the teaching of wherein the actual route is updated at a predetermined cycle based on the vehicle state information and the vehicle periphery information, and the output of the signal is suppressed in a case where the updated predicted route no longer satisfies the predetermined condition after outputting the signal, of Ginther, with a reasonable expectation of success, in order to limit the actual implementation of the autonomous braking to times when the rider is judged to be capable of managing the consequences and maintaining control of the motorcycle throughout the actual autonomous braking event (see at least Ginther, para. [0041]).
Jiang teaches
thereby the predicted route is updated based on driver state information (see at least Jiang, para. [0051]: For example, during traveling along the planned route, the driver's intention had shifted to another one, the processing unit determines whether to update/change the en-route goal to the second target according to, e.g., whether the original target is closer than the second target, whether it is feasible/safe to change to the second target, whether it is urgent to change the goal, whether it is quicker to move to the original target or the second target, or the combination of the above….Alternatively, the en-route goal could be changed/updated to the second target immediately, and therefore a new route is planned, and the original route is abandoned.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kimura to incorporate the teaching of thereby the predicted route is updated based on driver state information of Jiang, with a reasonable expectation of success, in order to estimate the driver's intention and provide the updated route such that the operation could be smoothly executed, and thus enable a more efficient communication between the driver and the vehicle (see at least Jiang, para. [0039]).
As per claim 7, Kimura discloses
A vehicle comprising a computation apparatus for computing a travel route of a vehicle (see at least Kimura, para. [0051]: The autonomous driving ECU 3 generates a travel plan for allowing the vehicle 2 to travel along the target route.); and
a wheel (see at least Kimura, para. [0055]: An example of the vehicle speed sensor 7 is a wheel speed sensor that is provided on the wheels of the vehicle 2…),
the computation apparatus comprising at least one processor circuit with a memory comprising instructions that cause, when executed by the processor circuit, the at least one processor circuit to at least (see at least Kimura, para. [0048]: The autonomous driving ECU 3 is an electronic control unit having a computing unit such as a Central Processing Unit (CPU), storage units such as a Read Only Memory (ROM) and a Random Access Memory (RAM), and a Controller Area Network (CAN) communication circuit.):
acquire vehicle state information (i1) indicating a state of the vehicle (see at least Kimura, para. [0049]: The autonomous driving ECU 3 is connected to a map database (not shown) that stores map information, a positioning unit (not shown) that determines the position of the vehicle 2 on the map by the Global Positioning System (GPS), various sensors (not shown) that detect the traveling state of the vehicle 2, and various actuators (not shown) that cause the vehicle 2 to travel.);
acquire vehicle periphery information (i2) indicating a state around the vehicle (see at least Kimura, para. [0053]: The radar sensor 5, provided, for example, at the front end of the vehicle 2, uses radio waves(or light) to detect an obstacle ahead of the vehicle 2 (including an obstacle obliquely in front of the vehicle 2). The radar sensor 5 detects an obstacle by sending radio waves in the forward direction of the vehicle 2 then receiving radio waves reflected by the obstacle such as another vehicle. The radar sensor 5 sends the obstacle information on the detected obstacle to the ECU 10.);
acquire driver state information (i3) indicating a state of a driver of the vehicle (see at least Kimura, para. [0054]: The driver monitor camera 6, provided, for example, on the cover of the steering column of the vehicle 2 and in the position in front of the driver, captures the image of the driver's face (see FIG. 2A). FIG. 2A shows the capturing range Dp of the driver monitor camera 6. The driver monitor camera 6 sends the captured information on the driver Dr to the ECU 10.);
calculate an actual route (RTa) of the vehicle based on the vehicle state information and the vehicle periphery information, (see at least Kimura, para. [0051]: The autonomous driving ECU 3 generates a travel plan for allowing the vehicle 2 to travel along the target route. The travel plan includes, for example, a steering target value and a vehicle speed target value that are set for each predetermined distance on the target route. The autonomous driving ECU 3 uses a known method to generate a travel plan. The autonomous driving ECU 3 autonomously drives the vehicle 2 according to the travel plan based on the position information on the vehicle 2 on the map positioned by the positioning unit. The autonomous driving ECU 3 sends control signals to various actuators to control the vehicle 2 for autonomously driving the vehicle.); and
the signal includes: a first notification signal for notifying the driver (see at least Kimura, para. [0116]: When performing the first warning, the warning control unit 106B of the vehicle system 1B causes the warning unit 20 to perform the first warning that is either a vibration generated by the vibration generation unit 203 or a sound output by the sound output unit 202,), and
a second notification signal for notifying the driver, having a higher notification level than the first notification signal (see at least Kimura, para. [0117]: The second warning is a warning that includes both a vibration, generated by the vibration generation unit 203, and a sound output by the sound output unit 202…On the other hand, the warning control unit 106B determines a driver who does not turn the face toward the front in response to a light stimulus and is not holding the steering wheel as being a driver whose wakefulness level is extremely low or as a driver whose driving awareness level is extremely low. For such a driver, the warning control unit 106B outputs the second warning stronger than the first warning. The second warning is a warning that has a warning level higher than that of the first warning or a warning that excites more senses than the first warning.),
which is output in a case where the confirmation motion is not made for a predetermined period (see at least Kimura, para. [0117]: The warning control unit 106B causes the warning unit 20 to perform a second warning while a light stimulus is being presented if it is determined by the first determination unit 107 that the driver is not facing forward and if it is determined by the second determination unit 108 that the driver does not hold the steering wheel before a predetermined time elapses after the first warning is started.).
However Kimura does not explicitly disclose
obtain a predicted route (RTb) by correcting the actual route based on the driver state information; and
output a signal (SIG1) to a display apparatus (7) in a case where the predicted route satisfies a predetermined condition;
wherein the driver state information includes confirmation motion information (i33) indicating whether or not a confirmation motion is made by the driver,
the output of the signal is suppressed based on the confirmation motion information after outputting the signal,
the predetermined condition includes:
an object around the vehicle indicated by the vehicle periphery information is located on the predicted route, and/or
in a case where the object is another vehicle, its travel route calculated based on the vehicle periphery information intersects the predicted route, and
the actual route and the predicted route are individually displayed on the display apparatus.
Jiang teaches
obtain a predicted route (RTb) by correcting the actual route based on the driver state information (see at least Jiang, para. [0051]: For example, during traveling along the planned route, the driver's intention had shifted to another one, the processing unit determines whether to update/change the en-route goal to the second target according to, e.g., whether the original target is closer than the second target, whether it is feasible/safe to change to the second target, whether it is urgent to change the goal, whether it is quicker to move to the original target or the second target, or the combination of the above….Alternatively, the en-route goal could be changed/updated to the second target immediately, and therefore a new route is planned, and the original route is abandoned.); and
output a signal (SIG1) to a display apparatus (7) in a case where the predicted route satisfies a predetermined condition (see at least Jiang, para. [0022]: The processing unit 130 is coupled to the driver interface 110, and the sensing unit 120. The processing unit 130 may process the input signals, data and instructions. para. [0030]: In some other embodiments, the driver assistance system 100 further includes an audible unit configured to warn, notify or acknowledge the driver regarding the creation or update of the en-route goal. para. [0053]: Taking FIG. 9 for example, the vehicle 900 is traveling on the road 990. The en-route goal is determined to be the shop 960, and the route 972 is planned. At the time of the planning, it is feasible and safe to switch lanes from lane L1 to lane L2. However, during the vehicle is moving, the nearby vehicle 940 is approaching such that it is not safe for the driver to switch lanes. As such, the en-route goal is updated to the next shop 962; and thus the updated route 974 is planned. It is noted that these scenarios are for illustration purpose only, the en-route goal determination and the route planning process are not limited thereto.);
the predetermined condition includes: an object around the vehicle indicated by the vehicle periphery information is located on the predicted route, and/or in a case where the object is another vehicle, its travel route calculated based on the vehicle periphery information intersects the predicted route (see at least Jiang, para. [0053]: Taking FIG. 9 for example, the vehicle 900 is traveling on the road 990. The en-route goal is determined to be the shop 960, and the route 972 is planned. At the time of the planning, it is feasible and safe to switch lanes from lane L1 to lane L2. However, during the vehicle is moving, the nearby vehicle 940 is approaching such that it is not safe for the driver to switch lanes. As such, the en-route goal is updated to the next shop 962; and thus the updated route 974 is planned. It is noted that these scenarios are for illustration purpose only, the en-route goal determination and the route planning process are not limited thereto.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kimura to incorporate the teaching of obtaining a predicted route (RTb) by correcting the actual route based on the driver state information; and outputting a signal (SIG1) to a display apparatus (7) in a case where the predicted route satisfies a predetermined condition, the predetermined condition including: an object around the vehicle indicated by the vehicle periphery information is located on the predicted route, and/or, in a case where the object is another vehicle, its travel route calculated based on the vehicle periphery information intersects the predicted route, of Jiang, with a reasonable expectation of success, in order to estimate the driver's intention and provide the updated route such that the operation could be smoothly executed, thus enabling more efficient communication between the driver and the vehicle (see at least Jiang, para. [0039]).
Ginther teaches
output a signal in a case where the travel route satisfies a predetermined condition (see at least Ginther, para. [0015]: The one or more forward travel sensors 68 are operable to detect a detrimental riding situation in the motorcycle's travel path (e.g., a vehicle, animal, or other object, or various road-based hazards such as potholes or bridge grates within a predetermined range of the motorcycle's forward travel path) and output a corresponding signal to the controller 64.);
wherein the driver state information includes confirmation motion information (i33) indicating whether or not a confirmation motion is made by the driver (see at least Ginther, para. [0036-0037]: Once the controller 64 determines that the rider is positively physically engaged with the motorcycle 10 at step 216, the method proceeds to step 218 where it must be determined whether the rider is alert and paying attention. In other words, the controller 64 performs or acts upon a cognitive analysis of the rider.),
the output of the signal is suppressed based on the confirmation motion information after outputting the signal (see at least Ginther, para. [0037]: If at step 218, the rider is found to have sufficient cognitive engagement in the riding activity, i.e., the rider is determined to be vigilant, the method proceeds to step 220 where the controller 64 sends a signal to the throttle actuator 72 to automatically close the engine throttle (e.g., an override of a rider-input throttle position) and to automatically engage a brake actuator (e.g., brake actuator 56 or other without any required input from the rider) to actuate the brake 48 to achieve maximum deceleration (e.g., to engage ABS).).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kimura to incorporate the teaching of wherein the driver state information includes confirmation motion information (i33) indicating whether or not a confirmation motion is made by the driver, and the output of the signal is suppressed based on the confirmation motion information after outputting the signal, of Ginther, with a reasonable expectation of success, in order to limit the actual implementation of the autonomous braking to times when the rider is judged to be capable of managing the consequences and maintaining control of the motorcycle throughout the actual autonomous braking event (see at least Ginther, para. [0041]).
Abrashov teaches
the actual route and the predicted route are individually displayed on the display apparatus (see at least Abrashov, para. [0058-0060]: Then, the optimum trajectory TO and actual current trajectory TE are materialized on a medium EA with an aspect that depends on the determined value of the parameter. This materialization is triggered by the calculation means MC and performed in the sub-step 30 of the algorithm of FIG. 2… Thus, it can be done by displaying different lines on a medium EA which is selected from a display screen equipping the vehicle VA and a windshield of the vehicle VA. In the non-limiting example shown in FIGS. 1 and 3, the materialization is accomplished by displaying of images IA on a screen EA of the central screen CC located in (or on) the dashboard PB of the vehicle VA. Alternatively or in addition, it could be accomplished by displaying images IA on a screen of the vehicle dashboard VA or on the windshield when the vehicle VA has a “head up” display device.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kimura to incorporate the teaching of the actual route and the predicted route are individually displayed on the display apparatus, of Abrashov, with a reasonable expectation of success, in order to inform the driver of the influence of their manual action on the actual trajectory (see at least Abrashov, para. [0060]).
Claim(s) 6 & 8 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kimura, in view of Jiang, in view of Ginther, in view of Abrashov, in view of US 2023/0339494A1 (“Hack”).
As per claim 6, Kimura does not explicitly disclose
wherein the vehicle state information includes information indicating a speed of the vehicle, information indicating a steering angle of the vehicle, and information indicating an inclination of a vehicle body of the vehicle.
Hack teaches
wherein the vehicle state information includes information indicating a speed of the vehicle, information indicating a steering angle of the vehicle, and information indicating an inclination of a vehicle body of the vehicle (see at least Hack, para. [0013]: a trajectory of the motorcycle is able to be determined from sensor values of the motorcycle. For instance, a steering angle of the motorcycle, a velocity of the motorcycle and an angle of inclination of the motorcycle are able to be acquired in order to determine a current curve radius of the motorcycle. It is also possible to extrapolate the trajectory up to a limited time horizon.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kimura to incorporate the teaching of wherein the vehicle state information includes information indicating a speed of the vehicle, information indicating a steering angle of the vehicle, and information indicating an inclination of a vehicle body of the vehicle, of Hack, with a reasonable expectation of success, in order to further reduce serious motorcycle accidents and hence lower the injury risk to motorcyclists (see at least Hack, para. [0068]).
As per claim 8, Kimura discloses
further comprising a display apparatus (7) connected to the computation apparatus, wherein the display apparatus performs display for notifying the driver based on the signal (see at least Kimura, para. [0057]: The warning unit 20, mounted on the vehicle 2, sends the information to the driver. The warning unit 20 includes a display projection unit 201 and a sound output unit 202. The display projection unit 201 is a device that projects a display on the display area included in the peripheral visual field of the driver. One example of the display projection unit 201 is a part of the configuration of the head-up display that projects a display of various information on the windshield.).
However, Kimura does not explicitly disclose
configured to display at least the actual route.
Hack teaches
further comprising a display apparatus connected to the computation apparatus and configured to display at least the actual route (see at least Hack, para. [0054]: In one exemplary embodiment, communications system 138 has a display for the display of trajectory information 136. Trajectory information 136 and further information is able to be graphically displayed on the display. The display can be fixedly integrated into the cockpit or also be embodied as a head-up display. In the same way, an external device, e.g., a navigation device situated in the field of view of rider 140, is able to be used as a display. The display may also be integrated into the helmet, however.),
wherein the display apparatus performs display for notifying the driver based on the signal (see at least Hack, para. [0085]: Depending on the used display technology, the display may be realized by warning lamps up to and including the display of the free space on a display, and/or by smart glasses (a transparent, holographic display in the visor of the helmet) using danger regions marked in color and/or trajectories and/or augmented reality (superimposed symbols and/or highlighting and/or markings, etc.).).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kimura to incorporate the teaching of a display apparatus configured to display at least the actual route, of Hack, with a reasonable expectation of success, in order to further reduce serious motorcycle accidents and hence lower the injury risk to motorcyclists (see at least Hack, para. [0068]).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MOHAMED ABDO ALGEHAIM whose telephone number is (571)272-3628. The examiner can normally be reached Monday-Friday 8-5PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Fadey Jabr can be reached at 571-272-1516. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MOHAMED ABDO ALGEHAIM/Primary Examiner, Art Unit 3668