DETAILED ACTION
This non-final action is in response to the application filed 24 November 2024.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Claims 1-13 are pending, having a filing date of 24 November 2024.
Drawings
The drawings, filed 24 November 2024, are accepted by the examiner.
Claim Objections
Claim 12 is objected to because of the following informalities. Claim 12 recites “moves into to said optimal driving lane,” which appears to be a typographical error. The recitation should perhaps be “moves into said optimal driving lane.” Appropriate correction is required.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or non-obviousness.
Claims 1-5, 9, 10 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Publication Number 2023/0399004 to Lee et al. (hereafter Lee) in view of U.S. Patent Publication Number 2024/0416949 to Barrera.
As per claim 1, Lee discloses [a] system for determining the optimal lane for vehicle travel under adverse road conditions (see at least Lee, Abstract) comprising:
at least one forward-facing sensor fixedly engaged with a vehicle and electrically coupled with an onboard computer (see at least Lee, Fig. 1 showing camera 310a, radar 320; [0118] disclosing that camera 310 may be located on an appropriate portion outside the vehicle to acquire an external image of the vehicle. The camera 310 may be a mono camera, a stereo camera 310a, an around view monitoring (AVM) camera 310b or a 360-degree camera; [0123] The radar 320 may include electric wave transmitting and receiving portions. The radar 320 may be implemented as a pulse radar or a continuous wave radar according to a principle of emitting electric waves.; [0137] The processor 370 may detect an object based on an acquired image, and track the object. The processor 370 may execute operations, such as a calculation of a distance from the object, a calculation of a relative speed with the object and the like, through an image processing algorithm); and
at least one sensor for monitoring vehicle speed and direction, electronically coupled to the onboard computer (see at least Lee, [0212] disclosing that the sensing unit 120 may sense a status of the vehicle. The sensing unit 120 may include a posture sensor (e.g., a yaw sensor, a roll sensor, a pitch sensor, etc.), a collision sensor, a wheel sensor, a speed sensor, a tilt sensor, a weight-detecting sensor, a heading sensor, a gyro sensor, a position module, a vehicle forward/backward movement sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor by a turn of a handle, a vehicle internal temperature sensor, a vehicle internal humidity sensor, an ultrasonic sensor, an illumination sensor, an accelerator position sensor, a brake pedal position sensor; [0213] disclosing that the sensing unit 120 may acquire sensing signals with respect to vehicle-related information, such as a pose, a collision, an orientation, a position (GPS information), an angle, a speed, an acceleration, a tilt, a forward/backward movement, a battery, a fuel, tires, lamps, internal temperature, internal humidity, a rotated angle of a steering wheel, external illumination, pressure applied to an accelerator, pressure applied to a brake pedal); and
a user interface (see at least Lee, [0233] disclosing that the AR display device 800 may render an AR graphic interface, which represents the current driving state of the vehicle based on map data (e.g., a map relating to a current location of the vehicle, route information, POI information, etc.) of the vehicle, sensing data, and a front image obtained by a camera, and provide the rendered AR graphic interface to an AR GUI surface and an AR camera surface of the navigation application in real time);
wherein the at least one forward-facing sensor (see at least Lee, [0125] disclosing that radar 320 may be disposed on an appropriate position outside the vehicle for detecting an object which is located at a front, rear or side of the vehicle; [0128] Disclosing that for the drive type, the LiDAR 330 may be rotated by a motor and detect object near the vehicle 100; [0131] disclosing that the LiDAR 330 may be disposed on an appropriate position outside the vehicle for detecting an object located at the front, rear or side of the vehicle) is configured to
scan a road in proximity of the vehicle (see at least Lee, [0125]; [0131])
to detect lane boundaries (see at least Lee, [0390] disclosing that the processor 820 may calculate a location of the vehicle in the lane, a lateral offset to the deviated lane boundary, speed/acceleration in the lateral direction, a wheel angle of the vehicle, and the like, based on the sensing data of the vehicle, the map data, and the ADAS sensing data, and thus calculate the pitch, roll, and yaw value) ... (1) ... , and
to generate high-resolution 3D images of road surfaces (see at least Lee, [0234] disclosing that AR display device 800 may render an AR object separated from the AR graphic interface to provide (indicate, display) a guide for a driving situation of the vehicle, based on the map data (e.g., the route information, the POI information, etc.), the sensing data, and the front image obtained by the camera, and provide the rendered AR object to the AR GUI surface and the AR camera surface of the navigation application in real time ); and
the at least one sensor for monitoring vehicle speed and direction (see at least Lee, [0213] disclosing that sensing unit 120 may acquire sensing signals with respect to vehicle-related information, such as a pose, a collision, an orientation, a position (GPS information), an angle, a speed, an acceleration, a tilt, a forward/backward movement, a battery, a fuel, tires, lamps, internal temperature, internal humidity, a rotated angle of a steering wheel, external illumination, pressure applied to an accelerator, pressure applied to a brake pedal and the like) is configured to
monitor the vehicle’s movement and orientation (see at least Lee, [0213]);
and the onboard computer gathers data from each sensor and performs calculations, which are sent to and interpreted by the user interface (see at least Lee, [0337] disclosing that continuously, the processor 820 determines whether a possibility of collision is predicted between the vehicle and the another vehicle (or the moving object such as the two-wheeled vehicle, the electric kickboard, etc.) approaching from the left rear side (1404), and when the possibility of collision is predicted, the processor 820 indicates that the lane change is not allowed through the separated second AR object (1405); [0388] disclosing that according to the determination (1404), when the possibility of collision with the approaching another vehicle from the left rear side is not predicted, the processor 820 displays a lane change guide trajectory through the separated second AR object (1406); [0339] disclosing that at this time, the lane change guide trajectory displayed by the separated second AR object is updated according to the driving state (e.g., location, driving direction, driving speed) of the vehicle (1407)),
which is configured to provide real-time feedback to a vehicle driver (see at least Lee, [0337] disclosing that continuously <interpreted as in real-time>, the processor 820 determines whether a possibility of collision is predicted between the vehicle and the another vehicle (or the moving object such as the two-wheeled vehicle, the electric kickboard, etc.) approaching from the left rear side (1404), and when the possibility of collision is predicted, the processor 820 indicates that the lane change is not allowed through the separated second AR object (1405); [0038]-[0039] <interpreted as the real-time feedback>; [0417]-[0418]; [0419] disclosing that after the vehicle changes to the left lane, the processor 820 may display second required driving speed information 1730b (e.g., ‘80 km/h’) which is proposed on a crossing point of a second route 1710f, based on the location and driving speed of the target vehicle, which is calculated based on the ADAS sensing data, and an existence of a moving object in front in a return lane). But Lee does not explicitly teach the following limitation taught in Barrera:
(1) to detect ... road conditions (see at least Barrera, [0284] disclosing that the algorithms of the computer vision system may conclude that the road is not inclined or very less inclined. In an embodiment, the autonomous vehicle may also determine a downhill portion of the road by determining that the road is disappearing abruptly and then appearing abruptly <interpreted as a road condition> by observing the lane or road markings) ... .
Lee and Barrera are analogous art to claim 1 because they are in the same field of using sensors to monitor road conditions and determine the best lane for vehicle travel. Lee relates to an AR display device capable of providing a guide associated with a lane change of a vehicle through an AR technology while the vehicle travels, and a method for operating the same (see at least Lee, [0002]). Barrera relates to a system operable to determine a road surface condition (see at least Barrera, Abstract).
Therefore, it would have been prima facie obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system, as disclosed in Lee, to provide the benefit of detecting road conditions, as disclosed in Barrera, with a reasonable expectation of success. Doing so would provide the benefit of providing alerts and taking action to minimize or avoid collisions while driving on icy uphill roads (see at least Barrera, [0003]).
As per claim 2, the combination of Lee and Barrera discloses all of the limitations of claim 1, as shown above. Lee further discloses the following limitation:
the at least one forward facing sensor is a LiDAR sensor (see at least Lee, [0128]; [0131]).
As per claim 3, the combination of Lee and Barrera discloses all of the limitations of claim 1, as shown above. Lee further discloses the following limitation:
the sensor for monitoring vehicle speed and direction is a wheel-speed sensor (see at least Lee, [0212]).
As per claim 4, the combination of Lee and Barrera discloses all of the limitations of claim 3, as shown above. Lee further discloses the following limitation:
the wheel-speed sensor is coupled with the at least one yaw sensor (see at least Lee, [0212] disclosing that sensing unit 120 may sense a status of the vehicle. The sensing unit 120 may include a posture sensor (e.g., a yaw sensor, a roll sensor, a pitch sensor, etc.), a collision sensor, a wheel sensor, a speed sensor, a tilt sensor, a weight-detecting sensor, a heading sensor ... ; [0213]; [0303] disclosing that the processor 820 runs a preset application (e.g., the navigation application 930) according to an execution of an AR driving mode, and renders an AR graphic interface to overlap a front image of the vehicle based on map data (e.g., navigation/map data). Here, the AR graphic interface is implemented in a combined form of a first AR object indicating a driving state of the vehicle and a second AR object displaying a guide for a driving situation of the vehicle; [0351] disclosing that the processor 820 runs a preset application (e.g., the navigation application 930) according to an execution of an AR driving mode, and renders an AR graphic interface to overlap a front image of the vehicle based on map data (e.g., navigation/map data). Here, the AR graphic interface is implemented in a combined form of a first AR object indicating a driving state of the vehicle and a second AR object displaying a guide for a driving situation of the vehicle ).
As per claim 5, the combination of Lee and Barrera discloses all of the limitations of claim 1, as shown above. Lee further discloses the following limitation:
at least one acceleration sensor configured to detect changes in vehicle acceleration due to road conditions, electronically coupled to the onboard computer (see at least Lee, [0212]; [0213]).
As per claim 9, the combination of Lee and Barrera discloses all of the limitations of claim 1, as shown above. Lee further discloses the following limitations:
at least one proximity sensor configured to detect neighboring vehicles (see at least Lee, Fig. 6, showing another vehicle OB11; [0105] disclosing that the object detecting apparatus 300 is an apparatus for detecting an object located at outside of the vehicle 100. The object may be a variety of objects associated with driving (operation) of the vehicle 100. Referring to FIGS. 5 and 6, an object O may include a traffic lane OB10, another vehicle OB11, a pedestrian, a two-wheeled vehicle OB13, traffic signals OB14 and OB15, light, a road, a structure, a speed hump, a terrain, an animal and the like) and
generate a signal in advance of providing real-time feedback to the vehicle driver (see at least Lee, [0335]; [0336] disclosing that when the another vehicle (or the moving object such as the two-wheeled vehicle, the electric kickboard, etc.) approaching from the left rear side is detected, a portion of the separated second AR object is moved to a location of the another vehicle approaching from the left rear side and displayed thereon (1403). For example, the portion of the second AR object is changed in color (or changed in shape/highlighted) and displayed on the left rear side based on the location of the first AR object; [0337] disclosing that continuously, the processor 820 determines whether a possibility of collision is predicted between the vehicle and the another vehicle (or the moving object such as the two-wheeled vehicle, the electric kickboard, etc.) approaching from the left rear side (1404), and when the possibility of collision is predicted, the processor 820 indicates that the lane change is not allowed through the separated second AR object (1405); [0338] disclosing that according to the determination (1404), when the possibility of collision with the approaching another vehicle from the left rear side is not predicted, the processor 820 displays a lane change guide trajectory through the separated second AR object (1406)).
As per claim 10, the combination of Lee and Barrera discloses all of the limitations of claim 1, as shown above. Lee further discloses the following limitation:
the system operates in automatic mode to control the vehicle’s steering to navigate to an optimal lane (see at least Lee, [0421] disclosing that processor 820 of the AR display device 800 according to the present disclosure may detect a front object, with which the vehicle is likely to collide, as a notification event related to a driving lane, based on sensing data (e.g., CAN data (steering wheel angle, driving speed (Speed), yaw rate (Yaw rate)), GPS location/direction information, map data (e.g., navigation/map data (lane geometry)), and ADAS sensing data (e.g., camera, radar, lidar); [0422] disclosing that the processor 820 may determine an avoidance driving lane based on the current location of the vehicle and the ADAS sensing data, in response to the detection of the front object to collide with the vehicle, and render the separate second AR object to display guide trajectories for the vehicle to travel according to the determination; [0423] disclosing that when an object to collide with the vehicle is detected in front of the vehicle during driving through an ADAS system, the vehicle 100 according to the present disclosure activates an Automatic Emergency Braking (hereinafter, ‘AEB’) function to perform emergency braking or activates an Automatic emergency steering (hereinafter, ‘AES’) function to perform automatic avoidance driving.; [0425] disclosing that The AES function is a function that detects a potential collision of the vehicle and automatically controls a steering control device to prevent or reduce severity of the collision. Some systems may detect pedestrians or other objects).
As per claim 12, the combination of Lee and Barrera discloses all of the limitations of claim 1, as shown above. Lee further discloses the following limitations:
gathering information from said sensors (similar to the citation in claim 1, see at least Lee, [0118] camera 310a; [0123] radar 320; [0137] processor 370; [0212]); and
analyzing sensor data (see at least Lee, [0313]; [0324] disclosing that with regard to FIG. 13C, the processor 820 confirms, based on the ADAS sensing data, a moving object (e.g., a vehicle, a two-wheeled vehicle, a bicycle, an electric kickboard, etc.) approaching the vehicle from the rear of a lane to be changed (e.g., a right lane) 10R ); and ... ;
generating 3D images of road surfaces from analyzed sensor data (see claim 1, see at least Lee, [0234]); and
determining an optimal driving lane (see at least Lee, [0337] disclosing that, continuously, the processor 820 determines whether a possibility of collision is predicted between the vehicle and the another vehicle (or the moving object such as the two-wheeled vehicle, the electric kickboard, etc.) approaching from the left rear side (1404), and when the possibility of collision is predicted, the processor 820 indicates that the lane change is not allowed through the separated second AR object (1405)); and
providing optimal-driving-lane prompts on a vehicle user interface (see at least Lee, [0338] disclosing that according to the determination (1404), when the possibility of collision with the approaching another vehicle from the left rear side is not predicted, the processor 820 displays a lane change guide trajectory through the separated second AR object (1406); [0339]);
wherein a driver moves into said optimal driving lane, and the method repeats (see at least Lee, [0340] disclosing that when the vehicle follows the lane change guide trajectory at high speed, the color of the first AR object is changed based on the driving speed of the vehicle. Also, when the vehicle completes the lane change by following the lane change guide trajectory, the first and second objects are updated to be displayed in the combined form instead of displaying the approaching another vehicle and the guide trajectory ; [0410] disclosing that when the lane change along the first route is completed, that is, when the vehicle moves to the left lane along the first route and is located alongside the target vehicle, the separated second AR object is displayed in the combined form (1710b) with the first AR object. ). Barrera further discloses the following limitation:
detecting road conditions (see claim 1, see at least Barrera, [0284]) ... .
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Lee and Barrera as applied to claim 1 above, and further in view of U.S. Patent Publication Number 2024/0140459 to Ropel et al. (hereafter Ropel).
As per claim 6, the combination of Lee and Barrera discloses all of the limitations of claim 1, as shown above. But, neither Lee nor Barrera explicitly teaches the following limitation taught in Ropel:
at least one microphone configured to detect road noise (see at least Ropel, [0051] disclosing that the one or more microphones can record one or more noises that occur on the road near the first vehicle; [0267]; [0273]).
Lee, Barrera and Ropel are analogous art to claim 6 because they are in the same field of using sensors to monitor road conditions and determine the best lane for vehicle travel. Lee relates to an AR display device capable of providing a guide associated with a lane change of a vehicle through an AR technology while the vehicle travels, and a method for operating the same (see at least Lee, [0002]). Barrera relates to a system operable to determine a road surface condition (see at least Barrera, Abstract). Ropel relates to peer-to-peer vehicular provision of artificially intelligent traffic analysis (see at least Ropel, [0001]).
Therefore, it would have been prima facie obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system, as disclosed in Lee, as modified by Barrera, to provide the benefit of having at least one microphone configured to detect road noise, as disclosed in Ropel, with a reasonable expectation of success. Doing so would provide the benefit of providing the user with an assessment of the external noise state.
Claims 7 and 8 are rejected under 35 U.S.C. 103 as being unpatentable over Lee and Barrera as applied to claim 1 above, and further in view of U.S. Patent Publication Number 2023/0331231 to Mujumdar (hereafter Mujumdar).
As per claim 7, the combination of Lee and Barrera discloses all of the limitations of claim 1, as shown above. Barrera further discloses the following limitation:
detect lane boundaries (see at least Barrera, [0284] disclosing that the algorithms of the computer vision system may conclude that the road is not inclined or very less inclined. In an embodiment, the autonomous vehicle may also determine a downhill portion of the road by determining that the road is disappearing abruptly and then appearing abruptly <interpreted as a road condition> by observing the lane or road markings) ... . But, neither Lee nor Barrera explicitly teaches the following limitations taught in Mujumdar:
evaluate road conditions (see at least Mujumdar, [0031] disclosing that the trajectory module 108 also obtains attributes 304 that may affect the lane change trajectory 110. The attributes 304 may be of an environment (e.g., the example environment 100) and/or vehicle/driver settings. As shown, the attributes 304 include aspects about the roadway 106 and preferences 306; [0033] disclosing that the aspects of the roadway 106 include conditions 308 and the curvature 114. The conditions 308 include dynamic aspects that may affect a safety or viability of certain lane change trajectories (e.g., weather, road surface conditions, road surface, gradient(s), quality of lane markings) <interpreted as hazards>. The curvature 114 may indicate a radius of the roadway 106 at a location of the lane change and/or whether the roadway 106 curves to the left or right of a heading of the host vehicle 102; [0057] disclosing that determining whether road conditions of the roadway are appropriate for an asymmetric lane change trajectory; and responsive to determining that the road conditions are not appropriate for an asymmetric lane change trajectory, calculating the lane change trajectory as a symmetrical lane change trajectory), ... and
identify potential hazards (see at least Mujumdar, [0031]; [0033] ).
Lee, Barrera and Mujumdar are analogous art to claim 7 because they are in the same field of using sensors to monitor road conditions and determine the best lane for vehicle travel. Lee relates to an AR display device capable of providing a guide associated with a lane change of a vehicle through an AR technology while the vehicle travels, and a method for operating the same (see at least Lee, [0002]). Barrera relates to a system operable to determine a road surface condition (see at least Barrera, Abstract). Mujumdar relates to techniques and systems for dynamically calculating lane change trajectories (see at least Mujumdar, Abstract).
Therefore, it would have been prima facie obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system, as disclosed in Lee, as modified by Barrera, to provide the benefit of evaluating road conditions and identifying potential hazards, as disclosed in Mujumdar, with a reasonable expectation of success. Doing so would provide the benefit of providing trajectories that lead to lane changes that are expected by the drivers and safe for the conditions (see at least Mujumdar, [0001]).
As per claim 8, the examiner notes that the limitation of this claim and the claim dependency are identical to those of claim 7, and thus the exact same rejection rationale applies.
Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Lee and Barrera as applied to claim 1 above, and further in view of U.S. Patent Publication Number 2022/0223036 to Chikamori et al. (hereafter Chikamori).
As per claim 11, the combination of Lee and Barrera discloses all of the limitations of claim 1, as shown above. But, Lee does not explicitly teach the following limitations taught in Chikamori:
the system operates in user-driven mode to provide real-time feedback and lane recommendations to the driver (see at least Chikamori, [0061] disclosing that in a case where the recommended lane 41 is switched as described above in the manual driving mode, the navigation device 11 causes the touch panel 23 and the sound generating device 24, at or before the switching point 42, to notify the driver that the recommended lane 41 is switched; [0075] disclosing that the storage unit 34 may store the preference of the driver for a timing of the lane change to the recommended lane 41, and the recommended lane determining unit 32 may adjust the switching point 42 of the recommended lane 41 according to the preference thereof. Preferably, such an adjustment of the switching point 42 according to the preference of the driver is executed during the manual driving mode).
Lee, Barrera and Chikamori are analogous art to claim 11 because they are in the same field of using sensors to monitor road conditions and determine the best lane for vehicle travel. Lee relates to an AR display device capable of providing a guide associated with a lane change of a vehicle through an AR technology while the vehicle travels, and a method for operating the same (see at least Lee, [0002]). Barrera relates to a system operable to determine a road surface condition (see at least Barrera, Abstract). Chikamori relates to a vehicle system for determining a recommended lane for an own vehicle in a case where the own vehicle travels on a road having a plurality of lanes on one lateral side thereof (see at least Chikamori, [0001]).
Therefore, it would have been prima facie obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system, as disclosed in Lee, as modified by Barrera, to provide the benefit of having the system operate in user-driven mode to provide real-time feedback and lane recommendations to the driver, as disclosed in Chikamori, with a reasonable expectation of success. Doing so would provide the benefit of improving driving comfort (see at least Chikamori, [0003]).
Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Lee and Barrera as applied to claim 1 above, and further in view of U.S. Patent Publication Number 2024/0416908 to Taniguchi et al. (hereafter Taniguchi).
As per claim 13, the combination of Lee and Barrera discloses all of the limitations of claim 1, as shown above. Lee further discloses the following limitations:
gathering information from said sensors (similar to claim 12, see at least Lee, [0118] camera 310a; [0123] radar 320; [0137] processor 370; [0212]); and
analyzing sensor data (similar to claim 12, see at least Lee, [0313]; [0324]); and
... ; and
determining an optimal driving lane (similar to claim 12, see at least Lee, [0337]; [0338]); ... .
Barrera further discloses the following limitation:
detect road conditions (as cited in claim 1, see at least Barrera, [0284]) ... . But, neither Lee nor Barrera explicitly teaches the following limitations taught in Taniguchi:
engaging onboard control system (see at least Taniguchi, [0206] disclosing that during the execution of the lane keeping mode of the autonomous steering control/hands-off mode, when the condition (9) of Fig. 7 is satisfied, the mode transitions to a lane change mode of the autonomous steering control/hands-on mode; [0209] disclosing that during the execution of the lane change mode of the autonomous steering control/hands-on mode, when the condition (10) of FIG. 7 is satisfied, the mode transitions to the lane keeping mode of the autonomous steering control/hands-on mode.); and
actuating a steering-adjustment mechanism to steer the vehicle into the optimal lane, and the method repeats (see at least Taniguchi, [0238] disclosing that, with respect to Fig. 8, in step S14, a determination is made as to whether or not the direction indicator lever has been operated by the driver. When the direction indicator lever has been operated, the condition (9) for transitioning to the lane changing mode of the autonomous steering control/hands-on mode is satisfied, and the process proceeds to step S15. In step S15, the lane change assist control is executed. When the lane change assist control in step S15 is concluded, the process returns to step S3. When the direction indicator lever has not been operated by the driver in step S14, the process proceeds to step S16).
Lee, Barrera and Taniguchi are analogous art to claim 13 because they are in the same field of using sensors to monitor road conditions and determine the best lane for vehicle travel. Lee relates to an AR display device capable of providing a guide associated with a lane change of a vehicle through an AR technology while the vehicle travels, and a method for operating the same (see at least Lee, [0002]). Barrera relates to a system operable to determine a road surface condition (see at least Barrera, Abstract). Taniguchi relates to a vehicle travel assistance method and a travel assistance device for a vehicle that include determining whether or not a route-following lane change, that is, a lane change for traveling along a set travel route, can be assisted with autonomous lane change control (see Taniguchi, Abstract).
Therefore, it would have been prima facie obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system, as disclosed in Lee, as modified by Barrera, to provide the benefit of engaging an onboard control system and actuating a steering-adjustment mechanism to steer the vehicle into the optimal lane, as disclosed in Taniguchi, with a reasonable expectation of success. Doing so would provide the benefit of matching the executable control that the driver recognizes with the control that can be actually executed (see at least Taniguchi, [0006]).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
U.S. Patent Publication Number 2024/0375674 to Kume et al. (hereafter Kume), see Claim 24, disclosing that the automatic driving control device according to claim 23, further comprising: a road condition grasping unit that grasps a condition of a road on which the subject vehicle travels, wherein the lane change control unit sets at least either the first point or the second point according to a condition of the road; [0064]-[0065]; [0156]; and
U.S. Patent Publication Number 2021/0197825 to Choi, see Abstract, A method of controlling a vehicle includes obtaining, by a camera, an image of a road ahead; recognizing, by a controller, a curvature of a front lane from the obtained road image; obtaining, by the controller, position information and speed information of another vehicle based on obstacle information detected by an obstacle detector; periodically storing, by a storage, driving speed information, yaw rate information, and steering angle information while driving; recognizing, by the controller, a curvature of a rear lane based on the periodically stored driving speed information, yaw rate information, and steering angle information in response to determining that a lane change is necessary; determining, by the controller, a lane change possibility based on the recognized curvature of the front lane, the recognized curvature of the rear lane, and the position information and speed information of the other vehicle; and controlling, by the controller, at least one of steering, deceleration, and acceleration based on the determined lane change possibility.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PATRICK M. BRADY III whose telephone number is (571)272-7458. The examiner can normally be reached Monday - Friday 8:00 am - 5:30 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Helal Algahaim can be reached at (571) 270-5227. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
PATRICK M. BRADY III
Examiner
Art Unit 3666
/PATRICK M BRADY/Examiner, Art Unit 3666
/HELAL A ALGAHAIM/SPE, Art Unit 3666