Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
This is in response to applicant’s filing and preliminary amendment of September 27, 2024. In the preliminary amendment, claims 1-11 were cancelled and claims 12-28 were added. Claims 12-28 are currently pending.
Priority
Acknowledgment is made of applicant’s claim for foreign priority to Application DE102022109158.9, filed on April 13, 2022. The certified copy of the application as required by 37 CFR 1.55 has been received.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on September 27, 2024, is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Priority to prior-filed application
Applicant’s claim for the benefit of prior-filed application PCT/EP2023/055824, filed on March 8, 2023, under 35 U.S.C. 119(e) or under 35 U.S.C. 120, 121, 365(c), or 386(c) is acknowledged.
Claim Rejections - 35 U.S.C. § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 12-28 are rejected under 35 U.S.C. 103 as being unpatentable over Ng-Thow-Hing et al. (US-20160003636-A1) (“NTH”), provided by Applicant in the IDS filed on 9/27/2024, in view of Yamashita et al. (US-20200064629-A1) (“Yamashita”).
As per claim 12, NTH discloses a method for assisting a driver of a vehicle when driving on a predetermined route course in road traffic (Figure 6), the method comprising:
receiving prepared route data, which describe the predetermined route course located in front of the vehicle at least via curve parameters (NTH at Figure 6, steps 604 & 608 describing process for projection in front of vehicle (windshield 112) and adjusting based on road conditions, and Para. [0093] discloses the presentation of navigation data on the display on the windshield 112:” information related to the navigation function may be presented to the driver as a contact-analog augmented reality graphic element projected by the second projector 120 of the HUD device 102. The vehicle control system 180 may, upon receiving a navigation request from the driver (e.g., the input of a desired location), generate a navigation route for the driver to follow to travel to the desired location. The navigation route may include a set of driving directions for the driver to follow and may include instructions to turn onto streets on the route to the desired location.”); and
displaying the prepared route data in an animated display, wherein the animated display takes place within an augmented reality via a head-up display (NTH at Figure 4, graphic elements projected by a vehicular heads-up display system, and Paras. [0005] & [0064], and in Para. [0138] disclosing that the graphic element can be animated based on a displayed parameter:” controller component 104 may determine a type of graphic element to be displayed, projected, animated, rendered, etc. by the HUD component 100. As an example, when a vehicle is travelling along one or more portions of a route that includes relatively straight road segments, the controller component 104 may project a graphic element as an avatar. The avatar may appear or be projected as a vehicle or a guide vehicle. In a scenario where a vehicle is travelling along one or more portions of a route that includes one or more turns or other navigation maneuvers, the controller component 104 may command the HUD component 100 to project a graphic element to be a marker at a location associated with one or more of the turns. For example, if a route includes a right turn from a first street onto a second street, the controller component 104 may command the HUD component 100 to project a marker or identifier at, to, around, etc. the intersection of the first street and the second street.”).
While NTH discloses graphic elements that announce information about the roadway and vehicle parameters, such as speed, displayed above the roadway (see Figures 10A-11B and Paras. [0140]-[0142]), NTH does not expressly disclose that a curve located in front of the vehicle is announced by a docked display element and an announcing display element moving along the curve within the augmented reality.
Yamashita discloses a road information acquisition unit that acquires road information about a road in a traveling direction of the moving body and a display controller that displays a virtual image illustrating at least one graphic as seen from the moving body on a display medium. See Abstract and Figures 4 & 22.
In particular, Yamashita discloses a process wherein, in the animated display of the prepared route data, a curve of the predetermined route course, which is visible to the driver and is located in front of the vehicle (Yamashita at Figure 3B, HUD unit 190, and Para. [0073] discloses presenting a virtual image representation in the traveling direction of the vehicle: “display device 100 of the exemplary embodiment projects the virtual image onto windshield 201 using HUD unit 190 as illustrated in FIG. 3B, thereby presenting the virtual image superimposed on the real space to the driver.” Further, in Para. [0083] road information about the road and vehicle is displayed at HUD unit 190: “road information acquisition unit 110 may be divided into a functional block that acquires the information such as the road itself such as the slope shape and the curve shape of the road and a functional block that acquires the positional information such as the current position of vehicle 300 and the disposition position of the graphic such as the right-turn mark. In this case, a set of pieces of information acquired by the functional blocks can be dealt with as the road information.”), is announced by a docked display element and an announcing display element above a roadway (Yamashita at Para. [0084] discloses an announcing element 52 that changes based on certain conditions such as curvature or a change in grade of the road surface: “display device 100 of the exemplary embodiment, the size and the like of at least one graphic that becomes a base of the virtual image and is virtually disposed at the predetermined position are not constant, but are updated according to the road information. Consequently, for example, the size and the like of the graphic of at least one graphic illustrated in the virtual image can be adjusted according to a positional relationship between vehicle 300 and the predetermined position.”); and
Further, Yamashita discloses a process wherein the curve, which is visible to the driver and is located in front of the vehicle (Yamashita at Figure 21, curved road, and Para. [0152] discloses a curved road visible to the driver: “it is assumed that the plurality of graphics 53 are continuously disposed along the route of vehicle 300. That is, in the specific example 8, the plurality of graphics 53 are continuously and virtually disposed along road 90 curved in the middle.”), is announced by the announcing display element moving within the augmented reality along the curve (Yamashita at Para. [0084] discloses an announcing element 52 that changes based on certain conditions such as curvature or a change in grade of the road surface: “display device 100 of the exemplary embodiment, the size and the like of at least one graphic that becomes a base of the virtual image and is virtually disposed at the predetermined position are not constant, but are updated according to the road information. Consequently, for example, the size and the like of the graphic of at least one graphic illustrated in the virtual image can be adjusted according to a positional relationship between vehicle 300 and the predetermined position.”).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the navigation monitoring and control system as taught by NTH with the display control method as taught by Yamashita, with a reasonable expectation of success, in order to graphically announce road curvature to the driver when operating a vehicle. The teaching, suggestion, or motivation to combine is that, by incorporating graphics that are modified according to road information into a vehicle head-up display, safety can be improved since a driver can easily ascertain changes in the slope and shape of a road, as taught by Yamashita in Paras. [0076]-[0078].
As per claim 13, NTH and Yamashita disclose a method according to claim 12, wherein driving dynamics data are additionally received, which describe a vehicle speed and/or a vehicle lateral acceleration (Yamashita at Para. [0078], lateral acceleration acquisition, and Para. [0133] disclosing speed:” the moving speed in the case where graphic 52 moves along the route is a speed of an extent to which the driver easily recognizes the movement and the moving direction of graphic 52 when graphic 52 is moved on the road surface parallel to the traveling direction. Moving speed V1 of graphic 52 is decided from this viewpoint.”);
and in the animated display, the docked display element is displayed within the augmented reality inclined relative to the roadway at an angle of inclination, wherein the angle of inclination of the docked display element depends on the vehicle speed and/or the vehicle lateral acceleration and/or the curve parameters (Yamashita at Figures 16 & 17, positive and negative inclination, and Paras. [0133] & [0134], disclosing positioning the graphics on the road reflecting vehicle dynamics :” the moving speed in the case where graphic 52 moves along the route is a speed of an extent to which the driver easily recognizes the movement and the moving direction of graphic 52 when graphic 52 is moved on the road surface parallel to the traveling direction. Moving speed V1 of graphic 52 is decided from this viewpoint.”).
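By way of non-limiting illustration only (the sketch below is not drawn from NTH, Yamashita, or the claims; the function, parameter names, and numeric values are hypothetical), the claimed dependence of the inclination angle on vehicle speed and curve parameters can be modeled with the standard kinematic relation a_lat = v^2/r:

```python
import math

G = 9.81  # gravitational acceleration [m/s^2]

def docked_element_inclination(speed_mps: float, curve_radius_m: float,
                               max_angle_deg: float = 30.0) -> float:
    """Hypothetical illustration: incline the docked display element by the
    angle whose tangent is the expected lateral acceleration (v^2 / r) over
    gravity, clamped to a maximum tilt for readability of the display."""
    lateral_accel = speed_mps ** 2 / curve_radius_m         # a_lat = v^2 / r
    angle_deg = math.degrees(math.atan2(lateral_accel, G))  # tilt toward curve center
    return min(angle_deg, max_angle_deg)

# Example: approaching a 150 m radius curve at 25 m/s (about 90 km/h)
print(round(docked_element_inclination(25.0, 150.0), 1))    # -> 23.0 (degrees)
```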
As per claim 14, NTH and Yamashita disclose a method according to claim 12, wherein driving dynamics data are additionally received, which describe a vehicle speed and/or a vehicle lateral acceleration (Yamashita at Para. [0078], lateral acceleration acquisition, and Para. [0133] disclosing speed:” the moving speed in the case where graphic 52 moves along the route is a speed of an extent to which the driver easily recognizes the movement and the moving direction of graphic 52 when graphic 52 is moved on the road surface parallel to the traveling direction. Moving speed V1 of graphic 52 is decided from this viewpoint.”);
a predicted lateral acceleration profile is determined based on the vehicle speed and/or the vehicle lateral acceleration and/or the curve parameters, wherein the predicted lateral acceleration profile describes a profile of the lateral acceleration of the vehicle to be expected when driving on the curve located in front of the vehicle (Yamashita at Figure 16, downward slope road, and Para. [0138] discloses setting the graphics at higher velocity (prediction) based on the downward slope of the road:” display controller 140 increases the moving speed of at least one graphic 52 such that the moving speed is greater than V1 when at least one graphic 52 is virtually disposed on the downslope (when the relative gradient angle is negative). Consequently, the driver easily recognizes the movement and the moving direction of at least one graphic 52 that is visually recognized so as to move along the downslope.”) ; and
in the animated display, the announcing display element, which moves within the augmented reality along the curve, inclines during the movement depending on the predicted lateral acceleration profile (Yamashita at Figure 17, upward slope is animate with graphics 52 in an upward direction, and Para. [0131] discloses changes in the graphics 52 based on the characteristics of the road:” it is assumed that the plurality of graphics 52 are continuously and virtually disposed along the route of vehicle 300, and that the plurality of graphics 52 move. That is, animation display in which the plurality of graphics 52 move along the route from the horizontal portion of road 90 to the visually recognizable range of the upslope of road 90 in the traveling direction of vehicle 300 is performed in the specific example 6.”).
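Again for illustration only (a sketch under the same assumed relation a_lat = v^2/r; the helper and sample values below are hypothetical and are not taken from either reference), a predicted lateral acceleration profile for the curve located in front of the vehicle might be computed as follows:

```python
from typing import List, Tuple

def predicted_lateral_acceleration_profile(
        speed_mps: float,
        curve_samples: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Hypothetical illustration: for each (distance_ahead_m, radius_m) sample
    of the curve located in front of the vehicle, predict the lateral
    acceleration expected at the current speed using a_lat = v^2 / r."""
    return [(distance_m, round(speed_mps ** 2 / radius_m, 2))
            for distance_m, radius_m in curve_samples]

# Example: constant 20 m/s through a curve that tightens from 200 m to 100 m radius
profile = predicted_lateral_acceleration_profile(
    20.0, [(10.0, 200.0), (30.0, 150.0), (50.0, 100.0)])
print(profile)  # -> [(10.0, 2.0), (30.0, 2.67), (50.0, 4.0)]
```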
As per claim 15, NTH and Yamashita disclose a method according to claim 13, wherein driving dynamics data are additionally received, which describe a vehicle speed and/or a vehicle lateral acceleration;
a predicted lateral acceleration profile is determined based on the vehicle speed and/or the vehicle lateral acceleration and/or the curve parameters, wherein the predicted lateral acceleration profile describes a profile of the lateral acceleration of the vehicle to be expected when driving on the curve located in front of the vehicle (Yamashita at Para. [0135] discloses setting the speed to a predicted amount based on the geometry of the road:” display device 100 of the exemplary embodiment, the moving speed of graphic 52 is updated to V2 (V2<V1) when the relative gradient angle of the road surface at the disposition position of moving graphic 52 is greater than zero as illustrated in part (a) of FIG. 18. Specifically, road information acquisition unit 110 acquires the road information indicating the relative gradient angle with respect to the disposition position of at least one graphic 52. Display controller 140 superimposes the virtual image on the real space to display the virtual image on windshield 201 while moving at least one graphic the extending direction of the road at the disposition position, and decreases the moving speed of at least one graphic 52 with increasing relative gradient angle when upward of vehicle 300 is set to positive, the relative gradient angle being indicated by the road information.”); and
in the animated display, the announcing display element, which moves within the augmented reality along the curve, inclines during the movement depending on the predicted lateral acceleration profile (Yamashita at Figure 18, incline road, Para. [0137] discloses that announcing display element 52 is animated to indicate an incline position:” graphic 52 in the extending direction of the road at the disposition position, and increases the moving speed of at least one graphic 52 with decreasing relative gradient angle when upward of vehicle 300 is set to positive, the relative gradient angle being indicated by the road information.”) .
As per claim 16, NTH and Yamashita disclose a method according to claim 12, wherein in the animated display of the prepared route data, the driver is made aware of the curve, which is visible to the driver and is located in front of the vehicle, by a pointing display element, which is located at a curve position assigned to the curve within the augmented reality of the head-up display (Yamashita at Para. [0156] discloses a pointing element at a curve position, such as the bend of the curved road, to notify the driver of the upcoming curve position: “Display controller 140 decreases the spacing between two graphics 53 with increasing turning angle indicated by the road information. For example, as illustrated in part (a) of FIG. 22, display controller 140 sets the spacing for at least one graphic 53 (that is, two graphics 53 disposed in front of the direction change position) having no turning angle to P1, and updates the spacing between two graphics 53 (that is, two graphics 53 disposed ahead of the direction change position) in which the turning angle is φ2 (0°<φ2≤90°) to P4 (P4<P1).”).
As per claim 17, NTH and Yamashita disclose a method according to claim 13, wherein in the animated display of the prepared route data, the driver is made aware of the curve, which is visible to the driver and is located in front of the vehicle, by a pointing display element, which is located at a curve position assigned to the curve within the augmented reality of the head-up display (Yamashita at Figure 22 and Para. [0157] discloses a pointing element at a curve position, such as the bend of the curved road, to notify the driver of the upcoming curve position: “the spacing on the display between two graphics 53 virtually disposed ahead of the direction change position becomes relatively small, and resultantly the continuity of the plurality of graphics 53 virtually disposed along the route including the curve is maintained. That is, the route can adequately be guided for the driver.”).

As per claim 18, NTH and Yamashita disclose a method according to claim 14, wherein in the animated display of the prepared route data, the driver is made aware of the curve, which is visible to the driver and is located in front of the vehicle, by a pointing display element, which is located at a curve position assigned to the curve within the augmented reality of the head-up display (Yamashita at Figure 3B and Para. [0156] discloses a pointing element at a curve position, such as the bend of the curved road, to notify the driver of the upcoming curve position: “Display controller 140 decreases the spacing between two graphics 53 with increasing turning angle indicated by the road information. For example, as illustrated in part (a) of FIG. 22, display controller 140 sets the spacing for at least one graphic 53 (that is, two graphics 53 disposed in front of the direction change position) having no turning angle to P1, and updates the spacing between two graphics 53 (that is, two graphics 53 disposed ahead of the direction change position) in which the turning angle is φ2 (0°<φ2≤90°) to P4 (P4<P1).”).
As per claim 19, NTH and Yamashita disclose a method according to claim 16, wherein the pointing display element is displayed at the curve position assigned to the curve within the augmented reality of the head-up display using chevrons (NTH at Figure 4, graphic elements (avatars) 160-172, where the element can be a “V” shape like a chevron or arrow tip, and Para. [0082] disclosing that the graphic element is rendered as a 3-D shape: “second and fourth graphic elements 164, 172 may be rendered with an actual 3-dimensional (3-D) volumetric shape, instead of as line segments, to add monocular cues to strengthen depth perception.”), wherein a chevron density assigned to the chevrons describes a curve radius of the curve of the predetermined route course located in front of the vehicle (NTH at Para. [0143] discloses that the graphic element or avatar changes color and/or angle based on vehicle or road conditions: “controller component 104 may command the HUD component 100 to project an avatar to speed up, slow down, stop, change lanes, activate a turn signal prior to changing lanes, flash, blink, change an orientation or angle of an avatar, change a color of an avatar, etc. Further, the controller component 104 may adjust target positions for one or more of the graphic elements based on road conditions, a current position of the vehicle, a current velocity of the vehicle, or other attributes, characteristics, or measurements.”).
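For illustration only (the mapping below is a hypothetical sketch and is not asserted to be disclosed by NTH or Yamashita; all names and numeric thresholds are assumed), a chevron density that describes the curve radius can be modeled as a chevron spacing that shrinks as the radius decreases:

```python
def chevron_spacing_m(curve_radius_m: float,
                      min_spacing_m: float = 10.0,
                      max_spacing_m: float = 50.0) -> float:
    """Hypothetical illustration: tighter curves (smaller radius) get a higher
    chevron density, i.e. smaller spacing between successive chevrons, so the
    density of the displayed chevrons conveys the curve radius to the driver."""
    # Linearly interpolate the spacing between a 50 m hairpin and a 500 m sweep.
    radius_m = max(50.0, min(curve_radius_m, 500.0))
    fraction = (radius_m - 50.0) / (500.0 - 50.0)
    return min_spacing_m + fraction * (max_spacing_m - min_spacing_m)

print(round(chevron_spacing_m(75.0), 1))   # -> 12.2 (tight curve, dense chevrons)
print(round(chevron_spacing_m(400.0), 1))  # -> 41.1 (gentle curve, sparse chevrons)
```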
As per claim 20, NTH and Yamashita disclose a method according to claim 16, wherein a recommended action is determined based on the driving dynamics data and/or the prepared route data (NTH at Figure 10B, route 1032 to a destination, and Figure 15, navigation instructions, and Para. [0189] disclosing a recommended action to instruct or guide a driver to a destination: “HUD component 100 may project navigation instructions in a text box 1130 and one or more avatars 1002A, 1002B, 1002C, 1002D, 1002E, etc. Additionally, the HUD component 100 may emphasize a route or path provided by respective avatars 1002A, 1002B, 1002C, 1002D, or 1002E by sequentially flashing the avatars, for example. The HUD component 100 may project one or more of the avatars 1002A, 1002B, 1002C, 1002D, or 1002E such that the avatars appear to navigate, drive, or move around objects detected by the sensor component 570.”); and
the pointing display element, which is located at a curve position assigned to the curve within the augmented reality of the head-up display, is colored depending on the recommended action (NTH at Para. [0173] discloses that the avatars can be assigned different colors to represent a navigation action:” the controller component 104 may utilize a color scheme or instruct the HUD component 100 project one or more of the graphic elements or avatars utilizing different colors to represent one or more navigation actions.”).
As per claim 21, NTH and Yamashita disclose a method according to claim 12, wherein the prepared route data comprise a trajectory prediction, which comprises at least one roadway edge and/or one roadway marking of the predetermined route course (NTH at Para. [0246] disclosing trajectory and marking in furtherance to a navigational route:” HUD (or other device) may output perceived lane boundaries and center steering trajectory. These visual guides may be used to indicate the current level of steering control … boundaries of a lane 2602 may be determined by a lane marking manager 2210 that may determine lane boundaries and a center steering trajectory of a lane of traffic.”); and
the roadway edge and/or the roadway marking are displayed within the augmented reality of the head-up display (NTH at Para. [0250] discloses displaying curb edge and the opposite boundary for traffic moving in an opposite direction.), wherein the displayed roadway edge and/or the displayed roadway marking extend beyond an actual field of view of the driver upon concealment by a surroundings object (NTH at Figures 8A & 8B showing road edge ahead of vehicle and Para. [0231] discloses positioning an avatar in a forward position not visible by the driver:” the fifth avatar 2414 and the sixth avatar 2416 may be positioned to visually appear to be located over a turn (or exit lane) that is slightly out of view of the vehicle occupants.” See Para. [0295] for avatars representing other objects in the environment).
As per claim 22, NTH and Yamashita disclose a method according to claim 13, wherein the prepared route data comprise a trajectory prediction, which comprises at least one roadway edge (NTH at Para. [0250] discloses displaying curb edge and the opposite boundary for traffic moving in an opposite direction.) and/or one roadway marking of the predetermined route course (NTH at Para. [0246] disclosing trajectory and marking in furtherance to a navigational route:” HUD (or other device) may output perceived lane boundaries and center steering trajectory. These visual guides may be used to indicate the current level of steering control … boundaries of a lane 2602 may be determined by a lane marking manager 2210 that may determine lane boundaries and a center steering trajectory of a lane of traffic.”); and
the roadway edge and/or the roadway marking are displayed within the augmented reality of the head-up display, wherein the displayed roadway edge and/or the displayed roadway marking extend beyond an actual field of view of the driver upon concealment by a surroundings object (NTH at Figures 8A & 8B showing road edge ahead of vehicle and Para. [0231] discloses positioning an avatar in a forward position not visible by the driver:” the fifth avatar 2414 and the sixth avatar 2416 may be positioned to visually appear to be located over a turn (or exit lane) that is slightly out of view of the vehicle occupants.” In Para. [0295] creating a presentation of other objects in the focal plane:” sensor component may track one or more objects in an environment surrounding a vehicle and one or more corresponding coordinates for respective objects relative to the vehicle. The HUD component may project, render, present, or display one or more graphic elements on one or more focal planes corresponding to one or more of the objects”.).
As per claim 23, NTH and Yamashita disclose a method according to claim 14, wherein the prepared route data comprise a trajectory prediction, which comprises at least one roadway edge (NTH at Para. [0250] discloses displaying curb edge and the opposite boundary for traffic moving in an opposite direction.) and/or one roadway marking of the predetermined route course (NTH at Para. [0246] disclosing trajectory and marking in furtherance to a navigational route:” HUD (or other device) may output perceived lane boundaries and center steering trajectory. These visual guides may be used to indicate the current level of steering control … boundaries of a lane 2602 may be determined by a lane marking manager 2210 that may determine lane boundaries and a center steering trajectory of a lane of traffic.”); and
the roadway edge and/or the roadway marking are displayed within the augmented reality of the head-up display, wherein the displayed roadway edge and/or the displayed roadway marking extend beyond an actual field of view of the driver upon concealment by a surroundings object (NTH at Figures 8A & 8B showing the road edge ahead of the vehicle, and Para. [0231] discloses positioning an avatar in a forward position not visible by the driver: “the fifth avatar 2414 and the sixth avatar 2416 may be positioned to visually appear to be located over a turn (or exit lane) that is slightly out of view of the vehicle occupants.” In Para. [0295], creating a presentation of other objects in the focal plane: “sensor component may track one or more objects in an environment surrounding a vehicle and one or more corresponding coordinates for respective objects relative to the vehicle. The HUD component may project, render, present, or display one or more graphic elements on one or more focal planes corresponding to one or more of the objects”.).

As per claim 24, NTH and Yamashita disclose a method according to claim 20, wherein the displayed roadway edge and/or the displayed roadway marking are colored depending on the recommended action, wherein a curve following the curve located in front of the vehicle is taken into consideration when determining the recommended action (see the rejection of claim 25 below, which addresses the same limitations).
As per claim 25, NTH and Yamashita disclose a method according to claim 21, wherein the displayed roadway edge and/or the displayed roadway marking are colored depending on the recommended action (NTH at Figure 10B, route 1032 to a destination, and Figure 15, navigation instructions, and Para. [0189] disclosing a recommended action to instruct or guide a driver to a destination: “HUD component 100 may project navigation instructions in a text box 1130 and one or more avatars 1002A, 1002B, 1002C, 1002D, 1002E, etc. Additionally, the HUD component 100 may emphasize a route or path provided by respective avatars 1002A, 1002B, 1002C, 1002D, or 1002E by sequentially flashing the avatars, for example. The HUD component 100 may project one or more of the avatars 1002A, 1002B, 1002C, 1002D, or 1002E such that the avatars appear to navigate, drive, or move around objects detected by the sensor component 570.”), wherein a curve following the curve located in front of the vehicle is taken into consideration when determining the recommended action (NTH at Para. [0150] discloses altering the graphics based on the navigation instruction: “navigation instructions that may be projected by the HUD component 100 may include following a guide vehicle and speeding up (e.g., changing a dynamic focal plane to have an increased distance from the focal plane to the vehicle, thereby adjusting a near-far perception a driver or occupant may have of the graphic element). Other examples may include slowing down (e.g., adjusting the distance between a focal plane and the vehicle to be reduced), changing lanes (e.g., adjusting a target position for a graphic element), navigating around obstructions, turning, arrival, marking a location, and so forth.”).
As per claim 26, NTH and Yamashita disclose a computing device for a vehicle, which is configured to carry out a method according to claim 12 (NTH at Figure 5, system 500.).
As per claim 27, NTH and Yamashita disclose a non-transitory computer-readable medium storing commands which, upon execution by a computing device, cause the computing device to carry out a method according to claim 12 (NTH at Figure 31, computer-readable medium 3108.).
As per claim 28, NTH discloses an assistance system for a vehicle, the assistance system comprising:
a computing device for the vehicle (NTH at Figure 5, system 500.);
a non-transitory computer-readable medium storing commands which, upon execution by the computing device, cause the computing device to carry out a method according to claim 12 (NTH at Figure 31, computer-readable medium 3108. Further see above rejection of claim 12 where NTH and Yamashita disclose a method according to claim 12);
a display device, which is configured to display prepared route data within an augmented reality of a head-up display in an animated manner (NTH at Figure 4, graphic elements projected by a vehicular heads-up display system, and Paras. [0005] & [0064], and in Para. [0138] disclosing that the graphic element can be animated based on a displayed parameter:” controller component 104 may determine a type of graphic element to be displayed, projected, animated, rendered, etc. by the HUD component 100. As an example, when a vehicle is travelling along one or more portions of a route that includes relatively straight road segments, the controller component 104 may project a graphic element as an avatar. The avatar may appear or be projected as a vehicle or a guide vehicle. In a scenario where a vehicle is travelling along one or more portions of a route that includes one or more turns or other navigation maneuvers, the controller component 104 may command the HUD component 100 to project a graphic element to be a marker at a location associated with one or more of the turns. For example, if a route includes a right turn from a first street onto a second street, the controller component 104 may command the HUD component 100 to project a marker or identifier at, to, around, etc. the intersection of the first street and the second street.”).
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the navigation monitoring and control system as taught by NTH with the display control method as taught by Yamashita, with a reasonable expectation of success, in order to graphically announce road curvature to the driver when operating a vehicle. The teaching, suggestion, or motivation to combine is that, by incorporating graphics that are modified according to road information into a vehicle head-up display, safety can be improved since a driver can easily ascertain changes in the slope and shape of a road, as taught by Yamashita in Paras. [0076]-[0078].
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
SAKUMA; Yasushi et al. (US-20220084458-A1) DISPLAY CONTROL DEVICE AND NON-TRANSITORY TANGIBLE COMPUTER READABLE STORAGE MEDIUM;
Seder; Thomas A. et al. (US-20100253541-A1) TRAFFIC INFRASTRUCTURE INDICATOR ON HEAD-UP DISPLAY;
Weiss; John P. et al. (US-20230314157-A1) PARKING ASSIST IN AUGMENTED REALITY HEAD-UP DISPLAY SYSTEM;
Han; Ga Young (US-20200184812-A1) ROAD SPEED LIMIT IDENTIFICATION METHOD, ROAD SPEED LIMIT IDENTIFICATION APPARATUS, ELECTRONIC APPARATUS, COMPUTER PROGRAM, AND COMPUTER READABLE RECORDING MEDIUM;
Mueller; Mario (US-9791288-B2) Method and system for displaying navigation instructions;
YANG YIWEN et al. (DE-102009027026-A1) Generation and display of virtual road marking for driver following curve, combines data from navigational and optical interfaces for representation in head-up display;
WYSZKA ROBERT JAN et al. (DE-102020200047-A1) Method and device for displaying virtual navigation elements;
MATSUBARA KATSUMASA et al. (JP-2006284458-A) SYSTEM FOR DISPLAYING DRIVE SUPPORT INFORMATION; and,
Lee et al. (KR-20120031656-A) CURVE LANE ACTIVE WARNING SYSTEM AND METHOD, in which a plurality of chevron indicators display a chevron shape so as to maintain a fixed distance within a curved road section, at a rear side separated from the speed limit indicators.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ELLIS B. RAMIREZ whose telephone number is (571)272-8920. The examiner can normally be reached from 7:30 am to 5:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ramon Mercado can be reached at 571-270-5744. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ELLIS B. RAMIREZ/Examiner, Art Unit 3658