Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1 and 4 are rejected under 35 U.S.C. 103 as being unpatentable over Kume et al. (US 20230373309 A1).
Regarding claim 1, Kume teaches an in-vehicle display system mounted on a host vehicle and configured to display a situation around the host vehicle on an in-vehicle display (Abstract; Paragraphs 19, 26).
A display control device for controlling display of a peripheral image indicating a traveling environment of a host vehicle based on information acquired by a sensor recognizes the traveling environment. The display control device includes an information acquisition unit and a display control unit. The information acquisition unit acquires at least type information indicating a type of a road and shape information indicating a lane shape of the road as road information related to the road around the host vehicle. The display control unit draws line images reproducing the road in the peripheral image based on the shape information. The display control unit changes a number of the line images drawn in the peripheral image based on the type information.[Abstract]
A vehicle display device in a comparative example displays a host vehicle icon and line icons on a display of a host vehicle. The vehicle display further displays, for example, an arrow icon indicating that a lane change to an adjacent lane is possible, another vehicle icon indicating another vehicle in the adjacent lane, and the like based on a traveling environment around the host vehicle recognized by a camera, a radar, and the like.[P-19]
According to one aspect of the present disclosure, display control techniques for controlling display of a peripheral image indicating a traveling environment of a host vehicle based on information acquired by a sensor recognizes the traveling environment. The display control techniques acquire shape information indicating a lane shape of the road around the host vehicle and recognition information of other vehicles traveling around the host vehicle. The display control techniques draw line images reproducing the road in the peripheral image based on the shape information. The display control techniques change a number of the line images drawn in the peripheral image according to a number of the other vehicles traveling around the host vehicle based on the recognition information.[P-26]
Kume further teaches a lane line recognition unit configured to recognize a lane line of a traveling lane of the host vehicle based on a detection result of a camera of the host vehicle (Paragraphs 19, 52, 53, 70, 71).
A vehicle display device in a comparative example displays a host vehicle icon and line icons on a display of a host vehicle. The vehicle display further displays, for example, an arrow icon indicating that a lane change to an adjacent lane is possible, another vehicle icon indicating another vehicle in the adjacent lane, and the like based on a traveling environment around the host vehicle recognized by a camera, a radar, and the like.[P-19]
The driving control block 60 is a control block that realizes a driving assistance function and an automated driving function. The driving control block 60 realizes the driving assistance function, for example, an adaptive cruise control (i.e., ACC), a lane tracking assist (i.e., LTA), and a lane change assist (i.e., LCA). In addition, the driving control block 60 performs driving control corresponding to ACC, LTC, LCA, and the like in a complex manner by the operation of the automated driving function. The driving control block 60 includes an environment recognition unit 61, a behavior determination unit 62, and an operation execution unit 63 as functional units based on the automated driving program.[P-52]
The environment recognition unit 61 recognizes a traveling environment of the host vehicle A based on the locator information and the map data acquired from the locator 35 and the detection information acquired from the peripheral monitoring sensor 30. More specifically, the environment recognition unit 61 grasps a position of a driver's vehicle lane on which the host vehicle travels among lanes, a lane shape and a travel direction of each lane, and a relative position, relative speed, and others of another vehicle and a pedestrian around the driver's vehicle. The environment recognition unit 61 provides these recognition results and information (hereinafter referred to as sensor reliability information) indicating reliability of detection by the peripheral monitoring sensor 30 to the HMI control block 70.[P-53]
The road type information includes at least information indicating whether the road on which the vehicle is traveling is a general road or an expressway (or a parking lot) and information indicating a traveling direction of each lane of the road on which the vehicle is traveling. The expressway mentioned here includes a motorway. The road shape information is information indicating a shape of each lane of the road on which the vehicle is traveling, in other words, information indicating the shape of a lane marking or a road edge that partitions each lane. The road shape information may be information based on the recognition result of the lane marking or the road edge provided from the environment recognition unit 61 in addition to the map data provided from the locator 35.[P-63]
Each line icon PIn is drawn using the road shape information, and indicates a plurality of lane markings provided on the road on which the host vehicle is traveling. The driver's vehicle lane VLs in which the host vehicle A travels is reproduced by the line icons PIn drawn on the left and right of the host vehicle icon Pv. The line icons PIn displayed at the left and right ends in the peripheral image Dpm are road edge icons Pre indicating the left and right road edges, respectively. The line icons PIn reproduces each lane of the traveling road in the peripheral image Dpm.[P-70]
The line icons PIn include a recognition line Li1 and an unrecognized line Li2. The recognition line Li1 is a line icon PIn corresponding to a lane marking (or a road edge) recorded in the map data and recognized by the peripheral monitoring sensor 30. The unrecognized line Li2 is a line icon PIn corresponding to a lane marking that is recorded in the map data but is not recognized by the peripheral monitoring sensor 30. The line icon PIn arranged in the vicinity of the host vehicle icon Pv is the recognition line Li1. On the other hand, the line icon PIn drawn at a position away from the host vehicle icon Pv is the unrecognized line Li2.[P-71]
Furthermore, Kume teaches a road edge structure recognition unit configured to recognize a road edge structure positioned outside the traveling line based on a detection result of a sensor of the host vehicle; and a display control unit configured to control a display of the in-vehicle display based on recognition results of the lane line recognition unit and the road edge structure recognition unit (Paragraphs 63, 103, 118)
The road type information includes at least information indicating whether the road on which the vehicle is traveling is a general road or an expressway (or a parking lot) and information indicating a traveling direction of each lane of the road on which the vehicle is traveling. The expressway mentioned here includes a motorway. The road shape information is information indicating a shape of each lane of the road on which the vehicle is traveling, in other words, information indicating the shape of a lane marking or a road edge that partitions each lane. The road shape information may be information based on the recognition result of the lane marking or the road edge provided from the environment recognition unit 61 in addition to the map data provided from the locator 35.[P-63]
The 3D drawing unit 73 hides the line icon PIn or the road edge icons Pre away from the driver's vehicle lane VLs, and then displays the pedestrian icon Pa (see FIG. 9) when the warning target information in which the warning target such as a pedestrian is detected is acquired. The 3D drawing unit 73 increases the number of line icons PIn drawn in the peripheral image Dpm in accordance with a start of display of the pedestrian icon Pa. The 3D drawing unit 73 redisplays the road edge icons Pre or the line icon PIn in the vicinity of the pedestrian icon Pa.[P-103]
Further, in the present embodiment, when the road on which the vehicle is traveling is a general road, the road edge icons Pre as the line icon PIn indicating the road edge is drawn in the peripheral image Dpm. The road edge icons Pre are not drawn when the vehicle travels on an expressway. According to the above, on a general road where a pedestrian or the like is present on a side of the road, the driver can be notified of the details of the surroundings of the host vehicle by clearly indicating the road edge. On the other hand, in an expressway where there is no pedestrian or the like, the peripheral image Dpm can be simplified.[P-118]
Kume fails to explicitly teach that the display control unit is configured to display a road edge structure icon corresponding to the road edge structure on the in-vehicle display when the lane line is not recognized and the road edge structure is recognized.
However, Kume teaches scenarios of the display unit displaying the road lane line icon or the road edge icon. Kume also teaches scenarios in which a specific road lane line is unrecognizable (Paragraphs 71, 103).
The line icons PIn include a recognition line Li1 and an unrecognized line Li2. The recognition line Li1 is a line icon PIn corresponding to a lane marking (or a road edge) recorded in the map data and recognized by the peripheral monitoring sensor 30. The unrecognized line Li2 is a line icon PIn corresponding to a lane marking that is recorded in the map data but is not recognized by the peripheral monitoring sensor 30. The line icon PIn arranged in the vicinity of the host vehicle icon Pv is the recognition line Li1. On the other hand, the line icon PIn drawn at a position away from the host vehicle icon Pv is the unrecognized line Li2.[P-71]
The 3D drawing unit 73 hides the line icon PIn or the road edge icons Pre away from the driver's vehicle lane VLs, and then displays the pedestrian icon Pa (see FIG. 9) when the warning target information in which the warning target such as a pedestrian is detected is acquired. The 3D drawing unit 73 increases the number of line icons PIn drawn in the peripheral image Dpm in accordance with a start of display of the pedestrian icon Pa. The 3D drawing unit 73 redisplays the road edge icons Pre or the line icon PIn in the vicinity of the pedestrian icon Pa.[P-103]
Given Kume's teaching above of the display presenting either a road edge icon or a lane line icon pertaining to the host vehicle's traveling environment, as well as situations where the lane line is unrecognized, it would have been obvious to one of ordinary skill in the art to selectively configure how and when these features are displayed within the vehicle, such as displaying a road edge structure icon corresponding to the road edge structure on the in-vehicle display when the lane line is not recognized and the road edge structure is recognized, and displaying a lane line icon corresponding to the lane line while not displaying the road edge structure icon on the in-vehicle display when the lane line is recognized, even if the road edge structure is recognized.
It therefore would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to selectively customize and configure how the display icons of the vehicular environment are presented to the driver, in order to better discern environmental hazards on the road and promote vehicular safety.
Regarding claim 4, Kume teaches a display method of an in-vehicle display system mounted on a host vehicle and configured to display a situation around the host vehicle on an in-vehicle display (Abstract; Paragraphs 19, 26).
A display control device for controlling display of a peripheral image indicating a traveling environment of a host vehicle based on information acquired by a sensor recognizes the traveling environment. The display control device includes an information acquisition unit and a display control unit. The information acquisition unit acquires at least type information indicating a type of a road and shape information indicating a lane shape of the road as road information related to the road around the host vehicle. The display control unit draws line images reproducing the road in the peripheral image based on the shape information. The display control unit changes a number of the line images drawn in the peripheral image based on the type information.[Abstract]
A vehicle display device in a comparative example displays a host vehicle icon and line icons on a display of a host vehicle. The vehicle display further displays, for example, an arrow icon indicating that a lane change to an adjacent lane is possible, another vehicle icon indicating another vehicle in the adjacent lane, and the like based on a traveling environment around the host vehicle recognized by a camera, a radar, and the like.[P-19]
According to one aspect of the present disclosure, display control techniques for controlling display of a peripheral image indicating a traveling environment of a host vehicle based on information acquired by a sensor recognizes the traveling environment. The display control techniques acquire shape information indicating a lane shape of the road around the host vehicle and recognition information of other vehicles traveling around the host vehicle. The display control techniques draw line images reproducing the road in the peripheral image based on the shape information. The display control techniques change a number of the line images drawn in the peripheral image according to a number of the other vehicles traveling around the host vehicle based on the recognition information.[P-26]
Kume then teaches a lane line recognition step of recognizing a lane line of a traveling lane of the host vehicle based on a detection result of a camera of the host vehicle, and a road edge structure recognition step of recognizing a road edge structure positioned outside the traveling line based on a detection result of a sensor of the host vehicle (Paragraphs 19, 52, 53, 63, 70, 71).
A vehicle display device in a comparative example displays a host vehicle icon and line icons on a display of a host vehicle. The vehicle display further displays, for example, an arrow icon indicating that a lane change to an adjacent lane is possible, another vehicle icon indicating another vehicle in the adjacent lane, and the like based on a traveling environment around the host vehicle recognized by a camera, a radar, and the like.[P-19]
The driving control block 60 is a control block that realizes a driving assistance function and an automated driving function. The driving control block 60 realizes the driving assistance function, for example, an adaptive cruise control (i.e., ACC), a lane tracking assist (i.e., LTA), and a lane change assist (i.e., LCA). In addition, the driving control block 60 performs driving control corresponding to ACC, LTC, LCA, and the like in a complex manner by the operation of the automated driving function. The driving control block 60 includes an environment recognition unit 61, a behavior determination unit 62, and an operation execution unit 63 as functional units based on the automated driving program.[P-52]
The environment recognition unit 61 recognizes a traveling environment of the host vehicle A based on the locator information and the map data acquired from the locator 35 and the detection information acquired from the peripheral monitoring sensor 30. More specifically, the environment recognition unit 61 grasps a position of a driver's vehicle lane on which the host vehicle travels among lanes, a lane shape and a travel direction of each lane, and a relative position, relative speed, and others of another vehicle and a pedestrian around the driver's vehicle. The environment recognition unit 61 provides these recognition results and information (hereinafter referred to as sensor reliability information) indicating reliability of detection by the peripheral monitoring sensor 30 to the HMI control block 70.[P-53]
The road type information includes at least information indicating whether the road on which the vehicle is traveling is a general road or an expressway (or a parking lot) and information indicating a traveling direction of each lane of the road on which the vehicle is traveling. The expressway mentioned here includes a motorway. The road shape information is information indicating a shape of each lane of the road on which the vehicle is traveling, in other words, information indicating the shape of a lane marking or a road edge that partitions each lane. The road shape information may be information based on the recognition result of the lane marking or the road edge provided from the environment recognition unit 61 in addition to the map data provided from the locator 35.[P-63]
Each line icon PIn is drawn using the road shape information, and indicates a plurality of lane markings provided on the road on which the host vehicle is traveling. The driver's vehicle lane VLs in which the host vehicle A travels is reproduced by the line icons PIn drawn on the left and right of the host vehicle icon Pv. The line icons PIn displayed at the left and right ends in the peripheral image Dpm are road edge icons Pre indicating the left and right road edges, respectively. The line icons PIn reproduces each lane of the traveling road in the peripheral image Dpm.[P-70]
The line icons PIn include a recognition line Li1 and an unrecognized line Li2. The recognition line Li1 is a line icon PIn corresponding to a lane marking (or a road edge) recorded in the map data and recognized by the peripheral monitoring sensor 30. The unrecognized line Li2 is a line icon PIn corresponding to a lane marking that is recorded in the map data but is not recognized by the peripheral monitoring sensor 30. The line icon PIn arranged in the vicinity of the host vehicle icon Pv is the recognition line Li1. On the other hand, the line icon PIn drawn at a position away from the host vehicle icon Pv is the unrecognized line Li2.[P-71]
Kume also teaches a display control step of controlling a display of the in-vehicle display based on recognition results of the lane line recognition step and the road edge structure recognition step (Paragraphs 63, 103, 118).
The road type information includes at least information indicating whether the road on which the vehicle is traveling is a general road or an expressway (or a parking lot) and information indicating a traveling direction of each lane of the road on which the vehicle is traveling. The expressway mentioned here includes a motorway. The road shape information is information indicating a shape of each lane of the road on which the vehicle is traveling, in other words, information indicating the shape of a lane marking or a road edge that partitions each lane. The road shape information may be information based on the recognition result of the lane marking or the road edge provided from the environment recognition unit 61 in addition to the map data provided from the locator 35.[P-63]
The 3D drawing unit 73 hides the line icon PIn or the road edge icons Pre away from the driver's vehicle lane VLs, and then displays the pedestrian icon Pa (see FIG. 9) when the warning target information in which the warning target such as a pedestrian is detected is acquired. The 3D drawing unit 73 increases the number of line icons PIn drawn in the peripheral image Dpm in accordance with a start of display of the pedestrian icon Pa. The 3D drawing unit 73 redisplays the road edge icons Pre or the line icon PIn in the vicinity of the pedestrian icon Pa.[P-103]
Further, in the present embodiment, when the road on which the vehicle is traveling is a general road, the road edge icons Pre as the line icon PIn indicating the road edge is drawn in the peripheral image Dpm. The road edge icons Pre are not drawn when the vehicle travels on an expressway. According to the above, on a general road where a pedestrian or the like is present on a side of the road, the driver can be notified of the details of the surroundings of the host vehicle by clearly indicating the road edge. On the other hand, in an expressway where there is no pedestrian or the like, the peripheral image Dpm can be simplified.[P-118]
Kume fails to explicitly teach that the display control unit is configured to display a road edge structure icon corresponding to the road edge structure on the in-vehicle display when the lane line is not recognized and the road edge structure is recognized.
However, Kume teaches scenarios of the display unit displaying the road lane line icon or the road edge icon. Kume also teaches scenarios in which a specific road lane line is unrecognizable (Paragraphs 71, 103).
The line icons PIn include a recognition line Li1 and an unrecognized line Li2. The recognition line Li1 is a line icon PIn corresponding to a lane marking (or a road edge) recorded in the map data and recognized by the peripheral monitoring sensor 30. The unrecognized line Li2 is a line icon PIn corresponding to a lane marking that is recorded in the map data but is not recognized by the peripheral monitoring sensor 30. The line icon PIn arranged in the vicinity of the host vehicle icon Pv is the recognition line Li1. On the other hand, the line icon PIn drawn at a position away from the host vehicle icon Pv is the unrecognized line Li2.[P-71]
The 3D drawing unit 73 hides the line icon PIn or the road edge icons Pre away from the driver's vehicle lane VLs, and then displays the pedestrian icon Pa (see FIG. 9) when the warning target information in which the warning target such as a pedestrian is detected is acquired. The 3D drawing unit 73 increases the number of line icons PIn drawn in the peripheral image Dpm in accordance with a start of display of the pedestrian icon Pa. The 3D drawing unit 73 redisplays the road edge icons Pre or the line icon PIn in the vicinity of the pedestrian icon Pa.[P-103]
Given Kume's teaching above of the display presenting either a road edge icon or a lane line icon pertaining to the host vehicle's traveling environment, as well as situations where the lane line is unrecognized, it would have been obvious to one of ordinary skill in the art to selectively configure how and when these features are displayed within the vehicle, such as displaying a road edge structure icon corresponding to the road edge structure on the in-vehicle display when the lane line is not recognized and the road edge structure is recognized, and displaying a lane line icon corresponding to the lane line while not displaying the road edge structure icon on the in-vehicle display when the lane line is recognized, even if the road edge structure is recognized.
It therefore would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to selectively customize and configure how the display icons of the vehicular environment are presented to the driver, in order to better discern environmental hazards on the road and promote vehicular safety.
Claims 2, 3, and 5-7 are rejected under 35 U.S.C. 103 as being unpatentable over Kume et al. (US 20230373309 A1) in view of Ermilios et al. (CN 109074653 A).
Regarding claim 2, Kume fails to teach a pavement edge recognition unit configured to recognize a pavement edge of a road positioned on either the left or right side of the host vehicle based on a detection result of the sensor of the host vehicle, wherein the display control unit is configured to display a pavement edge icon corresponding to the pavement edge on the in-vehicle display when the lane line is not recognized and the pavement edge is recognized, and to display the lane line icon and not to display the pavement edge icon on the in-vehicle display when the lane line is recognized even when the pavement edge is recognized.
Ermilios, on the other hand, teaches a pavement edge recognition unit configured to recognize a pavement edge of a road positioned on either the left or right side of the host vehicle based on a detection result of the sensor of the host vehicle (Page 6, Paragraph 2).
In particular, external calibration of the camera, which is called the motion tracking calibration (MTC) operation, is performed based on image captured by at least one camera, the image at least partially illustrates the pavement or the ground in the vicinity of the motor vehicle. and a texture (such as asphalt gravel ground) present on the surface of, but not displayed object. using the road surface texture of those image display, the computing device further adapted to calibrate a camera, especially does not need any existing on the image of the particular characteristic of interest, such as corners, profile, edge or line.[Pg 6, P-2]
Here, Ermilios shows a pavement recognition device, such as a camera, and a pavement illustration displayed within a vehicle.
Ermilios' teaching may be combined with Kume's teaching of scenarios of the display unit displaying the road lane line icon or the road edge icon, displaying one icon when another is not recognized, and further selectively displaying a specific icon and hiding another for a specific scenario (Paragraphs 71, 103).
The line icons PIn include a recognition line Li1 and an unrecognized line Li2. The recognition line Li1 is a line icon PIn corresponding to a lane marking (or a road edge) recorded in the map data and recognized by the peripheral monitoring sensor 30. The unrecognized line Li2 is a line icon PIn corresponding to a lane marking that is recorded in the map data but is not recognized by the peripheral monitoring sensor 30. The line icon PIn arranged in the vicinity of the host vehicle icon Pv is the recognition line Li1. On the other hand, the line icon PIn drawn at a position away from the host vehicle icon Pv is the unrecognized line Li2.[P-71]
The 3D drawing unit 73 hides the line icon PIn or the road edge icons Pre away from the driver's vehicle lane VLs, and then displays the pedestrian icon Pa (see FIG. 9) when the warning target information in which the warning target such as a pedestrian is detected is acquired. The 3D drawing unit 73 increases the number of line icons PIn drawn in the peripheral image Dpm in accordance with a start of display of the pedestrian icon Pa. The 3D drawing unit 73 redisplays the road edge icons Pre or the line icon PIn in the vicinity of the pedestrian icon Pa.[P-103]
Then one of ordinary skill in the art may substitute the pavement edge icon illustrated in Ermilios for a given edge icon, thereby enabling the display control unit to display a pavement edge icon corresponding to the pavement edge on the in-vehicle display when the lane line is not recognized and the pavement edge is recognized, and to display the lane line icon and not display the pavement edge icon on the in-vehicle display when the lane line is recognized even when the pavement edge is recognized.
Hence, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Ermilios' teaching with Kume's teaching in order to enable a more accurate visual representation of the vehicle's surroundings along its path and thereby improve vehicular safety.
Regarding claim 3, Kume as modified teaches scenarios of the display unit displaying the road lane line icon or the road edge icon, displaying one icon when another is not recognized, and further selectively displaying a specific icon and hiding another for a specific scenario (Kume, Paragraphs 71, 103).
The line icons PIn include a recognition line Li1 and an unrecognized line Li2. The recognition line Li1 is a line icon PIn corresponding to a lane marking (or a road edge) recorded in the map data and recognized by the peripheral monitoring sensor 30. The unrecognized line Li2 is a line icon PIn corresponding to a lane marking that is recorded in the map data but is not recognized by the peripheral monitoring sensor 30. The line icon PIn arranged in the vicinity of the host vehicle icon Pv is the recognition line Li1. On the other hand, the line icon PIn drawn at a position away from the host vehicle icon Pv is the unrecognized line Li2.[P-71]
The 3D drawing unit 73 hides the line icon PIn or the road edge icons Pre away from the driver's vehicle lane VLs, and then displays the pedestrian icon Pa (see FIG. 9) when the warning target information in which the warning target such as a pedestrian is detected is acquired. The 3D drawing unit 73 increases the number of line icons PIn drawn in the peripheral image Dpm in accordance with a start of display of the pedestrian icon Pa. The 3D drawing unit 73 redisplays the road edge icons Pre or the line icon PIn in the vicinity of the pedestrian icon Pa.[P-103]
Furthermore, Ermilios teaches a pavement edge recognition unit configured to recognize a pavement edge of a road positioned on either the left or right side of the host vehicle based on a detection result of the sensor of the host vehicle (Page 6, Paragraph 2).
In particular, external calibration of the camera, which is called the motion tracking calibration (MTC) operation, is performed based on image captured by at least one camera, the image at least partially illustrates the pavement or the ground in the vicinity of the motor vehicle. and a texture (such as asphalt gravel ground) present on the surface of, but not displayed object. using the road surface texture of those image display, the computing device further adapted to calibrate a camera, especially does not need any existing on the image of the particular characteristic of interest, such as corners, profile, edge or line.[Pg 6, P-2]
When combined with Kume's teaching, it would have been obvious to one of ordinary skill in the art to selectively configure the display control unit to display the pavement edge icon and not to display the road edge structure icon on the in-vehicle display when the lane line is not recognized and both the road edge structure and the pavement edge are recognized.
Regarding claim 5, Kume teaches an in-vehicle display system mounted on a host vehicle and configured to display a situation around the host vehicle on an in-vehicle display (Abstract; Paragraphs 19, 45; Figure 2).
A display control device for controlling display of a peripheral image indicating a traveling environment of a host vehicle based on information acquired by a sensor recognizes the traveling environment. The display control device includes an information acquisition unit and a display control unit. The information acquisition unit acquires at least type information indicating a type of a road and shape information indicating a lane shape of the road as road information related to the road around the host vehicle. The display control unit draws line images reproducing the road in the peripheral image based on the shape information. The display control unit changes a number of the line images drawn in the peripheral image based on the type information.[Abstract]
A vehicle display device in a comparative example displays a host vehicle icon and line icons on a display of a host vehicle. The vehicle display further displays, for example, an arrow icon indicating that a lane change to an adjacent lane is possible, another vehicle icon indicating another vehicle in the adjacent lane, and the like based on a traveling environment around the host vehicle recognized by a camera, a radar, and the like.[P-19]
Kume teaches a road edge structure recognition unit configured to recognize a road edge structure positioned on either the left or right side of the host vehicle based on a detection result of a sensor of the host vehicle; and a display control unit configured to control a display of the in-vehicle display based on a recognition result of the road edge structure recognition unit (Paragraphs 63, 71, 118)
The road type information includes at least information indicating whether the road on which the vehicle is traveling is a general road or an expressway (or a parking lot) and information indicating a traveling direction of each lane of the road on which the vehicle is traveling. The expressway mentioned here includes a motorway. The road shape information is information indicating a shape of each lane of the road on which the vehicle is traveling, in other words, information indicating the shape of a lane marking or a road edge that partitions each lane. The road shape information may be information based on the recognition result of the lane marking or the road edge provided from the environment recognition unit 61 in addition to the map data provided from the locator 35.[P-63]
Each line icon PIn is drawn using the road shape information, and indicates a plurality of lane markings provided on the road on which the host vehicle is traveling. The driver's vehicle lane VLs in which the host vehicle A travels is reproduced by the line icons PIn drawn on the left and right of the host vehicle icon Pv. The line icons PIn displayed at the left and right ends in the peripheral image Dpm are road edge icons Pre indicating the left and right road edges, respectively. The line icons PIn reproduces each lane of the traveling road in the peripheral image Dpm.[P-71]
Further, in the present embodiment, when the road on which the vehicle is traveling is a general road, the road edge icons Pre as the line icon PIn indicating the road edge is drawn in the peripheral image Dpm. The road edge icons Pre are not drawn when the vehicle travels on an expressway. According to the above, on a general road where a pedestrian or the like is present on a side of the road, the driver can be notified of the details of the surroundings of the host vehicle by clearly indicating the road edge. On the other hand, in an expressway where there is no pedestrian or the like, the peripheral image Dpm can be simplified.[P-118]
Kume teaches the display control unit is configured to display a road edge structure icon corresponding to a road edge structure closest to the host vehicle among the recognized multiple types of road edge structures on the in-vehicle display (Paragraphs 63, 70, 93, 118)
The road type information includes at least information indicating whether the road on which the vehicle is traveling is a general road or an expressway (or a parking lot) and information indicating a traveling direction of each lane of the road on which the vehicle is traveling. The expressway mentioned here includes a motorway. The road shape information is information indicating a shape of each lane of the road on which the vehicle is traveling, in other words, information indicating the shape of a lane marking or a road edge that partitions each lane. The road shape information may be information based on the recognition result of the lane marking or the road edge provided from the environment recognition unit 61 in addition to the map data provided from the locator 35.[P-63]
Each line icon PIn is drawn using the road shape information, and indicates a plurality of lane markings provided on the road on which the host vehicle is traveling. The driver's vehicle lane VLs in which the host vehicle A travels is reproduced by the line icons PIn drawn on the left and right of the host vehicle icon Pv. The line icons PIn displayed at the left and right ends in the peripheral image Dpm are road edge icons Pre indicating the left and right road edges, respectively. The line icons PIn reproduces each lane of the traveling road in the peripheral image Dpm.[P-70]
As described above, in a traveling scene shown in FIG. 6, the 3D drawing unit 73 draws the line icons PIn corresponding to all the lane markings and road edges recognized based on the detection information of the peripheral monitoring sensor 30, the map data, and the like in the peripheral image Dpm. That is, a display of the line icon PIn is not omitted during the eyes-off automated driving. As a result, all the lanes are reproduced in the peripheral image Dpm by the line icons PIn including the road edge icons Pre.[P-93]
Further, in the present embodiment, when the road on which the vehicle is traveling is a general road, the road edge icons Pre as the line icon PIn indicating the road edge is drawn in the peripheral image Dpm. The road edge icons Pre are not drawn when the vehicle travels on an expressway. According to the above, on a general road where a pedestrian or the like is present on a side of the road, the driver can be notified of the details of the surroundings of the host vehicle by clearly indicating the road edge. On the other hand, in an expressway where there is no pedestrian or the like, the peripheral image Dpm can be simplified.[P-118]
Here, we see the peripheral monitoring sensor detecting the road environment, including the road edges, and, based on the detected road edges, the display presenting those road edges in 3D form on the display map with respect to the host vehicle’s immediate surrounding environment.
Kume fails to specifically teach that the road edge structure types include at least two of a guardrail, a wall, and a curb.
Ermilios, on the other hand, teaches that the road edge structure types include at least two of a guardrail, a wall, and a curb (Page 3, Paragraph 4; Page 6, Paragraph 2; Page 8, Paragraph 5)
In order to provide reliable and effective operation of the calibration method, an object of the method of the invention is to detect the presence of elongated environmental features or objects above or below the average ground level beside the road, such as a wall, curb, ditch, vegetation, or a standing vehicle, because they have a negative influence on the calibration method. Here, it is not necessary to classify or characterize these objects; only their presence is detected in a pair of image frames obtained from at least one camera mounted on the vehicle while the vehicle travels substantially straight and parallel to the object, such as a curb. Thus, an autonomous, road-based calibration method can be provided, based on the images captured by the camera, that is not degraded or biased by the presence of such objects. In particular, an image in which such elongated features are identified is marked as an unsuitable frame and is rejected. The method can be performed by a vehicle-side computing device that identifies the object based on at least two images captured by at least one camera.[Pg3, P-4]
In particular, external calibration of the camera, referred to as the motion tracking calibration (MTC) operation, is performed based on an image captured by at least one camera, the image at least partially depicting the pavement or the ground in the vicinity of the motor vehicle and a texture (such as asphalt or gravel) present on its surface, but no displayed objects. Using the road surface texture shown in those images, the computing device is further adapted to calibrate the camera; in particular, no specific features of interest, such as corners, contours, edges, or lines, need to be present in the image.[Pg 6, P-2]
Here, the motor vehicle 1 travels along the road 10 parallel to an elongated object 12. In the present situation, the object 12 is a curb; the object 12 can also be a wall, a ditch, vegetation, or a standing vehicle. When these objects 12 are present in the images captured by the camera 4, the calibration may be biased. Therefore, elongated environmental objects 12 or features above or below the average ground level beside the road 10 are to be detected. No attempt is made to classify or characterize the objects; their presence is merely detected in a pair of image frames obtained, e.g., from the uncalibrated camera 4 on the vehicle 1 while the vehicle 1 travels substantially straight and parallel to the curb or object 12. When the elongated feature 12 is identified, each such frame may be marked as an unsuitable frame and is therefore rejected, so as to avoid a calibration bias produced by the presence of the object 12. In order to detect these objects 12, the computing device 3 is adapted to perform the method shown schematically in FIG. 2.[Pg 8, P-5]
Here, we see a vehicle sensor, such as a camera, accounting for a moving vehicle’s environment and capturing the elements along the edge of the vehicle’s path/road, including a wall, a curb, and the pavement.
Therefore, when combined with Kume’s selective display of lane line icons and road edge icons representative of the closest detected elements along the vehicle’s path, it would have been obvious to one of ordinary skill in the art to enable the display of the edge icon closest to the vehicle’s path, i.e., a pavement edge, a curb, or a wall.
Thereby, the combination enables displaying a road edge structure icon corresponding to the road edge structure closest to the host vehicle, among the recognized multiple types of road edge structures, on the in-vehicle display, and not displaying road edge structure icons corresponding to the remaining road edge structures on the in-vehicle display, when multiple types of road edge structures are recognized on either the left or right side of the host vehicle.
Hence, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Ermilios’ teaching with Kume’s teaching in order to provide a more accurate visual representation of the vehicle’s surroundings along its path, thereby improving safety.
In regards to claim 6, Kume as modified teaches a display control unit configured to display a road edge structure icon corresponding to the road edge structure next closest to the host vehicle, among the recognized multiple types of road edge structures, on the in-vehicle display when the road edge structure closest to the host vehicle is lost and multiple types of road edge structures are recognized on either the left or right side of the host vehicle.
Kume teaches the selective display of different icons along the vehicle’s path in different scenarios, in particular the configuration to display one icon when another is unrecognized (Paragraphs 71, 103, Kume)
The line icons PIn include a recognition line Li1 and an unrecognized line Li2. The recognition line Li1 is a line icon PIn corresponding to a lane marking (or a road edge) recorded in the map data and recognized by the peripheral monitoring sensor 30. The unrecognized line Li2 is a line icon PIn corresponding to a lane marking that is recorded in the map data but is not recognized by the peripheral monitoring sensor 30. The line icon PIn arranged in the vicinity of the host vehicle icon Pv is the recognition line Li1. On the other hand, the line icon PIn drawn at a position away from the host vehicle icon Pv is the unrecognized line Li2.[P-71]
The 3D drawing unit 73 hides the line icon PIn or the road edge icons Pre away from the driver's vehicle lane VLs, and then displays the pedestrian icon Pa (see FIG. 9) when the warning target information in which the warning target such as a pedestrian is detected is acquired. The 3D drawing unit 73 increases the number of line icons PIn drawn in the peripheral image Dpm in accordance with a start of display of the pedestrian icon Pa. The 3D drawing unit 73 redisplays the road edge icons Pre or the line icon PIn in the vicinity of the pedestrian icon Pa.[P-103]
Hence, this teaching is combined with Ermilios’ teaching of capturing images representative of the vehicle’s surroundings as the vehicle travels along its path, including pavements, curbs, and walls (Page 3, Paragraph 4; Page 6, Paragraph 2; Page 8, Paragraph 5, Ermilios)
In order to provide reliable and effective operation of the calibration method, an object of the method of the invention is to detect the presence of elongated environmental features or objects above or below the average ground level beside the road, such as a wall, curb, ditch, vegetation, or a standing vehicle, because they have a negative influence on the calibration method. Here, it is not necessary to classify or characterize these objects; only their presence is detected in a pair of image frames obtained from at least one camera mounted on the vehicle while the vehicle travels substantially straight and parallel to the object, such as a curb. Thus, an autonomous, road-based calibration method can be provided, based on the images captured by the camera, that is not degraded or biased by the presence of such objects. In particular, an image in which such elongated features are identified is marked as an unsuitable frame and is rejected. The method can be performed by a vehicle-side computing device that identifies the object based on at least two images captured by at least one camera.[Pg3, P-4]
In particular, external calibration of the camera, referred to as the motion tracking calibration (MTC) operation, is performed based on an image captured by at least one camera, the image at least partially depicting the pavement or the ground in the vicinity of the motor vehicle and a texture (such as asphalt or gravel) present on its surface, but no displayed objects. Using the road surface texture shown in those images, the computing device is further adapted to calibrate the camera; in particular, no specific features of interest, such as corners, contours, edges, or lines, need to be present in the image.[Pg 6, P-2]
Here, the motor vehicle 1 travels along the road 10 parallel to an elongated object 12. In the present situation, the object 12 is a curb; the object 12 can also be a wall, a ditch, vegetation, or a standing vehicle. When these objects 12 are present in the images captured by the camera 4, the calibration may be biased. Therefore, elongated environmental objects 12 or features above or below the average ground level beside the road 10 are to be detected. No attempt is made to classify or characterize the objects; their presence is merely detected in a pair of image frames obtained, e.g., from the uncalibrated camera 4 on the vehicle 1 while the vehicle 1 travels substantially straight and parallel to the curb or object 12. When the elongated feature 12 is identified, each such frame may be marked as an unsuitable frame and is therefore rejected, so as to avoid a calibration bias produced by the presence of the object 12. In order to detect these objects 12, the computing device 3 is adapted to perform the method shown schematically in FIG. 2.[Pg 8, P-5]
Thereby, it would have been obvious to one of ordinary skill in the art to substitute one edge icon for another in Kume’s selective display of icons in different scenarios, allowing the display to be selectively configured to display a road edge structure icon corresponding to the road edge structure next closest to the host vehicle, among the recognized multiple types of road edge structures, on the in-vehicle display when the road edge structure closest to the host vehicle is lost and multiple types of road edge structures are recognized on either the left or right side of the host vehicle.
In regards to claim 7, Kume as modified by Ermilios teaches a pavement edge recognition unit configured to recognize a pavement edge of a road positioned on either the left or right side of the host vehicle based on a detection result of a sensor of the host vehicle (Page 6, Paragraph 2, Ermilios)
In particular, external calibration of the camera, referred to as the motion tracking calibration (MTC) operation, is performed based on an image captured by at least one camera, the image at least partially depicting the pavement or the ground in the vicinity of the motor vehicle and a texture (such as asphalt or gravel) present on its surface, but no displayed objects. Using the road surface texture shown in those images, the computing device is further adapted to calibrate the camera; in particular, no specific features of interest, such as corners, contours, edges, or lines, need to be present in the image.[Pg 6, P-2]
Here, we see a pavement recognition device, such as a camera, and an illustration of the pavement displayed within the vehicle.
Furthermore, Kume teaches displaying one icon and not the other icon (Paragraph 103, Kume)
The 3D drawing unit 73 hides the line icon PIn or the road edge icons Pre away from the driver's vehicle lane VLs, and then displays the pedestrian icon Pa (see FIG. 9) when the warning target information in which the warning target such as a pedestrian is detected is acquired. The 3D drawing unit 73 increases the number of line icons PIn drawn in the peripheral image Dpm in accordance with a start of display of the pedestrian icon Pa. The 3D drawing unit 73 redisplays the road edge icons Pre or the line icon PIn in the vicinity of the pedestrian icon Pa.[P-103]
Thereby, combining both teachings enables a selective configuration such that the display control unit is configured to display a pavement edge icon corresponding to the pavement edge on the in-vehicle display, and not to display the road edge structure icon, when the pavement edge is recognized.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANTHONY D AFRIFA-KYEI whose telephone number is (571)270-7826. The examiner can normally be reached Monday-Friday 10am-7pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, BRIAN ZIMMERMAN can be reached at 571-272-3059. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ANTHONY D AFRIFA-KYEI/Examiner, Art Unit 2686
/BRIAN A ZIMMERMAN/Supervisory Patent Examiner, Art Unit 2686