Notice of Pre-AIA or AIA Status
The present application is being examined under the pre-AIA first to invent provisions.
This Office action is in response to the application filed 06/10/2024, in which claims 1-38 are pending.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 06/10/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Rejections - 35 USC § 112
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 3, 22, and 33 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.
Claims 3, 22, and 33 recite "wherein the driver-viewing camera comprises a thermal sensing camera". The specification only provides support for the forward-viewing camera comprising a thermal sensing camera (as claimed in claims 2, 21, and 32), and not for the driver-viewing camera comprising a thermal sensing camera. Para[0046] on page 16 recites: "Optionally, the system may be operable to distinguish dead animals from live animals (live animals move while dead animals do not, and live animals are warm and dead animals typically are not; and this may be detected by a heat sensing device or visible or near-infrared or thermal-infrared sensors or the like). Thus, the vision system may detect and identify animals on the road or in the path of travel of the vehicle, and may provide an alert or may take evasive action to avoid the detected animal."
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of pre-AIA 35 U.S.C. 103(a) which forms the basis for all obviousness rejections set forth in this Office action:
(a) A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under pre-AIA 35 U.S.C. 103(a) are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1 and 8-20 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Niibe et al. (JP 2002019491 A) (machine translation attached) in view of Lu et al. (US 2008/0129541 A1) and Saka et al. (JP 2010009372 A) (machine translation attached), and further in view of Vincent et al. (US 2009/0314942 A1).
Regarding claim 1, Niibe discloses a vehicular control system, the vehicular control system comprising: a forward-viewing camera disposed (para[0029] teaches reference numeral 12 denotes an infrared camera that uses infrared rays to capture images of the area ahead of vehicle C); wherein the forward-viewing camera is operable to capture image data (para[0029] teaches reference numeral 12 denotes an infrared camera that uses infrared rays to capture images of the area ahead of vehicle C); a gaze detection device that views a driver of the equipped vehicle, wherein the gaze detection device comprises a driver-viewing camera that is operable to capture image data (Para[0026] teaches a driver imaging camera 62 that captures the infrared light projected from the infrared projector lamp 61 reflected off the head and face; para[0040] teaches as shown in Figure 8, the driver monitor unit 63 extracts an image of the driver's head and face by performing general binarization processing and feature point extraction processing in the image processing unit 63a based on the video signal output from the driver imaging camera 62, and based on the extracted image of the head and face, the gaze point detection unit 63b detects the driver's head and face direction, gaze direction and pupil diameter, and detects the driver's gaze point (center point of the field of view range) from the gaze direction); an image processor operable to process image data captured by the forward-viewing camera as the equipped vehicle travels along a road (para[0038] teaches the CPU 20 receives signals from each element...
infrared camera 12); wherein image data captured by the gaze detection device is processed to detect a gaze direction of the driver of the equipped vehicle (Para[0040]-[0041] teaches the gaze direction axis detection unit 63c determines the line connecting the detected driver's gaze point and the center of the driver's left and right eyes as the axis of the driver's gaze direction, and outputs this axis of the driver's gaze direction to the CPU 20, The detection process for the axis of the driver's line of sight in the driver monitor unit 63 is performed as shown in FIG. That is, in the first step SA1, infrared light is projected onto the head and face of the driver by the infrared projector lamp 61, and an analog video signal of the head and face captured by the driver imaging camera 62 is taken into the image processing unit 63a, and the video signal is subjected to a general binarization process to convert it into digital multi-value image data for each pixel); wherein the vehicular control system, via processing by the image processor of captured image data, detects an object present ahead of the equipped vehicle and being gazed at by the driver of the equipped vehicle (Para[0016], [0060] teaches the information acquisition means has an obstacle detection means for detecting an obstacle present in front of the vehicle, and the control means is configured to, when an obstacle is detected in front of the vehicle by the obstacle detection means, set the display position of information regarding the obstacle by the display means to a position on or near the axis of the driver's line of sight, If the determination in step SC8 is NO, the process returns, whereas if the determination is YES, the display position of the information about the obstacle is gradually moved while guiding the driver's line of sight in the direction of the obstacle, and then the process returns. 
para[0069] teaches In the next step SE3, it is determined based on the detection data from the forward obstacle radar 11 whether or not an obstacle exists within a first predetermined distance from the vehicle C on the travel path); and wherein the vehicular control system, responsive at least in part to the detected gaze direction of the driver of the equipped vehicle, and via processing by the image processor of captured image data, classifies the detected object as being one selected from the group consisting of (i) a pedestrian (para[0098] teaches First, in the first step SG1, information on obstacles (accidents, objects fallen on the road, traffic congestion information, etc.) on the road ahead of vehicle C is input from the road-to-vehicle communication unit 15, and in the next step SG2, it is determined whether or not there are any obstacles on the road ahead. para[0189] teaches forward obstacle radar 11 detects an obstacle such as a vehicle ahead or a pedestrian).
Niibe does not explicitly disclose a forward-viewing camera disposed at an in-cabin side of a windshield of a vehicle equipped with the vehicular control system, the forward-viewing camera viewing forward of the equipped vehicle through the windshield of the equipped vehicle; and, via processing by the image processor of captured image data, classifying the detected object as being one selected from the group consisting of (i) a pedestrian, (ii) a live animal and (iii) a dead animal. However, Lu discloses a forward-viewing camera disposed at an in-cabin side of a windshield of a vehicle equipped with the vehicular control system, the forward-viewing camera viewing forward of the equipped vehicle through the windshield of the equipped vehicle (Para[0039] teaches the imaging sensor or camera preferably has a forward field of view through the windshield of the vehicle and preferably through an area cleaned or wiped by a windshield wiper of the vehicle when the windshield wiper is activated; the imaging sensor is disposed at an interior rearview mirror assembly of the vehicle and has a forward field of view through the windshield of the vehicle). It would have been obvious to one having ordinary skill in the art at the time the invention was made to combine the system of Niibe, in which the direction of the driver's eyes is detected and the kind of information to be recognized by the driver is judged, with the forward-viewing camera of Lu, which views in the direction of forward travel of the vehicle through the windshield, in order to provide a system operable to detect the surface in front of the vehicle in response to said image processing.
Niibe in view of Lu does not explicitly disclose, via processing by the image processor of captured image data, classifying the detected object as being one selected from the group consisting of (i) a pedestrian, (ii) a live animal and (iii) a dead animal. However, Saka discloses, via processing by the image processor of captured image data, classifying the detected object as being one selected from the group consisting of (i) a pedestrian and (ii) an animal (Para[0023] teaches next, when it is determined that the object is not an artificial structure (FIG. 4 / STEP 44... NO), it is determined whether or not the object corresponds to a pedestrian (FIG. 4 / STEP 45); specifically, it is determined whether or not the object corresponds to a pedestrian from characteristics such as the shape and size of the image area of the object on the gray scale image and luminance dispersion; then, when it is determined that the object corresponds to a pedestrian (FIG. 4 / STEP 45... YES), the detected object is determined as a notification or alerting target (FIG. 4 / STEP 46). Para[0024] teaches on the other hand, when it is determined that the object is an artificial structure (FIG. 4 / STEP 44... YES), or when it is determined that the object is not a pedestrian (FIG. 4 / STEP 45... NO), it is determined whether or not it corresponds to a quadruped animal (FIG. 4 / STEP 47); when it is determined that the target object corresponds to a quadruped animal (FIG. 4 / STEP 47... YES), the image processing unit 1 determines the detected target object as a report target (FIG. 4 / STEP 46); on the other hand, when it is determined that the object does not correspond to a quadruped animal (FIG. 4 / STEP 47... NO), the image processing unit 1 excludes the detected object from the report target (FIG. 4 / STEP 48)).
It would have been obvious to one having ordinary skill in the art at the time the invention was made to combine the system of Niibe in view of Lu, which detects the direction of the driver's eyes, detects an obstacle such as a vehicle ahead or a pedestrian, displays information about the obstacle on or near the axis of the driver's line of sight, and then guides the driver's line of sight in the direction of the obstacle so that the driver can quickly and reliably confirm the presence of the obstacle, further improving safety, with the method of Saka of determining whether a detected object corresponds to a pedestrian or a quadruped animal, in order to provide a system in which the recognition accuracy can be improved.
Niibe in view of Lu and Saka does not explicitly disclose classifying the detected object as being one selected from the group consisting of (i) a pedestrian, (ii) a live animal and (iii) a dead animal. However, Vincent discloses classifying the detected object as being one selected from the group consisting of (i) a pedestrian, (ii) a live animal and (iii) a dead animal (para[0029] teaches detecting the heat signature of objects in the area, which would allow one to distinguish between, for instance, living or dead animals based upon heat signature and keratin signature; Para[0058]-[0060] teaches the present invention may be used to detect the presence of humans or animals even in situations and environments where night vision or other detectors may not be able to function as effectively; in this same regard, the invention may be used to detect the presence of persons or animals near vehicles, and may be used as an onboard system to detect animals (living or non-living) in a vehicle's path, such as a child behind a car or a deer, dead or alive, near or in a roadway, and may even detect target persons or animals regardless of whether they exhibit a heat signature (such as in the case where the person or animal is not living)). It would have been obvious to one having ordinary skill in the art at the time the invention was made to combine the system of Niibe in view of Lu and Saka, which guides the driver's line of sight in the direction of a detected obstacle so that the driver can quickly and reliably confirm its presence and which determines whether a detected object corresponds to a pedestrian or a quadruped animal, with the method of Vincent of detecting the heat signature of objects in the area, in order to provide a system that can distinguish between, for instance, living and dead animals.
Regarding claim 8, Niibe discloses the vehicular control system of claim 1, wherein the vehicular control system is part of an adaptive cruise control system of the equipped vehicle (para[0028] teaches auto cruise and collision warning system with distance maintenance function (ICCW (Intelligent Cruise Control & Collision Warning))).
Regarding claim 9, Niibe discloses the vehicular control system of claim 1, wherein the vehicular control system is operable to adjust an adaptive cruise control system of the equipped vehicle responsive to determination of a road condition ahead of the equipped vehicle (para[0030] teaches 15 is a road-to-vehicle communication unit that receives information regarding road conditions, etc. ahead of the vehicle C's driving path (information on accidents, fallen objects on the road, traffic jams, etc.) transmitted from an information providing device (infrastructure) outside the vehicle C, and this road-to-vehicle communication unit 15 is used to obtain information regarding road conditions, etc. outside the vehicle C (information communication system). Para[0098]-[0099] teaches if the determination is YES, the information provision flag Fe1 is set to 1, and the left and right front speakers 29, 30 are driven to output a single artificial sound and the process returns. Para[0102]-[0104] teaches if the determination in step SG23 is YES, the process proceeds to step SG24, where an attention-calling display such as that shown in FIG. 28 (in the case of traffic congestion information) is displayed on the windshield 7, and the process returns; information regarding obstacles on the road ahead, obtained by the road-to-vehicle communication unit 15, can be provided to the driver. Para[0155]-[0157]).
Regarding claim 11, Niibe discloses the vehicular control system of claim 1, wherein the vehicular control system estimates a separation gap between the equipped vehicle and a leading vehicle traveling along the road ahead of the equipped vehicle (para[0028] teaches In Figure 3, reference numeral 11 denotes a forward obstacle radar 11, which consists of a laser radar, millimeter wave radar, etc., that detects obstacles in front of vehicle C and measures the positional relationship and distance between vehicle C and the obstacle).
Regarding claim 12, Niibe discloses the vehicular control system of claim 11, wherein, at least in part via processing by the image processor of image data captured by the forward-viewing camera, the vehicular control system determines a target separation gap between the equipped vehicle and the leading vehicle (Para[0028] teaches vehicle C travels while maintaining a constant distance from the vehicle in front of vehicle C (preceding vehicle)), and wherein the vehicular control system controls the equipped vehicle to maintain the separation gap between the equipped vehicle and the leading vehicle at the determined target separation gap (para[0057] teaches the driver has performed a danger avoidance operation at the following times: in the ICCW, a safe inter-vehicle distance has been achieved; para[0072]-[0074] teaches in step SE10, the acceleration/deceleration means 52 is driven and controlled (driving while maintaining a safe distance) so that the distance L between the vehicle C and the obstacle (vehicle ahead) becomes a preset distance (which may be set by the driver), and the process returns. Then, in step SE16, the information provision flag Fc1 and the primary warning flag Fc2 are both reset to 0, and then in step SE17, the brake unit 18 of the acceleration/deceleration means 52 is activated (automatically braked) so that the distance L between the vehicle C and the obstacle approaches a predetermined distance (which may be set by the driver), and the process returns).
Regarding claim 13, Niibe discloses the vehicular control system of claim 12, wherein the determined target separation gap is adjusted based on a current driving condition (para[0074] teaches the brake unit 18 of the acceleration/deceleration means 52 is activated (automatically braked) so that the distance L between the vehicle C and the obstacle approaches a predetermined distance (which may be set by the driver), and the process returns).
Regarding claim 14, Niibe discloses the vehicular control system of claim 12, wherein the vehicular control system adjusts the determined target separation gap based on a driving capability of the driver of the equipped vehicle (para[0082] teaches inter-vehicle distance information is provided as decision support information for the driver, a warning is given to the driver if there is a high risk of collision, and if the driver does not take appropriate evasive action, automatic braking is performed to avoid a collision.).
Regarding claim 15, Niibe discloses the vehicular control system of claim 14, wherein the driving capability of the driver of the equipped vehicle is determined at least in part by processing by the image processor of image data captured by the gaze detection device (Para[0058] teaches when the driver performs a danger avoidance operation, even if the information is type 2, the driver is already aware of the information, so as will be described later (see step SC7), there is no need to display the information on the windshield 7 on the axis of the driver's line of sight or in a position nearby, but rather it is displayed in the basic display position. Furthermore, even if the display position of the information was set on the axis of the driver's line of sight on the windshield or in a position close to it in the previous step SC7, if the driver's danger avoidance operation is detected, the display position of the information will return to the basic display position, Para[0189] teaches when the forward obstacle radar 11 detects an obstacle such as a vehicle ahead or a pedestrian, information about the obstacle is displayed on the axis of the driver's line of sight or in a position close to it, and then the radar moves while guiding the driver's line of sight in the direction of the obstacle, allowing the driver to quickly and reliably confirm the presence of the obstacle, further improving safety).
Regarding claim 16, Niibe discloses the vehicular control system of claim 12, wherein the vehicular control system adjusts the determined target separation gap at least in part responsive to determination of the change in traction condition ahead of the equipped vehicle (para[0057] teaches in the curve entry speed warning system, the vehicle has slowed down to a safe speed; in the ICCW, a safe inter-vehicle distance has been achieved).
Regarding claim 17, Niibe discloses the vehicular control system of claim 16, wherein the vehicular control system increases the determined target separation gap responsive at least in part to the determined change in traction condition being indicative of a reduction in traction at the road ahead of the equipped vehicle (Para[0057] teaches That is, in the curve entry speed warning system, it is determined that the driver has performed a danger avoidance operation when: in the curve entry speed warning system, the vehicle has slowed down to a safe speed; in the ICCW, a safe inter-vehicle distance has been achieved; in the ICCW, the relative speed of vehicle C to the vehicle in front becomes negative (the direction of approach is positive); or when an object no longer exists in the path of vehicle C due to vehicle C or the vehicle in front changing lanes, etc).
Regarding claim 18, Niibe discloses the vehicular control system of claim 16, wherein the vehicular control system increases the determined target separation gap responsive at least in part to determination of a curve in the road ahead of the equipped vehicle (Para[0057] teaches That is, in the curve entry speed warning system, it is determined that the driver has performed a danger avoidance operation when: in the curve entry speed warning system, the vehicle has slowed down to a safe speed; in the ICCW, a safe inter-vehicle distance has been achieved; in the ICCW, the relative speed of vehicle C to the vehicle in front becomes negative (the direction of approach is positive); or when an object no longer exists in the path of vehicle C due to vehicle C or the vehicle in front changing lanes, etc.).
Regarding claim 19, Niibe discloses the vehicular control system of claim 1, wherein image data captured by the forward-viewing camera is processed for at least one driving assist system of the equipped vehicle (para[0111] teaches images of the driving scene ahead in adverse environments (such as at night, during thick fog, or during rain) visualized using the infrared camera 12 can be provided as information to assist the driver's awareness).
Regarding claim 20, Niibe discloses the vehicular control system of claim 19, wherein the at least one driving assist system of the equipped vehicle comprises at least one selected from the group consisting of (i) a lane departure warning system of the equipped vehicle (Para[0057] teaches in the lane departure warning system, it is determined that the driver has performed a danger avoidance operation when: the steering wheel 6 is steered in the direction to avoid departure; the steering wheel 6 or the brake is operated; or when vehicle C returns to the center of the lane, FIG. 50 shows the control processing operation of the lane departure warning system), (ii) a forward collision warning system of the equipped vehicle (Para[0028] teaches collision warning system with distance maintenance function, para[0082] teaches Through the above control operations, inter-vehicle distance information is provided as information to assist the driver in making decisions, a warning is given to the driver if there is a high risk of collision, and automatic braking is performed to avoid a collision if the driver does not take appropriate evasive action ), (iii) a headlamp control system of the equipped vehicle and (iv) a traffic sign recognition system of the equipped vehicle (para[0132] –[0133] teaches indicate the presence of a stop intersection, para[0139] –[0143] teaches when the distance Da from the vehicle C to the stop intersection is smaller than the second predetermined value D2, the first display section 8 within the display frame 5 displays "Slow down and stop" to prompt the driver to slow down and stop).
Claims 2, 21, and 27-31 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Niibe et al. (JP 2002019491 A) (machine translation attached) in view of Lu et al. (US 2008/0129541 A1) and Saka et al. (JP 2010009372 A) (machine translation attached), further in view of Vincent et al. (US 2009/0314942 A1), and further in view of Groves et al. (US 5,414,439).
Regarding claim 2, Niibe in view of Lu and Saka and further in view of Vincent discloses the vehicular control system of claim 1. Niibe in view of Lu and Saka and further in view of Vincent does not explicitly disclose wherein the forward-viewing camera comprises a thermal sensing camera. However, Groves discloses wherein the forward-viewing camera comprises a thermal sensing camera (Col 3 lines 3-13 teaches the camera senses the thermal pattern or the image of infrared radiation in its field of view; preferably the camera should be sensitive to radiation in the mid-infrared range of about 8-12 micrometres; since hot objects radiate more infrared energy than cool objects, the hot or warm objects will yield stronger signals; thus animals, including humans, which are warm, show up prominently in the image). It would have been obvious to one having ordinary skill in the art at the time the invention was made to combine the system of Niibe in view of Lu and Saka and further in view of Vincent, which guides the driver's line of sight in the direction of a detected obstacle so that the driver can quickly and reliably confirm its presence and which classifies a detected object as a pedestrian or a quadruped animal by detecting the heat signature of objects in the area, with the thermal camera of Groves, which produces a video signal representing the thermal pattern of the scene, in order to provide a system that attracts the driver's attention to the warmest objects, which are generally the most important ones to be made aware of.
Regarding claim 21, Niibe discloses a vehicular control system, the vehicular control system comprising: a forward-viewing camera (para[0029] teaches reference numeral 12 denotes an infrared camera that uses infrared rays to capture images of the area ahead of vehicle C); wherein the forward-viewing camera is operable to capture image data (para[0029] teaches reference numeral 12 denotes an infrared camera that uses infrared rays to capture images of the area ahead of vehicle C); a gaze detection device that views a driver of the equipped vehicle, wherein the gaze detection device comprises a driver-viewing camera that is operable to capture image data (Para[0026] teaches a driver imaging camera 62 that captures the infrared light projected from the infrared projector lamp 61 reflected off the head and face; para[0040] teaches as shown in Figure 8, the driver monitor unit 63 extracts an image of the driver's head and face by performing general binarization processing and feature point extraction processing in the image processing unit 63a based on the video signal output from the driver imaging camera 62, and based on the extracted image of the head and face, the gaze point detection unit 63b detects the driver's head and face direction, gaze direction and pupil diameter, and detects the driver's gaze point (center point of the field of view range) from the gaze direction); an image processor operable to process image data captured by the forward-viewing camera as the equipped vehicle travels along a road (para[0038] teaches the CPU 20 receives signals from each element...
infrared camera 12); wherein image data captured by the gaze detection device is processed to detect a gaze direction of the driver of the equipped vehicle (Para[0040]-[0041] teaches the gaze direction axis detection unit 63c determines the line connecting the detected driver's gaze point and the center of the driver's left and right eyes as the axis of the driver's gaze direction, and outputs this axis of the driver's gaze direction to the CPU 20, The detection process for the axis of the driver's line of sight in the driver monitor unit 63 is performed as shown in FIG. That is, in the first step SA1, infrared light is projected onto the head and face of the driver by the infrared projector lamp 61, and an analog video signal of the head and face captured by the driver imaging camera 62 is taken into the image processing unit 63a, and the video signal is subjected to a general binarization process to convert it into digital multi-value image data for each pixel); wherein the vehicular control system, via processing by the image processor of captured image data, detects an object present ahead of the equipped vehicle and being gazed at by the driver of the equipped vehicle (Para[0016], [0060] teaches the information acquisition means has an obstacle detection means for detecting an obstacle present in front of the vehicle, and the control means is configured to, when an obstacle is detected in front of the vehicle by the obstacle detection means, set the display position of information regarding the obstacle by the display means to a position on or near the axis of the driver's line of sight, If the determination in step SC8 is NO, the process returns, whereas if the determination is YES, the display position of the information about the obstacle is gradually moved while guiding the driver's line of sight in the direction of the obstacle, and then the process returns. 
para[0069] teaches In the next step SE3, it is determined based on the detection data from the forward obstacle radar 11 whether or not an obstacle exists within a first predetermined distance from the vehicle C on the travel path); wherein the vehicular control system, responsive at least in part to the detected gaze direction of the driver of the equipped vehicle, and via processing by the image processor of captured image data, classifies the detected object as being one selected from the group consisting of (i) a pedestrian (para[0098] teaches First, in the first step SG1, information on obstacles (accidents, objects fallen on the road, traffic congestion information, etc.) on the road ahead of vehicle C is input from the road-to-vehicle communication unit 15, and in the next step SG2, it is determined whether or not there are any obstacles on the road ahead. para[0189] teaches forward obstacle radar 11 detects an obstacle such as a vehicle ahead or a pedestrian); and wherein image data captured by the forward-viewing camera is processed for at least one driving assist system of the equipped vehicle.
Niibe does not explicitly disclose a forward-viewing camera disposed at an in-cabin side of a windshield of a vehicle equipped with the vehicular control system, the forward-viewing camera viewing forward of the equipped vehicle through the windshield of the equipped vehicle; wherein the forward-viewing camera comprises a thermal sensing camera; and via processing by the image processor of captured image data, classifies the detected object as being one selected from the group consisting of (i) a pedestrian, (ii) a live animal and (iii) a dead animal. However Lu discloses a forward-viewing camera disposed at an in-cabin side of a windshield of a vehicle equipped with the vehicular control system, the forward-viewing camera viewing forward of the equipped vehicle through the windshield of the equipped vehicle (Para[0039] teaches the imaging sensor or camera preferably has a forward field of view through the windshield of the vehicle and preferably through an area cleaned or wiped by a windshield wiper of the vehicle when the windshield wiper is activated; the imaging sensor is disposed at an interior rearview mirror assembly of the vehicle and has a forward field of view through the windshield of the vehicle). It would have been obvious to one having ordinary skill in the art before the effective filing date of the invention to combine the method of Niibe, in which the direction of the driver's eyes is detected and the kind of information to be recognized by the driver is judged, with the method of Lu of imaging a forward field of view in the direction of forward travel of the vehicle, in order to provide a system operable to detect a surface in front of the vehicle in response to said image processing.
Niibe in view of Lu does not explicitly disclose wherein the forward-viewing camera comprises a thermal sensing camera; and via processing by the image processor of captured image data, classifies the detected object as being one selected from the group consisting of (i) a pedestrian, (ii) a live animal and (iii) a dead animal. However Saka discloses, via processing by the image processor of captured image data, classifying the detected object as being one selected from a group consisting of (i) a pedestrian and (ii) an animal (Para[0023] teaches next, when it is determined that the object is not an artificial structure (FIG. 4 / STEP 44... NO), it is determined whether or not the object corresponds to a pedestrian (FIG. 4 / STEP 45). Specifically, it is determined whether or not the object corresponds to a pedestrian from characteristics such as the shape and size of the image area of the object on the gray scale image and luminance dispersion. Then, when it is determined that the object corresponds to a pedestrian (FIG. 4 / STEP 45... YES), the detected object is determined as a notification or alerting target (FIG. 4 / STEP 46). [0024] On the other hand, when it is determined that the object is an artificial structure (FIG. 4 / STEP 44... YES), or when it is determined that the object is not a pedestrian (FIG. 4 / STEP 45... NO), it is determined whether or not the object corresponds to a quadruped animal (FIG. 4 / STEP 47). When it is determined that the target object corresponds to a quadruped animal (FIG. 4 / STEP 47... YES), the image processing unit 1 determines the detected target object as a report target (FIG. 4 / STEP 46). On the other hand, when it is determined that the object does not correspond to a quadruped animal (FIG. 4 / STEP 47... NO), the image processing unit 1 excludes the detected object from the report target (FIG. 4 / STEP 48)).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the invention to combine the method of Niibe in view of Lu, in which the direction of the driver's eyes is detected and the kind of information to be recognized by the driver is judged, an obstacle such as a vehicle ahead or a pedestrian is detected, information about the obstacle is displayed on the axis of the driver's line of sight or in a position close to it, and the displayed information then moves while guiding the driver's line of sight in the direction of the obstacle, allowing the driver to quickly and reliably confirm the presence of the obstacle and further improving safety, with the method of Saka of detecting whether the object corresponds to a pedestrian or a quadruped animal, in order to provide a system in which the recognition accuracy can be improved.
Niibe in view of Lu and Saka does not explicitly disclose wherein the forward-viewing camera comprises a thermal sensing camera; and classifies the detected object as being one selected from the group consisting of (i) a pedestrian, (ii) a live animal and (iii) a dead animal. However Vincent discloses classifying the detected object as being one selected from the group consisting of (i) a pedestrian, (ii) a live animal and (iii) a dead animal (para[0029] teaches detecting the heat signature of objects in the area, which would allow one to distinguish between, for instance, living or dead animals based upon heat signature and keratin signature; Para[0058]-[0060] teaches the present invention may be used to detect the presence of humans or animals even in situations and environments where night vision or other detectors may not be able to function as effectively. In this same regard, the invention may be used to detect the presence of persons or animals near vehicles, may be used as an onboard system to detect animals (living or non-living) in a vehicle's path, such as a child behind a car or a deer, dead or alive, near or in a roadway, and may even detect target persons or animals regardless of whether they exhibit a heat signature (such as in the case where the person or animal is not living)). It would have been obvious to one having ordinary skill in the art before the effective filing date of the invention to combine the method of Niibe in view of Lu and Saka, which guides the driver's line of sight in the direction of the obstacle, allowing the driver to quickly and reliably confirm the presence of the obstacle, and detects whether the object corresponds to a pedestrian or quadruped animal, further improving safety, with the method of Vincent of detecting the heat signature of objects in the area, in order to provide a system that can distinguish between, for instance, living and dead animals.
Niibe in view of Lu, Saka, and Vincent does not explicitly disclose wherein the forward-viewing camera comprises a thermal sensing camera. However Groves discloses wherein the forward-viewing camera comprises a thermal sensing camera (col 3 lines 5-13 teaches the camera senses the thermal pattern or the image of infrared radiation in its field of view. Preferably the camera should be sensitive to radiation in the mid-infrared range of about 8-12 micrometres. Since hot objects radiate more infrared energy than cool objects, the hot or warm objects will yield stronger signals. Thus animals, including humans, which are warm, show up prominently in the image). It would have been obvious to one having ordinary skill in the art before the effective filing date of the invention to combine the method of Niibe in view of Lu and Saka in further view of Vincent, which guides the driver's line of sight in the direction of the obstacle, allowing the driver to quickly and reliably confirm the presence of the obstacle, and detects whether the object corresponds to a pedestrian or quadruped animal by detecting the heat signature of objects in the area, with the method of Groves of producing a video signal representing the thermal pattern of the scene, in order to provide a system that attracts the driver's attention to the warmest objects, which are generally the most important ones to be made aware of.
Regarding claim 27, Niibe discloses the vehicular control system of claim 21, wherein the at least one driving assist system of the equipped vehicle comprises an adaptive cruise control system of the equipped vehicle (para[0028] teaches an auto cruise and collision warning system with distance maintenance function (ICCW (Intelligent Cruise Control & Collision Warning)); Para[0037] & Fig. 7 teaches 44 is an auto cruise main switch for activating the ICCW; FIG. 14 shows the control processing operation of the ICCW, which starts when the auto-cruise main switch 44 is turned on; Para[0070]).
Regarding claim 28, Niibe discloses the vehicular control system of claim 27, wherein, at least in part via processing by the image processor of image data captured by the forward-viewing camera, the vehicular control system determines a target separation gap between the equipped vehicle and a leading vehicle traveling along the road ahead of the equipped vehicle (Para[0028] teaches vehicle C travels while maintaining a constant distance from the vehicle in front of vehicle C (preceding vehicle)), and wherein the vehicular control system controls the equipped vehicle to maintain the separation gap between the equipped vehicle and the leading vehicle at the determined target separation gap (para[0057] teaches the driver has performed a danger avoidance operation at the following times: in the ICCW, a safe inter-vehicle distance has been achieved; para[0072]-[0074] teaches in step SE10, the acceleration/deceleration means 52 is driven and controlled (driving while maintaining a safe distance) so that the distance L between the vehicle C and the obstacle (vehicle ahead) becomes a preset distance (which may be set by the driver), and the process returns. Then, in step SE16, the information provision flag Fc1 and the primary warning flag Fc2 are both reset to 0, and then in step SE17, the brake unit 18 of the acceleration/deceleration means 52 is activated (automatically braked) so that the distance L between the vehicle C and the obstacle approaches a predetermined distance (which may be set by the driver), and the process returns).
Regarding claim 29, Niibe discloses the vehicular control system of claim 28, wherein the determined target separation gap is adjusted based on at least one selected from the group consisting of (i) a current driving condition (para[0082] teaches inter-vehicle distance information is provided as decision support information for the driver, a warning is given to the driver if there is a high risk of collision, and if the driver does not take appropriate evasive action, automatic braking is performed to avoid a collision) and (ii) determination of the change in traction condition ahead of the equipped vehicle (para[0057] teaches in the curve entry speed warning system, the vehicle has slowed down to a safe speed; in the ICCW, a safe inter-vehicle distance has been achieved).
Regarding claim 30, Niibe discloses the vehicular control system of claim 28, wherein the determined target separation gap is adjusted based on a driving capability of the driver of the equipped vehicle, and wherein the driving capability of the driver of the equipped vehicle is determined at least in part by processing by the image processor of image data captured by the gaze detection device (Para[0058] teaches when the driver performs a danger avoidance operation, even if the information is type 2, the driver is already aware of the information, so as will be described later (see step SC7), there is no need to display the information on the windshield 7 on the axis of the driver's line of sight or in a position nearby, but rather it is displayed in the basic display position. Furthermore, even if the display position of the information was set on the axis of the driver's line of sight on the windshield or in a position close to it in the previous step SC7, if the driver's danger avoidance operation is detected, the display position of the information will return to the basic display position, Para[0189] teaches when the forward obstacle radar 11 detects an obstacle such as a vehicle ahead or a pedestrian, information about the obstacle is displayed on the axis of the driver's line of sight or in a position close to it, and then the radar moves while guiding the driver's line of sight in the direction of the obstacle, allowing the driver to quickly and reliably confirm the presence of the obstacle, further improving safety).
Regarding claim 31, Niibe discloses the vehicular control system of claim 21, wherein the at least one driving assist system of the equipped vehicle comprises at least one selected from the group consisting of (i) a lane departure warning system of the equipped vehicle (Para[0057] teaches in the lane departure warning system, it is determined that the driver has performed a danger avoidance operation when: the steering wheel 6 is steered in the direction to avoid departure; the steering wheel 6 or the brake is operated; or when vehicle C returns to the center of the lane; FIG. 50 shows the control processing operation of the lane departure warning system), (ii) a forward collision warning system of the equipped vehicle (Para[0028] teaches collision warning system with distance maintenance function; para[0082] teaches through the above control operations, inter-vehicle distance information is provided as information to assist the driver in making decisions, a warning is given to the driver if there is a high risk of collision, and automatic braking is performed to avoid a collision if the driver does not take appropriate evasive action), (iii) a headlamp control system of the equipped vehicle, and (iv) a traffic sign recognition system of the equipped vehicle (para[0132]-[0133] teaches indicate the presence of a stop intersection; para[0139]-[0143] teaches when the distance Da from the vehicle C to the stop intersection is smaller than the second predetermined value D2, the first display section 8 within the display frame 5 displays "Slow down and stop" to prompt the driver to slow down and stop).
Claim 3 is rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Niibe et al. (JP 2002019491 A) (machine translation attached) in view of Lu et al. (US 2008/0129541 A1) and Saka et al. (JP 2010009372 A) (machine translation attached) in further view of Vincent et al. (US 2009/0314942 A1) and Graessley et al. (US 2009/0082951 A1).
Regarding claim 3, Niibe in view of Lu and Saka in further view of Vincent discloses the vehicular control system of claim 1. Niibe in view of Lu and Saka in further view of Vincent does not explicitly disclose wherein the driver-viewing camera comprises a thermal sensing camera. However Graessley discloses wherein the driver-viewing camera comprises a thermal sensing camera (para[0020] teaches an infrared sensor can be included in navigation system 102 to detect body heat from a passenger position or driver position). It would have been obvious to one having ordinary skill in the art before the effective filing date of the invention to combine the method of Niibe in view of Lu and Saka in further view of Vincent, which guides the driver's line of sight in the direction of the obstacle, allowing the driver to quickly and reliably confirm the presence of the obstacle, and detects whether the object corresponds to a pedestrian or quadruped animal by detecting the heat signature of objects in the area, with the method of Graessley, in which a difference in temperature can be used to determine whether a passenger is in the vehicle and operable to input data, in order to provide a system that improves the safety of the driver and the passenger by restricting the operation of a device that diverts the driver's attention away from the road.
Claims 4-5 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Niibe et al. (JP 2002019491 A) (machine translation attached) in view of Lu et al. (US 2008/0129541 A1) and Saka et al. (JP 2010009372 A) (machine translation attached) in further view of Vincent et al. (US 2009/0314942 A1) and Paulus et al. (DE 102006005021 A1) (machine translation attached).
Regarding claim 4, Niibe in view of Lu and Saka in further view of Vincent discloses the vehicular control system of claim 1. Niibe in view of Lu and Saka in further view of Vincent does not explicitly disclose wherein, at least in part responsive to the detected object being a dead animal, the vehicular control system determines a location of the dead animal relative to a path of travel of the equipped vehicle. However Paulus discloses wherein, at least in part responsive to the detected object being a dead animal, the vehicular control system determines a location of the dead animal relative to a path of travel of the equipped vehicle (para[0002]-[0004] teaches danger zones on or along roadways can be of the most different kinds, including dead animals; in a method for indicating a hazard location on or in a roadway, the invention provides that a warning signal containing warning information indicating the hazard location and position information describing the hazard location is generated by a signal generating device of a first motor vehicle during or after passing the hazard location; Para[0030] teaches if the driver detects a danger spot, e.g. a dead animal lying on the road or an object lost by another vehicle, etc.; para[0037] teaches while Fig. 2 describes an example of a danger point 14 in the form of an ice plate 15, the danger point 14 can be any desired danger point, e.g. a dead animal lying on the road, a cyclist, a pedestrian, a vehicle broken down at the side of the road, etc.). It would have been obvious to one having ordinary skill in the art before the effective filing date of the invention to use the method while guiding the dri