DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
The amendment filed on 12/05/2025 has been entered. Claims 1, 3-5, 7-8, 10-11, 13-14, and 16-19 are amended, claims 2, 9, 12, 15, and 20 are cancelled, and claims 21-24 are new. Claims 1, 3-8, 10-11, 13-14, 16-19, and 21-24 are pending in this application.
Claim Rejections - 35 USC § 112
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
Claims 1, 3-8, 10-11, 13-14, 16-19, and 21-24 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.
Independent claims 1, 14, and 19 recite that the dynamic object is illuminated with an illumination source “when the detection status of the dynamic object by the optical sensor includes dynamic object not detected during the low ambient lighting conditions and the object class determined includes a predetermined object class”. The requirement that both of these conditions be satisfied before the object is illuminated is not supported by applicant’s disclosure. Instead, applicant’s disclosure states that the dynamic object is illuminated “If the detection status for the object 70 is ‘not detected’ or if the object 70 is classified in a group that is automatically illuminated” ([0038]).
Further, one of ordinary skill in the art would not have concluded that the inventor had possession of a system that illuminates the object when both conditions are true. A determination that the object class is a predetermined object class can only occur if the object is able to be detected by the optical sensors. If the detection status of the dynamic object by the optical sensor includes that the dynamic object is not detected, it is not clear how the system would be able to determine the object’s class, as the object must be detected in order for its class to be determined. As a result, the conditions for illumination as presented by the independent claims could never occur; it is impossible for an object to be undetectable yet simultaneously detected and identified as belonging to a certain class. Therefore, for the mapping of prior art, the independent claims have been interpreted such that either “the detection status of the dynamic object by the optical sensor includes dynamic object not detected during the low ambient lighting conditions” or “the object class determined includes a predetermined object class” can separately trigger the illumination of the object.
The dependent claims are rejected under 35 U.S.C. 112(a) by virtue of their dependence from the rejected independent claims.
Additionally, claim 7 is further rejected as it states that the illumination is performed “by projecting a pattern of light including directional markers onto the dynamic object”. The directional markers being projected “onto the dynamic object” is not supported by applicant’s disclosure. Applicant’s disclosure instead teaches that “the adaptive head lamp system 50 can be used to project a pattern of light 80 [onto] the roadway in the direction of the object 70. In the illustrated example of FIG. 3, the pattern of light 80 can include at least one of ground highlighting 84 or directional markers 82” ([0040]). Further, as seen in Fig. 3, the directional markers are not projected onto the dynamic object, but rather are projected onto the road surface. This idea is reinforced by applicant’s disclosure referring to these directional markers as “ground highlighting and directional markers” ([0040]).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 3, 6, 14, 16, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Kwon et al. (US 20180170373 A1) in view of Hsu et al. (US 20230219488 A1) and Lindsay (US 9771021 B1).
Regarding claim 1, Kwon teaches a method of operating a vehicle, the method comprising:
identifying a dynamic object with a distance sensor on the vehicle and the distance sensor includes at least one of a radar sensor or a light detection and ranging sensor ([0045-0046], the capturer includes a radar sensor);
determining if an area surrounding the vehicle includes low ambient light conditions ([0073], detect illumination around the vehicle to determine if the vehicle is being driven in night or a dark region);
determining an object class for the dynamic object ([0055], determine type of object captured); …
and illuminating the dynamic object with an illumination source from the vehicle when the detection status of the dynamic object by the optical sensor includes dynamic object not detected during the low ambient lighting conditions ([0083], “irradiate light towards” the object),
wherein the illumination source includes a head lamp system on the vehicle ([0039]).
Kwon teaches that the illumination of the dynamic object always occurs during night conditions if the object is not currently in an illumination range of its headlight ([0078-80], illustrated in Figs. 6 & 7) in order to ensure that an image capturer can accurately detect objects within its range ([0080]). It does not teach determining a detection status of the dynamic object with an optical sensor during the low ambient light conditions, and illuminating the object based on this detection status.
In the same field of endeavor, Hsu describes a method of operating a vehicular lighting system for detecting objects. Hsu teaches determining a detection status of the dynamic object with an optical sensor during the low ambient light conditions ([0034]; [0094] and [0118], detection status is determined by analyzing whether the image quality obtained is less than a threshold level), and subsequently illuminating the object based on this detection status ([0094] and [0118], illumination occurs if image quality, i.e. detection status, is below a certain threshold).
A skilled artisan would have been able to integrate the step of determining the detection status into the invention of Kwon, which would allow the vehicle to determine if an object can be accurately detected by the image capturer before it is illuminated. This avoids a situation where the vehicle would illuminate an object that is out of its headlight illumination range even though the object is still able to be detected and identified by the image capturer.
Therefore, it would have been obvious to integrate the determination of an object’s detection status and consequent possible illumination into the method of Kwon for the motivation of avoiding unnecessary illumination of objects that are otherwise already able to be detected and identified by the vehicle. This avoids the vehicle wasting power on illuminating objects that are already sufficiently illuminated, such as a pedestrian that is located under a street lamp.
The prior combination does not teach illuminating the object when the object class determined includes a predetermined object class.
In the same field of endeavor, Lindsay teaches that illuminating an object detected by a vehicle occurs when the object class determined includes a predetermined object class (Col. 12, lines 52-55 and Col. 13, lines 13-18; see Fig. 6, where only an object being determined to be of a pedestrian class leads to it being illuminated).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the prior combination to illuminate certain objects for the motivation, as taught by Lindsay, of illuminating detected pedestrians without blinding them, thus avoiding potentially hazardous conditions (Col. 14, lines 11-19). Illuminating certain objects allows the conditions of illumination to be controlled so that illumination occurs in a manner that does not blind pedestrians.
Regarding claim 3, the prior art remains as applied in claim 1. Kwon further teaches that the method includes:
determining a direction of travel of the dynamic object relative to the vehicle when the detection status of the dynamic object by the optical sensor includes that the dynamic object is detected prior to illuminating the dynamic object with the illumination source ([0086-0088], the object is detected, its direction of travel is determined, the risk of collision is determined to be high, and the vehicle then irradiates light toward the corresponding object).
Regarding claim 6, the prior art remains as applied in claim 1. Kwon further teaches:
determining if an area surrounding the vehicle includes the low ambient light conditions by measuring an ambient light condition in the area surrounding the vehicle with a light sensor on the vehicle ([0073]).
Regarding claim 14, Kwon teaches a non-transitory computer-readable storage medium embodying programmed instructions which, when executed by a processor, are operable for performing a method ([0030]) comprising:
identifying a dynamic object with a distance sensor on the vehicle and the distance sensor includes at least one of a radar sensor or a light detection and ranging sensor ([0045-0046], the capturer includes a radar sensor);
determining if an area surrounding the vehicle includes low ambient light conditions ([0073], detect illumination around the vehicle to determine if the vehicle is being driven in night or a dark region);
determining an object class for the dynamic object ([0055], determine type of object captured); …
and illuminating the dynamic object with an illumination source from the vehicle when the detection status of the dynamic object by the optical sensor includes dynamic object not detected during the low ambient lighting conditions ([0083], “irradiate light towards” the object),
wherein the illumination source includes a head lamp system on the vehicle ([0039]).
Kwon teaches that the illumination of the dynamic object always occurs during night conditions if the object is not currently in an illumination range of its headlight ([0078-80], illustrated in Figs. 6 & 7) in order to ensure that an image capturer can accurately detect objects within its range ([0080]). It does not teach determining a detection status of the dynamic object with an optical sensor during the low ambient light conditions, and illuminating the object based on this detection status.
In the same field of endeavor, Hsu describes a method of operating a vehicular lighting system for detecting objects. Hsu teaches determining a detection status of the dynamic object with an optical sensor during the low ambient light conditions ([0034]; [0094] and [0118], detection status is determined by analyzing whether the image quality obtained is less than a threshold level), and subsequently illuminating the object based on this detection status ([0094] and [0118], illumination occurs if image quality, i.e. detection status, is below a certain threshold).
A skilled artisan would have been able to integrate the step of determining the detection status into the invention of Kwon, which would allow the vehicle to determine if an object can be accurately detected by the image capturer before it is illuminated. This avoids a situation where the vehicle would illuminate an object that is out of its headlight illumination range even though the object is still able to be detected and identified by the image capturer.
Therefore, it would have been obvious to integrate the determination of an object’s detection status and consequent possible illumination into the method of Kwon for the motivation of avoiding unnecessary illumination of objects that are otherwise already able to be detected and identified by the vehicle. This avoids the vehicle wasting power on illuminating objects that are already sufficiently illuminated, such as a pedestrian that is located under a street lamp.
The prior combination does not teach illuminating the object when the object class determined includes a predetermined object class.
In the same field of endeavor, Lindsay teaches that illuminating an object detected by a vehicle occurs when the object class determined includes a predetermined object class (Col. 12, lines 52-55 and Col. 13, lines 13-18; see Fig. 6, where only an object being determined to be of a pedestrian class leads to it being illuminated).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the prior combination to illuminate certain objects for the motivation, as taught by Lindsay, of illuminating detected pedestrians without blinding them, thus avoiding potentially hazardous conditions (Col. 14, lines 11-19). Illuminating certain objects allows the conditions of illumination to be controlled so that illumination occurs in a manner that does not blind pedestrians.
Regarding claim 16, the prior art remains as applied in claim 14. Kwon further teaches:
determining a direction of travel of the dynamic object relative to the vehicle when the detection status of the dynamic object by the optical sensor includes that the dynamic object is detected prior to illuminating the dynamic object with the illumination source ([0086-0088], the object is detected, its direction of travel is determined, the risk of collision is determined to be high, and the vehicle then irradiates light toward the corresponding object).
Regarding claim 19, Kwon teaches a vehicle comprising:
a body defining a passenger compartment ([0058]);
a plurality of wheels supporting the body ([0038]);
a plurality of sensors fixed relative to the body ([0043] and [0045]);
an adaptive head lamp system fixed relative to the body ([0039-0040], headlamps that adapt the amount and direction of lighting);
and a controller in communication with the plurality of sensors and the adaptive head lamp system ([0040] and [0042]), the controller being configured to:
identify a dynamic object with a distance sensor on the vehicle and the distance sensor includes at least one of a radar sensor or a light detection and ranging sensor ([0046]);
determine if an area surrounding the vehicle includes low ambient light conditions ([0073], detect illumination around the vehicle to determine if the vehicle is being driven in night or a dark region);
determine an object class for the dynamic object ([0055], determine type of object captured); …
and illuminate the dynamic object with an illumination source from the vehicle when the detection status of the dynamic object by the optical sensor includes dynamic object not detected during the low ambient lighting conditions ([0083], “irradiate light towards” the object),
wherein the illumination source includes a head lamp system on the vehicle ([0039]).
Kwon teaches that the illumination of the dynamic object always occurs during night conditions if the object is not currently in an illumination range of its headlight ([0078-80], illustrated in Figs. 6 & 7) in order to ensure that an image capturer can accurately detect objects within its range ([0080]). It does not teach determining a detection status of the dynamic object with an optical sensor during the low ambient light conditions, and illuminating the object based on this detection status.
In the same field of endeavor, Hsu describes a method of operating a vehicular lighting system for detecting objects. Hsu teaches determining a detection status of the dynamic object with an optical sensor during the low ambient light conditions ([0034]; [0094] and [0118], detection status is determined by analyzing whether the image quality obtained is less than a threshold level), and subsequently illuminating the object based on this detection status ([0094] and [0118], illumination occurs if image quality, i.e. detection status, is below a certain threshold).
A skilled artisan would have been able to integrate the step of determining the detection status into the invention of Kwon, which would allow the vehicle to determine if an object can be accurately detected by the image capturer before it is illuminated. This avoids a situation where the vehicle would illuminate an object that is out of its headlight illumination range even though the object is still able to be detected and identified by the image capturer.
Therefore, it would have been obvious to integrate the determination of an object’s detection status and consequent possible illumination into the method of Kwon for the motivation of avoiding unnecessary illumination of objects that are otherwise already able to be detected and identified by the vehicle. This avoids the vehicle wasting power on illuminating objects that are already sufficiently illuminated, such as a pedestrian that is located under a street lamp.
The prior combination does not teach illuminating the object when the object class determined includes a predetermined object class.
In the same field of endeavor, Lindsay teaches that illuminating an object detected by a vehicle occurs when the object class determined includes a predetermined object class (Col. 12, lines 52-55 and Col. 13, lines 13-18; see Fig. 6, where only an object being determined to be of a pedestrian class leads to it being illuminated).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the prior combination to illuminate certain objects for the motivation, as taught by Lindsay, of illuminating detected pedestrians without blinding them, thus avoiding potentially hazardous conditions (Col. 14, lines 11-19). Illuminating certain objects allows the conditions of illumination to be controlled so that illumination occurs in a manner that does not blind pedestrians.
Claims 4-5 and 17-18 are rejected under 35 U.S.C. 103 as being unpatentable over Kwon in view of Hsu and Lindsay as applied to claims 3 and 16 above, and further in view of Schofield et al. (US 20020040962 A1) and Neukam (US 20190329699 A1).
Regarding claim 4, the prior art remains as applied in claim 3. Hsu teaches determining if a detected object is an oncoming vehicle, and modifying its headlight system accordingly by turning on high beams if it is not, and turning off high beams if it is ([0063] and [0102]). This is performed by using a plurality of camera sensors to determine the illumination levels of an image ([0071-0073]). However, this analysis is for detecting any light in an image, and the combination of Kwon and Hsu does not explicitly teach that the method includes identifying an illumination status of at least one of head lamps or turn lamps on the dynamic object when the direction of travel of the dynamic object is opposing a direction of travel of the vehicle and illuminating the dynamic object if the illumination status is off or unknown.
However, in the same field of endeavor, Schofield discloses an improved method of analyzing imaging sensors to control a vehicle lighting system. The method includes identifying an illumination status of at least one of head lamps or turn lamps on the dynamic object when the direction of travel of the dynamic object is opposing a direction of travel of the vehicle ([0035-0037], white pixels corresponding to the headlights of another vehicle are detected, and if they exceed a threshold, illumination status is determined to be off, and the high beams are activated). Note that Schofield discloses that high beam headlights are understood to illuminate oncoming vehicles ([0003], driver of a vehicle turns off high beams so as to not dazzle another driver by illuminating their vehicle).
These image analysis techniques, when integrated into the image analysis of the prior combination, would allow the system to differentiate between lights corresponding to a headlight and lights corresponding to another light such as a street light. These techniques would enhance the prior combination’s detection of other vehicles, ensuring that high beam control is not incorrectly performed for non-vehicle lights. Further, it is understood that high beams, when activated, illuminate oncoming vehicles as there would be no reason to turn them off for other vehicles otherwise. Therefore, a skilled artisan would have understood that activating the high beams in the manner described in Schofield would result in illuminating the dynamic object (i.e. the oncoming vehicle) if the illumination status is off or unknown, thereby allowing the other vehicle to be detected by an occupant of the controlled vehicle.
As Schofield is analogous to the art of detecting illumination status of images for operation of a vehicle lighting system, it would have been obvious to modify the light image analysis of the prior combination with the process described in Schofield. The motivation for this, as taught by Schofield, is to better analyze light sources so that non-vehicular lights are not considered to be vehicular lights when determining whether or not to modify the output of a vehicle’s headlamp system ([0004]).
Kwon additionally teaches that illuminating the dynamic object includes projecting a pattern of light onto a road surface between the illumination source and the dynamic object (see Fig. 7, where the illumination includes the road surface), but does not teach that the pattern of light includes directional markers extending from the illumination source in a direction of the dynamic object.
In the same field of endeavor, Neukam teaches that the pattern of light used to illuminate objects includes directional markers extending from the illumination source in a direction of the dynamic object ([0045] and Figs. 3-5, where the headlights project directional stripes from the headlights of the vehicle in the direction of the detected object).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the pattern of light of Kwon with the directional marker stripes of Neukam based on a reasonable expectation of success and the motivation, as taught by Neukam, of enabling better detection and classification of objects by using a contrasting directional pattern of light in dark environments ([0007-0008]).
Regarding claim 5, the prior art remains as applied in claim 3. Hsu teaches determining if a detected object is a vehicle, and modifying its headlight system accordingly by turning on high beams if the object is not a vehicle, and turning off high beams if it is ([0063] and [0102]). This is performed by using a plurality of camera sensors to determine the illumination levels of an image ([0071-0073]). However, this analysis is for detecting any light in an image, and the combination of Kwon and Hsu does not explicitly teach that the method includes identifying an illumination status of at least one of brake lamps or turn lamps on the dynamic object when the direction of travel of the dynamic object is common with a direction of travel of the vehicle and illuminating the dynamic object if the illumination status is off or unknown.
However, in the same field of endeavor, Schofield discloses an improved method of analyzing imaging sensors to control a vehicle lighting system. The method includes identifying an illumination status of at least one of brake lamps or turn lamps on the dynamic object when the direction of travel of the dynamic object is common with a direction of travel of the vehicle ([0034] and [0036-0037], red pixels corresponding to the taillights, i.e. brake lamps, of another vehicle are detected, and if they exceed a threshold, illumination status is determined to be off, and the high beams are activated). Note that Schofield discloses that high beam headlights are understood to illuminate leading vehicles ([0003], driver of a vehicle turns off high beams so as to not dazzle another driver by illuminating their vehicle).
These image analysis techniques, when integrated into the image analysis of the prior combination, would allow the system to differentiate between lights corresponding to a taillight and lights corresponding to another light such as a street light. These techniques would enhance the prior combination’s detection of other vehicles, ensuring that high beam control is not incorrectly performed for non-vehicle lights. Further, it is understood that high beams, when activated, illuminate leading vehicles as there would be no reason to turn them off for other vehicles otherwise. Therefore, a skilled artisan would have understood that activating the high beams in the manner described in Schofield would result in illuminating the dynamic object (i.e. the leading vehicle) if the illumination status is off or unknown, thereby allowing the other vehicle to be detected by an occupant of the controlled vehicle.
As Schofield is analogous to the art of detecting illumination status of images for operation of a vehicle lighting system, it would have been obvious to modify the light image analysis of the prior combination with the process described in Schofield. The motivation for this, as taught by Schofield, is to better analyze light sources so that non-vehicular lights are not considered to be vehicular lights when determining whether or not to modify the output of a vehicle’s headlamp system ([0004]).
Kwon additionally teaches that illuminating the dynamic object includes projecting a pattern of light onto a road surface between the illumination source and the dynamic object (see Fig. 7, where the illumination includes the road surface), but does not teach that the pattern of light includes directional markers extending from the illumination source in a direction of the dynamic object.
In the same field of endeavor, Neukam teaches that the pattern of light used to illuminate objects includes directional markers extending from the illumination source in a direction of the dynamic object ([0045] and Figs. 3-5, where the headlights project directional stripes from the headlights of the vehicle in the direction of the detected object).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the pattern of light of Kwon with the directional marker stripes of Neukam based on a reasonable expectation of success and the motivation, as taught by Neukam, of enabling better detection and classification of objects by using a contrasting directional pattern of light in dark environments ([0007-0008]).
Regarding claim 17, the prior art remains as applied in claim 16. Hsu teaches determining if a detected object is an oncoming vehicle, and modifying its headlight system accordingly by turning on high beams if it is not, and turning off high beams if it is ([0063] and [0102]). This is performed by using a plurality of camera sensors to determine the illumination levels of an image ([0071-0073]). However, this analysis is for detecting any light in an image, and the combination of Kwon and Hsu does not explicitly teach that the method includes identifying an illumination status of at least one of head lamps or turn lamps on the dynamic object when the direction of travel of the dynamic object is opposing a direction of travel of the vehicle and illuminating the dynamic object if the illumination status is off or unknown.
However, in the same field of endeavor, Schofield discloses an improved method of analyzing imaging sensors to control a vehicle lighting system. The method includes identifying an illumination status of at least one of head lamps or turn lamps on the dynamic object when the direction of travel of the dynamic object is opposing a direction of travel of the vehicle ([0035-0037], white pixels corresponding to the headlights of another vehicle are detected, and if they exceed a threshold, illumination status is determined to be off, and the high beams are activated). Note that Schofield discloses that high beam headlights are understood to illuminate oncoming vehicles ([0003], driver of a vehicle turns off high beams so as to not dazzle another driver by illuminating their vehicle).
These image analysis techniques, when integrated into the image analysis of the prior combination, would allow the system to differentiate between lights corresponding to a headlight and lights corresponding to another light such as a street light. These techniques would enhance the prior combination’s detection of other vehicles, ensuring that high beam control is not incorrectly performed for non-vehicle lights. Further, it is understood that high beams, when activated, illuminate oncoming vehicles as there would be no reason to turn them off for other vehicles otherwise. Therefore, a skilled artisan would have understood that activating the high beams in the manner described in Schofield would result in illuminating the dynamic object (i.e. the oncoming vehicle) if the illumination status is off or unknown, thereby allowing the other vehicle to be detected by an occupant of the controlled vehicle.
As Schofield is analogous to the art of detecting illumination status of images for operation of a vehicle lighting system, it would have been obvious to modify the light image analysis of the prior combination with the process described in Schofield. The motivation for this, as taught by Schofield, is to better analyze light sources so that non-vehicular lights are not considered to be vehicular lights when determining whether or not to modify the output of a vehicle’s headlamp system ([0004]).
Kwon additionally teaches that illuminating the dynamic object includes projecting a pattern of light onto a road surface between the illumination source and the dynamic object (see Fig. 7, where the illumination includes the road surface), but does not teach that the pattern of light includes directional markers extending from the illumination source in a direction of the dynamic object.
In the same field of endeavor, Neukam teaches that the pattern of light used to illuminate objects includes directional markers extending from the illumination source in a direction of the dynamic object ([0045] and Figs. 3-5, where the headlights project directional stripes from the headlights of the vehicle in the direction of the detected object).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the pattern of light of Kwon with the directional marker stripes of Neukam based on a reasonable expectation of success and the motivation, as taught by Neukam, of enabling better detection and classification of objects by using a contrasting directional pattern of light in dark environments ([0007-0008]).
Regarding claim 18, the prior art remains as applied in claim 16. Hsu teaches determining if a detected object is a vehicle, and modifying its headlight system accordingly by turning on high beams if the object is not a vehicle, and turning off high beams if it is ([0063] and [0102]). This is performed by using a plurality of camera sensors to determine the illumination levels of an image ([0071-0073]). However, this analysis is for detecting any light in an image, and the combination of Kwon and Hsu does not explicitly teach that the method includes identifying an illumination status of at least one of brake lamps or turn lamps on the dynamic object when the direction of travel of the dynamic object is common with a direction of travel of the vehicle and illuminating the dynamic object if the illumination status is off or unknown.
However, in the same field of endeavor, Schofield discloses an improved method of analyzing imaging sensors to control a vehicle lighting system. The method includes identifying an illumination status of at least one of brake lamps or turn lamps on the dynamic object when the direction of travel of the dynamic object is common with a direction of travel of the vehicle ([0034] and [0036-0037], red pixels corresponding to the taillights, i.e. brake lamps, of another vehicle are detected, and if they exceed a threshold, illumination status is determined to be off, and the high beams are activated). Note that Schofield discloses that high beam headlights are understood to illuminate leading vehicles ([0003], driver of a vehicle turns off high beams so as to not dazzle another driver by illuminating their vehicle).
These image analysis techniques, when integrated into the image analysis of the prior combination, would allow the system to differentiate between lights corresponding to a taillight, and lights corresponding to another light such as a street light. These techniques would enhance the prior combination’s detection of other vehicles, ensuring that high beam control is not incorrectly performed for non-vehicle lights. Further, it is understood that high beams, when activated, illuminate leading vehicles as there would be no reason to turn them off for other vehicles otherwise. Therefore, a skilled artisan would have understood that activating the high beams in the manner described in Schofield would result in illuminating the dynamic object (i.e. the leading vehicle) if the illumination status is off or unknown, thereby allowing the other vehicle to be detected by an occupant of the controlled vehicle.
As Schofield is analogous to the art of detecting illumination status of images for operation of a vehicle lighting system, it would have been obvious to modify the light image analysis of the prior combination with the process described in Schofield. The motivation for this, as taught by Schofield, is to better analyze light sources so that non-vehicular lights are not considered to be vehicular lights when determining whether or not to modify the output of a vehicle’s headlamp system ([0004]).
Kwon additionally teaches that illuminating the dynamic object includes projecting a pattern of light onto a road surface between the illumination source and the dynamic object (see Fig. 7, where the illumination includes the road surface), but does not teach that the pattern of light includes directional markers extending from the illumination source in a direction of the dynamic object.
In the same field of endeavor, Neukam teaches that the pattern of light used to illuminate objects includes directional markers extending from the illumination source in a direction of the dynamic object ([0045] and Figs. 3-5, where the headlights project directional stripes from the headlights of the vehicle in the direction of the detected object).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the pattern of light of Kwon with the directional marker stripes of Neukam based on a reasonable expectation of success and the motivation, as taught by Neukam, of enabling better detection and classification of objects by using a contrasting directional pattern of light in dark environments ([0007-0008]).
Claims 7-8, 10, 13, and 21-24 are rejected under 35 U.S.C. 103 as being unpatentable over Kwon in view of Hsu and Lindsay as applied to claims 1 and 14 above, and further in view of Neukam (US 20190329699 A1).
Regarding claim 7, the prior art remains as applied in claim 1. Lindsay teaches:
determining a vertical height of the dynamic object relative to a road surface with the distance sensor (Col. 12, lines 55-58),
and wherein illuminating the dynamic object with the illumination source includes illuminating the dynamic object for a predetermined vertical distance from a road surface (Col. 14, lines 11-19, the coordinates determined must include the height of the pedestrian, as the vehicle would not be able to illuminate the body of a pedestrian and not their face if only the pedestrian’s two-dimensional coordinates were known; the vertical distance is predetermined in that illumination extends no further than the face of the pedestrian).
The prior combination does not teach that the illumination is done by projecting a pattern of light including directional markers onto the dynamic object.
In the same field of endeavor, Neukam teaches that the illumination of objects detected by a vehicle is performed by projecting a pattern of light including directional markers onto the dynamic object ([0045] and Figs. 3-5, where the headlights project directional stripes from the headlights of the vehicle in the direction of the detected object).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the pattern of light of Kwon with the directional marker stripes of Neukam based on a reasonable expectation of success and the motivation, as taught by Neukam, of enabling better detection and classification of objects by using a contrasting directional pattern of light in dark environments ([0007-0008]).
Regarding claim 8, the prior art remains as applied in claim 1. Hsu teaches:
wherein illuminating the dynamic object includes projecting a pattern of light onto a road surface between the illumination source and the dynamic object ([0118] and Fig. 7).
The prior combination does not teach that the pattern of light includes directional markers extending from the illumination source in a direction of the dynamic object.
In the same field of endeavor, Neukam teaches that the illumination of objects detected by a vehicle is performed by projecting a pattern of light that includes directional markers extending from the illumination source in a direction of the dynamic object ([0045] and Figs. 3-5, where the headlights project directional stripes from the headlights of the vehicle in the direction of the detected object).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the pattern of light of Kwon with the directional marker stripes of Neukam based on a reasonable expectation of success and the motivation, as taught by Neukam, of enabling better detection and classification of objects by using a contrasting directional pattern of light in dark environments ([0007-0008]).
Regarding claim 10, the prior art remains as applied in claim 8. Kwon teaches:
wherein illuminating the dynamic object with the illumination source includes tracking movement of the dynamic object relative to the vehicle with the pattern of light projected onto the road surface by the illumination source ([0088] and Fig. 8, tracking movement of object for collision risk assessment).
Regarding claim 13, the prior art remains as applied in claim 8. Neukam teaches:
wherein the head lamp system includes an adaptive head lamp system having a light emitting diode projector type head lamp having a plurality of light emitting diodes ([0021], light sources are light-emitting diodes (LED)),
and configured to selectively control illumination of individual diodes of the plurality of light emitting diodes to project the pattern of light on the road surface ([0021-0022], LEDs selectively controlled to produce first and second light beam bundle patterns).
Regarding claim 21, the prior art remains as applied in claim 1. Kwon teaches:
projecting a pattern of light onto a road surface between the illumination source and the dynamic object (see Fig. 7, where the light is projected onto the road between the vehicle and the dynamic object).
Kwon does not teach wherein the head lamp system includes a physical shutter and illuminating the dynamic object includes projecting a pattern of light utilizing the shutter, and the pattern of light includes directional markers extending from the illumination source in a direction of the dynamic object.
In the same field of endeavor, Neukam teaches:
wherein the head lamp system includes a physical shutter and illuminating the dynamic object includes projecting a pattern of light utilizing the shutter ([0040] and Figs. 1-2, where the light modulation unit 12 is functionally equivalent to the claimed shutter, as it controls the illumination from a light source),
and the pattern of light includes directional markers extending from the illumination source in a direction of the dynamic object ([0045] and Figs. 3-5, where the headlights project directional stripes from the headlights of the vehicle in the direction of the detected object).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the pattern of light of Kwon with the directional marker stripes of Neukam based on a reasonable expectation of success and the motivation, as taught by Neukam, of enabling better detection and classification of objects by using a contrasting directional pattern of light in dark environments ([0007-0008]).
Regarding claim 22, the prior art remains as applied in claim 8. Neukam teaches:
wherein the directional markers includes at least one of arrows projected onto the road surface from the illumination source to the dynamic object or a line projected on the road surface from the illumination source to the dynamic object ([0045] and Figs. 3-5, where the light pattern includes directional stripes/lines, i.e. directional markers, projected on the road surface from the illumination source to the dynamic object).
Regarding claim 23, the prior art remains as applied in claim 10. Kwon teaches that tracking the movement of the dynamic object relative to the vehicle includes tracking the dynamic object in a forward direction based on the direction of travel of the vehicle ([0086-0088] and see Fig. 8). Kwon does not disclose an explicit angular range in this direction in which objects are analyzed, and does not teach that objects are detected within an angular range from zero degrees aligning with a heading of the vehicle up to 75 degrees off the heading.
However, one of ordinary skill in the art would have recognized that objects within a 75 degree range from the heading direction of the vehicle present a collision risk based on the forward traveling direction of the vehicle. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to track objects within an angular range from zero degrees aligning with a heading of the vehicle up to 75 degrees off the heading, based on a reasonable expectation of success and the motivation to ensure that objects that may potentially collide with the ego vehicle are properly analyzed and have their risk of collision predicted as part of the calculations already performed by Kwon.
Regarding claim 24, the prior art remains as applied in claim 14. Kwon teaches:
illuminating the dynamic object includes projecting a pattern of light onto a road surface between the illumination source and the dynamic object (Figs 6 and 7, the light is projected onto the road surface between the illumination source and the dynamic object);
and illuminating the dynamic object with the illumination source includes tracking movement of the dynamic object relative to the vehicle with the pattern of light projected onto the road surface by the illumination source ([0086-0088] and Fig. 8, where pedestrians are tracked so as to determine a risk of collision).
Kwon does not teach that the pattern of light includes directional markers extending from the illumination source in a direction of the dynamic object.
In the same field of endeavor, Neukam teaches that:
the pattern of light includes directional markers extending from the illumination source in a direction of the dynamic object ([0045] and Figs. 3-5, where the headlights project directional stripes from the headlights of the vehicle in the direction of the detected object).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the pattern of light of Kwon with the directional marker stripes of Neukam based on a reasonable expectation of success and the motivation, as taught by Neukam, of enabling better detection and classification of objects by using a contrasting directional pattern of light in dark environments ([0007-0008]).
Regarding claim 11, the prior art remains as applied in claim 1. Lindsay teaches:
illuminating the dynamic object with the illumination source from the vehicle if the detection status with the optical sensor includes that the dynamic object is detected (Col. 12, lines 52-55 and Col. 13, lines 13-18; see Fig. 6, where only an object being determined to be of a pedestrian class leads to it being illuminated).
Although Lindsay teaches illuminating pedestrians in this manner, it does not explicitly teach that this illumination is also performed when the object class determined for the dynamic object is a bicycle.
In the same field of endeavor, Hahn teaches illuminating objects with a vehicle headlight system, wherein:
illumination specifically occurs when the object class determined for the dynamic object is a bicycle ([0037], cyclists are specifically illuminated so as to not dazzle/blind them).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the prior combination so that cyclists are illuminated as well as pedestrians, based on a reasonable expectation of success and the motivation, as taught by Hahn, to allow for specific illumination control so as to avoid dazzling oncoming cyclists ([0037]). Avoiding the dazzling effect on cyclists confers the same advantages as avoiding the dazzling effect on pedestrians.
Response to Arguments
Applicant's arguments filed 12/05/2025 have been fully considered.
Regarding independent claims 1, 14, and 19, applicant argues that the amended limitation "identifying a dynamic object with a distance sensor on a vehicle and the distance sensor includes at least one of a radar sensor or a light detection and ranging sensor" is not taught by Kwon, contending that “The rejection identifies the capturer 350 from Kwon as the claimed sensor. However, the capturer 350 is optical-based (e.g., imaging device, camera, video camera, etc. - Kwon [0041]).” This argument is unpersuasive. While the capturer of Kwon does comprise optical-based sensors, it further includes a distance sensor in the form of a radar sensor ([0045]). Therefore, Kwon remains applied as teaching the amended limitation as argued, and a new ground of rejection is set forth above as necessitated by the other amendments to the claims.
Conclusion
The following prior art made of record and not relied upon is considered pertinent to applicant’s disclosure:
Bush et al. (US 20210018928 A1)
Biswal et al. (US 10183614 B1)
Stam et al. (US 20040143380 A1)
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JACK R. BREWER whose telephone number is (571)272-4455. The examiner can normally be reached 9AM-6PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Angela Ortiz can be reached at 571-272-1206. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
JACK R. BREWER
Examiner
Art Unit 3663
/ADAM D TISSOT/Primary Examiner, Art Unit 3663