DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-3, 5-10, 13-15, and 17-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Glatfelter et al., US 2019/0074382.
Regarding claim 1, Glatfelter discloses a system (fig. 3, para 0025; an anti-glare system 150 for reducing glare) comprising a computing device (fig. 3; para 0025; controller 310), the computing device including a processor and a memory (figs. 1 and 3; para 0020 and 0025; processor 152 and memory 154), the memory storing instructions executable by the processor including instructions to:
determine a first illumination value based on an image of a space about a vehicle (figs. 2, 4, 6, and 8; para 0022, 0029, 0035, and 0037; In step 202, the camera(s) 110 capture image data of an environment in front of the vehicle 100. That is, one or more of the cameras 110 may obtain an image of a field of view that is in front of the windshield 101. In step 204, one or more of the optical sensors 121-128 monitor light intensity for an area (e.g., one of the zones 111-118) of the windshield 101 (the Examiner broadly interprets a first illumination value as a monitored light intensity before glare is detected));
upon the first illumination value exceeding a threshold, determine second illumination values for respective zones of the image, the zones being defined based on a vehicle component (figs. 2, 4, 6, and 8; para 0022-0023, 0029-0030, 0035, and 0037; In step 206, the processor 152 of the anti-glare system 150 detects glare (the Examiner broadly interprets the first illumination value exceeding a threshold as the detected glare) impinging on the area of the windshield 101. The processor 152 may obtain sensor data from one or more optical sensors 121-128 and/or image data from one or more cameras 110 to detect the glare. In step 208, the processor 152 analyzes the image data captured for the area of the windshield 101. In step 210, the processor 152 determines a location of the glare in the area based on the image data. In step 212, the processor 152 determines a characteristic of the environment for the area based on the image data. For example, the processor 152 may process the image data to detect a dominant color, a luminance or brightness value, and/or an object in the environment that is within the field of view of the camera 110 (the Examiner broadly interprets a second illumination value as the detected dominant color and luminance or brightness value of the glare). The processor 152 may activate and/or direct the camera 110 to capture the image data in response to detecting the glare via the sensor(s) 121-128. For instance, the processor 152 may instruct the camera 110 to obtain image data for a particular one of the zones 111-118 of the windshield 101 for which glare has been detected); and
actuate the vehicle component based on a comparison of at least some of the second illumination values of respective zones to one another (figs. 2, 4, 7, and 9; para 0024, 0029-0030, 0036, and 0038; In step 214, the processor 152 generates a counteracting image based on the characteristic of the environment for the area. And, in step 216, the processor 152 directs one or more of the projectors 140-141 to display the counteracting image on the location of the windshield to reduce the glare; As such, the processor 152 may categorize light intensity (e.g., glare) by different areas, zones, or regions of the windshield 101; In step 404, the processor 152 analyzes the image data 331 to determine a color that is dominant in the environment for the area surrounding the glare. In step 406, the processor 152 generates a counteracting image based on the size and the shape of the glare and the color that is around the glare for the area. The processor 152 may therefore determine which zones are to be activated, detect and categorize the dominant color in each active zone, and detect the position of the glare within a zone).
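The claim-1 mapping above traces a per-zone flow: monitor light intensity, detect glare once a threshold is exceeded, compare per-zone illumination values to one another, and act on the selected zones. Purely as an illustrative aid (not part of the record), that flow can be sketched as follows; all names, units, and the threshold value are hypothetical:

```python
# Illustrative sketch only: a minimal model of the zone-based flow mapped
# from Glatfelter's steps 202-216. All names, units, and the threshold
# value here are hypothetical.
from dataclasses import dataclass

GLARE_THRESHOLD = 75.0  # assumed sensor-intensity threshold (arbitrary units)

@dataclass
class Zone:
    zone_id: int        # e.g., one of the windshield zones 111-118
    intensity: float    # monitored light intensity ("first illumination value")

def zones_with_glare(zones):
    """Step 206 analogue: flag zones whose intensity exceeds the threshold."""
    return [z for z in zones if z.intensity > GLARE_THRESHOLD]

def zones_to_actuate(zones):
    """Steps 208-216 analogue: compare per-zone ("second") illumination
    values to one another and select zones to act on. The comparison is
    modeled as keeping glare-affected zones at or above the mean
    intensity of the affected group."""
    glared = zones_with_glare(zones)
    if not glared:
        return []
    mean = sum(z.intensity for z in glared) / len(glared)
    return [z.zone_id for z in glared if z.intensity >= mean]
```

For example, with zones 112, 113, and 116 reading 90, 100, and 80 while zone 111 reads 10, only zones 112 and 113 (at or above the affected-group mean of 90) would be selected for actuation.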
Regarding claim 2, the system of claim 1, Glatfelter further discloses the instructions including further instructions to determine the first and second illumination values after compensating for sensor saturation (para 0018, 0022, and 0025; a camera or sensor indicating changes in light intensity is considered to compensate for sensor saturation).
Regarding claim 3, the system of claim 1, Glatfelter further discloses wherein the image is of the space through a window outside of the vehicle (figs. 5-9; para 0017 and 0022).
Regarding claim 5, the system of claim 1, Glatfelter further discloses wherein the vehicle component is a vehicle display (para 0018).
Regarding claim 6, the system of claim 1, Glatfelter further discloses wherein the zones are further defined based on a resolution of a sensor (figs. 6-9; para 0037-0038; The anti-glare system 150 receives a signal from one of the optical sensors corresponding to regions of the windshield 101 having the glare 850 (e.g., the optical sensors 122-123 and 126-127 corresponding with zones 112-113 and 116-117)).
Regarding claim 7, the system of claim 1, Glatfelter further discloses wherein the first and second illumination values are irradiance per unit area measured in lux (para 0022-0023; light intensity and brightness are commonly measured in lux).
Regarding claim 8, the system of claim 1, Glatfelter further discloses the instructions including further instructions to collect a plurality of images of the space over a specified time (para 0023-0025).
Regarding claim 9, the system of claim 8, Glatfelter further discloses wherein the second illumination values are weighted by respective acquisition times of the images (para 0023-0025).
Regarding claim 10, the system of claim 1, Glatfelter further discloses wherein some of the zones overlap (figs. 6-9; para 0037-0038).
Regarding claim 13, this claim recites substantially the same limitations as claim 1 above and is rejected for the same reasons.
Regarding claim 14, this claim recites substantially the same limitations as claim 2 above and is rejected for the same reasons.
Regarding claim 15, this claim recites substantially the same limitations as claim 3 above and is rejected for the same reasons.
Regarding claim 17, this claim recites substantially the same limitations as claim 5 above and is rejected for the same reasons.
Regarding claim 18, this claim recites substantially the same limitations as claim 7 above and is rejected for the same reasons.
Regarding claim 19, this claim recites substantially the same limitations as claim 8 above and is rejected for the same reasons.
Regarding claim 20, this claim recites substantially the same limitations as claim 9 above and is rejected for the same reasons.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 4, 11-12, and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Glatfelter et al., US 2019/0074382, in view of Hikida et al., US 2023/0242079.
Regarding claim 4, the system of claim 1, Glatfelter does not explicitly disclose wherein the vehicle component is a climate control system as claimed.
Hikida discloses that the air conditioner ECU 600 controls the air conditioner of the vehicle according to the control contents of the sun-light signal input from the sun-light calculation unit 157 (para 0088-0089).
Therefore, taking the combined disclosures of Glatfelter and Hikida as a whole, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate into the invention of Glatfelter the air conditioner ECU 600 of Hikida, which controls the air conditioner of the vehicle according to the control contents of the sun-light signal input from the sun-light calculation unit 157, for the benefit of controlling the air conditioner of the vehicle (Hikida: para 0089).
Regarding claim 11, the system of claim 1, Glatfelter further discloses the instructions including further instructions to compare the second illumination values by machine learning circuitry (para 0027).
Glatfelter discloses claim 11 as enumerated above, but does not explicitly disclose a neural network as claimed.
Hikida discloses a DNN (Deep Neural Network) (para 0078).
Therefore, taking the combined disclosures of Glatfelter and Hikida as a whole, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate a DNN (Deep Neural Network) as taught by Hikida into the invention of Glatfelter for the benefit of recognizing raindrops and dirt included in the windshield image (Hikida: para 0078).
Regarding claim 12, the system of claim 11, Hikida in the combination further discloses wherein the neural network compensates for at least one of saturation, gain, and exposure time (para 0084 and 0155).
Regarding claim 16, this claim recites substantially the same limitations that are performed by claim 4 above, and it is rejected for the same reasons.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Tryndin et al., US 2023/0186593, discloses that contrast values corresponding to pixels of one or more images generated using one or more sensors of a vehicle may be computed to detect and identify objects that trigger glare-mitigating operations.
Elooz et al., US 2018/0120441 discloses systems and methods that use LIDAR technology to detect objects in the surrounding environment.
Cantin et al., US 7,855,376 discloses lighting systems and more particularly to object-detection using visible light emitted by a lighting system.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to VAN D HUYNH whose telephone number is (571) 270-1937. The examiner can normally be reached 8 AM to 6 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Stephen R Koziol can be reached at (408) 918-7630. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/VAN D HUYNH/Primary Examiner, Art Unit 2665