Prosecution Insights
Last updated: April 19, 2026
Application No. 18/499,362

VEHICLE GLASS CONTAMINATION ASSESSMENT FOR OPTIMIZED AUTO-ACTIVATION OF CLEANING SYSTEM

Non-Final Office Action (§103, §112)

Filed: Nov 01, 2023
Examiner: BERGNER, ERIN FLANAGAN
Art Unit: 1713
Tech Center: 1700 — Chemical & Materials Engineering
Assignee: GM Global Technology Operations LLC
OA Round: 1 (Non-Final)

Grant Probability: 77% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 8m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 77% (491 granted / 640 resolved; +11.7% vs TC avg, above average)
Interview Lift: +31.3% (strong; based on resolved cases with interview)
Typical Timeline: 2y 8m average prosecution; 32 applications currently pending
Career History: 672 total applications across all art units

Statute-Specific Performance

§101: 4.4% (-35.6% vs TC avg)
§103: 48.9% (+8.9% vs TC avg)
§102: 18.8% (-21.2% vs TC avg)
§112: 22.0% (-18.0% vs TC avg)
Tech Center averages are estimates, based on career data from 640 resolved cases.

Office Action (§103, §112)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Election/Restrictions

Claims 1-7 are withdrawn from further consideration pursuant to 37 CFR 1.142(b) as being drawn to a nonelected method, there being no allowable generic or linking claim. Election was made without traverse in the reply filed on 10-21-25. Applicant’s election without traverse of claims 8-20 in the reply filed on 10-21-25 is acknowledged.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 15-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 15 recites the limitations “the surface” and “the contaminant” in line 2. There is insufficient antecedent basis for these limitations in the claim. The remaining claims are rejected as being dependent on an indefinite claim.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim(s) 8-9 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Bacchus et al. US 2019/0106085 (US’085) in view of Baldovino et al. US 2019/0337489 (US’489).

Regarding claim 8, US’085 teaches a system for cleaning a contaminant from a surface of a vehicle, comprising: a camera for obtaining an image of the surface, the surface including the contaminant (cleaning a sensor lens cover for an optical vehicle sensor, abstract, para. 1-2); a plurality of cleaning devices for cleaning the contaminant from the surface (one of one or more cleaning systems to activate a respective cleaning modality for the sensor lens cover; the cleaning modality provided by the selected cleaning system comprises one of the following group of cleaning modalities: pressurized air, pressurized fluid, mechanical wiping action, centrifugal force and ultrasonic vibration, para. 13 and 20-22); and a processor configured to: determine a contamination measure from the image, the contamination measure indicative of a contamination level of the surface from the image; determine a contaminated region and a contaminant type from the image (the processor determines presence of a contaminant on at least some of the cells of the sensor lens cover, a location of the contaminant on the sensor lens cover and a contaminant type using the sensor information and the contaminant information, para. 13-22); select a cleaning approach for cleaning the surface based on the contamination measure and the contaminant type, the cleaning approach including selecting a cleaning device from the plurality of cleaning devices and selecting a cleaning duration (the processor activates a selected cleaning system depending upon contamination presence and the contaminant type and determines whether the contaminant has been removed after the completion of the cleaning modality; depending upon the amount or location of the contaminant, the decision function 204 also provides an intensity and duration recommendation 304 for the cleaning operation to remove the contaminant, para. 22 and 37-42); and control the cleaning device using the cleaning approach (the sensor lens cover is cleaned, para. 37-42).

US’085 does not teach selecting a cleaning direction and selecting the cleaning approach based on the contaminated region.

US’489 teaches a processor programmed to receive data indicating a contaminant on a sensor window of a vehicle (abstract). The processor 14 is programmed to receive data indicating a contaminant on a sensor window 18 of the vehicle 12; determine a combination of an air pressure setting of an air source 20 and a liquid pressure setting of a liquid source 22 to remove the contaminant from the sensor window 18; and set the air source 20 to the air pressure setting and the liquid source 22 to the liquid pressure setting. The computer 10 applies the combination of air pressure settings for the air source 20 and liquid pressure settings for the liquid source 22 to reach a desired zone 54 of the sensor window 18 and remove the contaminant in the zone 54 (para. 16-46, fig. 5-8). Therefore, US’489 teaches maximizing the cleaning effect by selecting a cleaning direction and selecting the cleaning approach based on the contaminated region by selecting the zone and operating conditions. The vehicle sensor cleaning of US’085 can be combined with the vehicle sensor cleaning of US’489 to maximize the cleaning effect in the particular contaminated zone.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the apparatus of US’085 to include selecting a cleaning direction and selecting the cleaning approach based on the contaminated region because US’489 teaches it maximizes the cleaning effect in the particular contaminated zone, and use of a known technique to improve similar methods in the same way is obvious, see MPEP 2141 III (C).

Regarding claim 9, the modified apparatus of US’085 teaches the system of claim 8. The modified apparatus of US’085 further teaches wherein the processor is further configured to select the cleaning device and the duration using a velocity of the vehicle (the decision function 204 can be integrated into the processor 202 as indicated at 203; decision function 204 receives various sensor information inputs from vehicle sensors 206 throughout the vehicle, non-limiting examples of such sensors including speed sensors, para. 36, 42 of US’085) and the orientation using a velocity of the vehicle (where the vehicle 12 is in motion, the processor 14 may consider the pressure exerted on the sensor window 18 based on the speed of the vehicle 12 and change the fluid path, para. 41-77, see fig. 7a-c of US’489).

Regarding claim 11, the modified apparatus of US’085 teaches the system of claim 8.
US’085 further teaches wherein the processor is further configured to determine the contaminant type and the contamination level from one of: (ii) a plurality of temporally spaced images when the vehicle is in motion (by comparing successive video frames, it can be determined that the dirt particle 602N is present N frames later, as shown by video frame 600N continuing to detect the presence of the dirt particle 602N, para. 41, fig. 6a-e).

Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over US’085 in view of US’489 as applied to claim 8 above, and further in view of Tariq et al. US 2021/0201464 (US’464).

Regarding claim 10, the modified apparatus of US’085 teaches the system of claim 8. The modified system of US’085 does not teach wherein the processor is further configured to determine the contamination level based on an average size of the contaminant and a dispersion of the contaminant over the surface.

US’464 teaches a sensor degradation detection and remediation system that includes one or more sensors configured to collect image data from an environment (abstract). Computing device(s) 140 may implement one or more processes to generate synthetic degradation data based on a predetermined set of characteristics associated with different types or sources of degradations. Because the size, density, and distribution of raindrops that may accumulate on a vehicle sensor 104 during a rain shower are not random, computing device(s) 140 may implement processes using preprogrammed raindrop sizes, shapes, and distribution patterns configured to model different types of rainstorms. For instance, a first pattern for misting rain may be stored, a second for heavy rain, a third for heavy rain in high traffic, a fourth for heavy rain at a slow speed, a fifth for heavy rain at a high speed, and so on. In some implementations, the computing device(s) 140 may execute a synthetic data generation process for a particular type/pattern of raindrops, using a set of preconfigured raindrop parameters for droplet birth rate, average size, size variation, death size, and/or droplet longevity. Using the set of preprogrammed parameters, along with randomization functions and distributions applied to the parameters, the synthetic degradation data generation process may generate a synthetic image of a raindrop pattern corresponding to a specific type and/or severity of rainstorm. Referring again to 518 and 618, when the operations of the autonomous vehicle 102 are controlled in response to the detection of the degradation in the image data 106, one or a combination of remediation operations may be performed. In some cases, an automated cleaning process may be initiated to clean a surface of the sensor 104 (e.g., a camera lens) from which the degraded image data 106 was captured (para. 41 and 68-79). Therefore, US’464 teaches it is well known in the art to perform image degradation analysis of vehicle vision sensors that includes the average size and distribution of contaminants for the purposes of determining when to perform cleaning.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the modified apparatus of US’085 to include wherein the processor is further configured to determine the contamination level based on an average size of the contaminant and a dispersion of the contaminant over the surface because US’464 teaches it is well known in the art to perform image degradation analysis of vehicle vision sensors that includes the average size and distribution of contaminants for the purposes of determining when to perform cleaning, and combining prior art elements according to known methods to yield predictable results is obvious, see MPEP 2141 III (A).

Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over US’085 in view of US’489 as applied to claim 8 above, and further in view of Herman et al. US 2020/0094784 (US’784).

Regarding claim 12, the modified apparatus of US’085 teaches the system of claim 8. The modified system of US’085 does not teach wherein the processor is further configured to determine the contaminated region using semantic segmentation of the image.

US’784 teaches a system that includes a processor and a memory. The memory stores instructions executable by the processor to detect an occlusion on a surface in a vehicle sensor optical path based on a segmentation of sensor image data, and to select a cleaning plan for the surface based on the detected occlusion, map data, and vehicle route data (abstract). An occluded area (or occlusion) on a surface in a vehicle sensor optical path may be detected based on a semantic segmentation of sensor image data. A cleaning plan for the surface may be selected based on the detected occlusion, map data, and vehicle route data (para. 26-35 and 52-57, see fig. 5-6). Therefore, US’784 teaches it is well known in the art to detect contaminated regions in the viewing area of a sensor using semantic segmentation of the image.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the modified apparatus of US’085 to include wherein the processor is further configured to determine the contaminated region using semantic segmentation of the image because US’784 teaches it is well known in the art to detect contaminated regions in the viewing area of a sensor using semantic segmentation of the image, and combining prior art elements according to known methods to yield predictable results is obvious, see MPEP 2141 III (A).

Claim(s) 13-14 are rejected under 35 U.S.C. 103 as being unpatentable over US’085 in view of US’489 as applied to claim 8 above, and further in view of Du et al. US 2020/0198587 (US’586).
Regarding claims 13-14, the modified apparatus of US’085 teaches the system of claim 8. The modified system of US’085 does not teach wherein the processor is further configured to operate one of a predictive model and a machine learning model to determine the contaminant type and the contamination level based on the image, with regard to claim 13, and wherein the processor is further configured to compare the image of the surface to a contamination model of the vehicle, with regard to claim 14.

US’586 teaches a method of monitoring a windshield of a vehicle including: capturing an image of a windshield of a vehicle; determining that debris is located on the windshield in response to the image of the windshield; determining a location of the debris on the windshield; determining to remove the debris from the windshield; and actuating a drive motor of a wiper system to clean the location, the drive motor being operably connected to a wiper arm having a wiper blade (abstract). The camera 39 may capture images of the debris 15 on the windshield 16 and transmit the images to the windshield monitoring system 12. The windshield monitoring system 12 uses image processing (e.g., machine learning) to analyze the images to determine the location of the debris 15 on the windshield and the degree of accumulation of the debris 15 on the windshield 16. The image processing may include training a software model, such as, for example, a CNN/Fast R-CNN model or a similar model, to detect different types of debris 15 and accumulation of debris 15. Image processing may include comparing the images captured by the camera 39 to stock images of a clean windshield 16. The windshield monitoring system 12 may also determine the type of debris 15 through image processing, such as, for example, a CNN/Fast R-CNN model that uses a plurality of stock images of different types of debris 15 located on the windshield 16 to train the model (para. 44-47). Therefore, US’586 teaches that it is well known in the art to use machine learning and image comparisons to analyze contaminant type and level for a vehicle monitoring system.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the modified apparatus of US’085 to include wherein the processor is further configured to operate a machine learning model to determine the contaminant type and the contamination level based on the image, with regard to claim 13, and wherein the processor is further configured to compare the image of the surface to a contamination model of the vehicle, with regard to claim 14, because US’586 teaches it is well known in the art to use machine learning and image comparisons to analyze contaminant type and level for a vehicle monitoring system, and combining prior art elements according to known methods to yield predictable results is obvious, see MPEP 2141 III (A).

Claim(s) 15-16 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Bacchus et al. US 2019/0106085 (US’085) in view of Baldovino et al. US 2019/0337489 (US’489).

Regarding claim 15, US’085 teaches a vehicle, comprising: a camera for obtaining an image of the surface, the surface including the contaminant (cleaning a sensor lens cover for an optical vehicle sensor, abstract, para. 1-2); a plurality of cleaning devices for cleaning the contaminant from the surface (one of one or more cleaning systems to activate a respective cleaning modality for the sensor lens cover; the cleaning modality provided by the selected cleaning system comprises one of the following group of cleaning modalities: pressurized air, pressurized fluid, mechanical wiping action, centrifugal force and ultrasonic vibration, para. 13 and 20-22); and a processor configured to: determine a contamination measure from the image, the contamination measure indicative of a contamination level of the surface from the image; determine a contaminated region and a contaminant type from the image (the processor determines presence of a contaminant on at least some of the cells of the sensor lens cover, a location of the contaminant on the sensor lens cover and a contaminant type using the sensor information and the contaminant information, para. 13-22); select a cleaning approach for cleaning the surface based on the contamination measure and the contaminant type, the cleaning approach including selecting a cleaning device from the plurality of cleaning devices and selecting a cleaning duration (the processor activates a selected cleaning system depending upon contamination presence and the contaminant type and determines whether the contaminant has been removed after the completion of the cleaning modality; depending upon the amount or location of the contaminant, the decision function 204 also provides an intensity and duration recommendation 304 for the cleaning operation to remove the contaminant, para. 22 and 37-42); and control the cleaning device using the cleaning approach (the sensor lens cover is cleaned, para. 37-42).

US’085 does not teach selecting a cleaning direction and selecting the cleaning approach based on the contaminated region.

US’489 teaches a processor programmed to receive data indicating a contaminant on a sensor window of a vehicle (abstract). The processor 14 is programmed to receive data indicating a contaminant on a sensor window 18 of the vehicle 12; determine a combination of an air pressure setting of an air source 20 and a liquid pressure setting of a liquid source 22 to remove the contaminant from the sensor window 18; and set the air source 20 to the air pressure setting and the liquid source 22 to the liquid pressure setting. The computer 10 applies the combination of air pressure settings for the air source 20 and liquid pressure settings for the liquid source 22 to reach a desired zone 54 of the sensor window 18 and remove the contaminant in the zone 54 (para. 16-46, fig. 5-8). Therefore, US’489 teaches maximizing the cleaning effect by selecting a cleaning direction and selecting the cleaning approach based on the contaminated region by selecting the zone and operating conditions. The vehicle sensor cleaning of US’085 can be combined with the vehicle sensor cleaning of US’489 to maximize the cleaning effect in the particular contaminated zone.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the apparatus of US’085 to include selecting a cleaning direction and selecting the cleaning approach based on the contaminated region because US’489 teaches it maximizes the cleaning effect in the particular contaminated zone, and use of a known technique to improve similar methods in the same way is obvious, see MPEP 2141 III (C).

Regarding claim 16, the modified apparatus of US’085 teaches the vehicle of claim 15. The modified apparatus of US’085 further teaches wherein the processor is further configured to select the cleaning device and the duration using a velocity of the vehicle (the decision function 204 can be integrated into the processor 202 as indicated at 203.
Decision function 204 receives various sensor information inputs from vehicle sensors 206 throughout the vehicle; non-limiting examples of such sensors include speed sensors, para. 36, 42 of US’085) and the orientation using a velocity of the vehicle (where the vehicle 12 is in motion, the processor 14 may consider the pressure exerted on the sensor window 18 based on the speed of the vehicle 12 and change the fluid path, para. 41-77, see fig. 7a-c of US’489).

Regarding claim 18, the modified apparatus of US’085 teaches the vehicle of claim 15. US’085 further teaches wherein the processor is further configured to determine the contaminant type and the contamination level from one of: (ii) a plurality of temporally spaced images when the vehicle is in motion (by comparing successive video frames, it can be determined that the dirt particle 602N is present N frames later, as shown by video frame 600N continuing to detect the presence of the dirt particle 602N, para. 41, fig. 6a-e).

Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over US’085 in view of US’489 as applied to claim 15 above, and further in view of Tariq et al. US 2021/0201464 (US’464).

Regarding claim 17, the modified apparatus of US’085 teaches the system of claim 15. The modified system of US’085 does not teach wherein the processor is further configured to determine the contamination level based on an average size of the contaminant and a dispersion of the contaminant over the surface.

US’464 teaches a sensor degradation detection and remediation system that includes one or more sensors configured to collect image data from an environment (abstract). Computing device(s) 140 may implement one or more processes to generate synthetic degradation data based on a predetermined set of characteristics associated with different types or sources of degradations. Because the size, density, and distribution of raindrops that may accumulate on a vehicle sensor 104 during a rain shower are not random, computing device(s) 140 may implement processes using preprogrammed raindrop sizes, shapes, and distribution patterns configured to model different types of rainstorms. For instance, a first pattern for misting rain may be stored, a second for heavy rain, a third for heavy rain in high traffic, a fourth for heavy rain at a slow speed, a fifth for heavy rain at a high speed, and so on. In some implementations, the computing device(s) 140 may execute a synthetic data generation process for a particular type/pattern of raindrops, using a set of preconfigured raindrop parameters for droplet birth rate, average size, size variation, death size, and/or droplet longevity. Using the set of preprogrammed parameters, along with randomization functions and distributions applied to the parameters, the synthetic degradation data generation process may generate a synthetic image of a raindrop pattern corresponding to a specific type and/or severity of rainstorm. Referring again to 518 and 618, when the operations of the autonomous vehicle 102 are controlled in response to the detection of the degradation in the image data 106, one or a combination of remediation operations may be performed. In some cases, an automated cleaning process may be initiated to clean a surface of the sensor 104 (e.g., a camera lens) from which the degraded image data 106 was captured (para. 41 and 68-79). Therefore, US’464 teaches it is well known in the art to perform image degradation analysis of vehicle vision sensors that includes the average size and distribution of contaminants for the purposes of determining when to perform cleaning.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the modified apparatus of US’085 to include wherein the processor is further configured to determine the contamination level based on an average size of the contaminant and a dispersion of the contaminant over the surface because US’464 teaches it is well known in the art to perform image degradation analysis of vehicle vision sensors that includes the average size and distribution of contaminants for the purposes of determining when to perform cleaning, and combining prior art elements according to known methods to yield predictable results is obvious, see MPEP 2141 III (A).

Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over US’085 in view of US’489 as applied to claim 15 above, and further in view of Herman et al. US 2020/0094784 (US’784).

Regarding claim 19, the modified apparatus of US’085 teaches the system of claim 15. The modified system of US’085 does not teach wherein the processor is further configured to determine the contaminated region using semantic segmentation of the image.

US’784 teaches a system that includes a processor and a memory. The memory stores instructions executable by the processor to detect an occlusion on a surface in a vehicle sensor optical path based on a segmentation of sensor image data, and to select a cleaning plan for the surface based on the detected occlusion, map data, and vehicle route data (abstract). An occluded area (or occlusion) on a surface in a vehicle sensor optical path may be detected based on a semantic segmentation of sensor image data. A cleaning plan for the surface may be selected based on the detected occlusion, map data, and vehicle route data (para. 26-35 and 52-57, see fig. 5-6). Therefore, US’784 teaches it is well known in the art to detect contaminated regions in the viewing area of a sensor using semantic segmentation of the image.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the modified apparatus of US’085 to include wherein the processor is further configured to determine the contaminated region using semantic segmentation of the image because US’784 teaches it is well known in the art to detect contaminated regions in the viewing area of a sensor using semantic segmentation of the image, and combining prior art elements according to known methods to yield predictable results is obvious, see MPEP 2141 III (A).

Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over US’085 in view of US’489 as applied to claim 15 above, and further in view of Du et al. US 2020/0198587 (US’586).

Regarding claim 20, the modified apparatus of US’085 teaches the system of claim 15. The modified system of US’085 does not teach wherein the processor is further configured to operate one of a predictive model and a machine learning model to determine the contaminant type and the contamination level based on the image.

US’586 teaches a method of monitoring a windshield of a vehicle including: capturing an image of a windshield of a vehicle; determining that debris is located on the windshield in response to the image of the windshield; determining a location of the debris on the windshield; determining to remove the debris from the windshield; and actuating a drive motor of a wiper system to clean the location, the drive motor being operably connected to a wiper arm having a wiper blade (abstract). The camera 39 may capture images of the debris 15 on the windshield 16 and transmit the images to the windshield monitoring system 12. The windshield monitoring system 12 uses image processing (e.g., machine learning) to analyze the images to determine the location of the debris 15 on the windshield and the degree of accumulation of the debris 15 on the windshield 16.
The image processing may include training a software model, such as, for example, a CNN/Fast R-CNN model or a similar model, to detect different types of debris 15 and accumulation of debris 15. Image processing may include comparing the images captured by the camera 39 to stock images of a clean windshield 16. The windshield monitoring system 12 may also determine the type of debris 15 through image processing, such as, for example, a CNN/Fast R-CNN model that uses a plurality of stock images of different types of debris 15 located on the windshield 16 to train the model (para. 44-47). Therefore, US’586 teaches that it is well known in the art to use machine learning and image comparisons to analyze contaminant type and level for a vehicle monitoring system.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the modified apparatus of US’085 to include wherein the processor is further configured to operate a machine learning model to determine the contaminant type and the contamination level based on the image because US’586 teaches it is well known in the art to use machine learning and image comparisons to analyze contaminant type and level for a vehicle monitoring system, and combining prior art elements according to known methods to yield predictable results is obvious, see MPEP 2141 III (A).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ERIN FLANAGAN BERGNER, whose telephone number is (571) 270-1133. The examiner can normally be reached M-F 8:00-5:00. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Joshua Allen, can be reached at 571-270-3176. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ERIN F BERGNER/
Primary Examiner, Art Unit 1713
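For orientation, the decision flow the examiner maps across US’085 and US’489 (measure contamination from an image, classify the contaminated region and contaminant type, then pick a device, duration, and direction) can be sketched in code. This is an illustrative sketch only: every name, type, and threshold below is hypothetical and appears in neither the claims nor the cited references.

```python
# Illustrative sketch of the claimed selection logic as characterized in the
# rejections above. All identifiers and thresholds are hypothetical.
from dataclasses import dataclass


@dataclass
class CleaningApproach:
    device: str        # which of the plurality of cleaning devices
    duration_s: float  # selected cleaning duration, seconds
    direction: str     # selected cleaning direction (the US'489 feature)


def select_cleaning_approach(measure: float, region: str, contaminant: str,
                             vehicle_speed: float) -> CleaningApproach:
    """Pick a device, duration, and direction from a contamination measure
    in [0, 1], a contaminated region label, and a contaminant type."""
    # Device choice keyed to contaminant type (cf. the US'085 cleaning
    # modalities: pressurized air, pressurized fluid, wiping, etc.).
    device = "pressurized_fluid" if contaminant in ("mud", "bug") else "pressurized_air"
    # Duration scales with the contamination measure (cf. the intensity and
    # duration recommendation 304 in US'085).
    duration = 1.0 + 4.0 * measure
    # Direction aimed at the contaminated zone, adjusted at speed (cf. US'489
    # changing the fluid path based on pressure on the window when moving).
    direction = f"toward_{region}" + ("_lead" if vehicle_speed > 20.0 else "")
    return CleaningApproach(device, duration, direction)


approach = select_cleaning_approach(0.6, "upper_left", "mud", vehicle_speed=30.0)
print(approach.device, approach.duration_s, approach.direction)
```

A sketch like this can be useful when mapping claim limitations to the references: each line of the function corresponds to one of the disputed limitations (device and duration to US’085, direction and region to US’489).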

Prosecution Timeline

Nov 01, 2023
Application Filed
Jan 13, 2026
Non-Final Rejection — §103, §112
Mar 27, 2026
Interview Requested
Apr 07, 2026
Applicant Interview (Telephonic)
Apr 07, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594585: MONITORING SOLVENT IN A FIBER CLEANING DEVICE (granted Apr 07, 2026; 2y 5m to grant)
Patent 12594587: CARRIER SYSTEM AND METHOD FOR LASER CLEANING ADHESIVE FASTENERS HAVING AXIAL COMPONENTS (granted Apr 07, 2026; 2y 5m to grant)
Patent 12589403: SUBSTRATE PROCESSING APPARATUS AND SUBSTRATE PROCESSING METHOD (granted Mar 31, 2026; 2y 5m to grant)
Patent 12584216: MINIMIZING TIN OXIDE CHAMBER CLEAN TIME (granted Mar 24, 2026; 2y 5m to grant)
Patent 12576414: ELECTRODE ARRANGEMENT FOR A ROTARY ATOMIZER AND ASSOCIATED OPERATING METHOD (granted Mar 17, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 77%
With Interview: 99% (+31.3%)
Median Time to Grant: 2y 8m
PTA Risk: Low
Based on 640 resolved cases by this examiner. Grant probability derived from career allow rate.
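The headline numbers above can be reproduced from the raw counts. A minimal sketch, assuming the grant probability is simply the career allow rate, the interview lift is additive in percentage points, and the displayed probability is capped at 99% (the cap and additivity are assumptions, not confirmed behavior of the tool):

```python
# Reproduce the dashboard's headline metrics from the raw counts shown above.
# Assumptions: grant probability == career allow rate; interview lift is
# additive in percentage points; displayed probability is capped at 99%.

GRANTED = 491          # cases granted by this examiner
RESOLVED = 640         # resolved cases (granted + abandoned)
INTERVIEW_LIFT = 31.3  # percentage-point lift observed with an interview
DISPLAY_CAP = 99.0     # assumed display cap


def grant_probability(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage."""
    return 100.0 * granted / resolved


def with_interview(base_pct: float, lift_pct: float, cap: float = DISPLAY_CAP) -> float:
    """Apply the interview lift, capped for display."""
    return min(base_pct + lift_pct, cap)


base = grant_probability(GRANTED, RESOLVED)
print(f"Grant probability: {base:.0f}%")                                  # 77%
print(f"With interview:    {with_interview(base, INTERVIEW_LIFT):.0f}%")  # 99%
```

491/640 is 76.7%, which rounds to the displayed 77%; adding the 31.3-point lift exceeds the assumed cap, matching the displayed 99%.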
