Prosecution Insights
Last updated: April 19, 2026
Application No. 18/006,157

METHOD FOR DETECTING DIRT ACCUMULATED ON AN OPTICAL SENSOR ARRANGEMENT

Non-Final OA — §102, §112
Filed
Jan 20, 2023
Examiner
WIGGER, BENJAMIN DAVID
Art Unit
3645
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Mercedes-Benz Group AG
OA Round
1 (Non-Final)
Grant Probability
Favorable
1-2
OA Rounds
2y 12m
To Grant

Examiner Intelligence

Grants only 0% of cases
0%
Career Allow Rate
0 granted / 0 resolved
-52.0% vs TC avg
+0.0%
Interview Lift
Minimal +0% lift across resolved cases with interview
Typical timeline
2y 12m
Avg Prosecution
20 currently pending
Career history
20
Total Applications
across all art units

Statute-Specific Performance

§103
48.6%
+8.6% vs TC avg
§102
24.3%
-15.7% vs TC avg
§112
25.7%
-14.3% vs TC avg
Black line = Tech Center average estimate • Based on career data from 0 resolved cases
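The per-statute deltas above are each an offset from the same Tech Center baseline. Assuming each delta is the plain difference between the examiner's rate and the Tech Center average (an assumption about the dashboard's arithmetic, not a documented methodology), the implied baseline can be recovered from any row as a quick consistency check:

```python
# Sanity check: recover the implied Tech Center average from each statute row,
# assuming delta = examiner_rate - tc_average (assumed, not documented).
rows = {
    "103": (48.6, +8.6),
    "102": (24.3, -15.7),
    "112": (25.7, -14.3),
}
implied = {statute: round(rate - delta, 1) for statute, (rate, delta) in rows.items()}
print(implied)  # every statute implies the same 40.0% TC baseline
```

All three rows resolve to the same 40.0% baseline, so the displayed deltas are internally consistent.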

Office Action

§102 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application is being examined under the first inventor to file provisions of the AIA. Claims 11-20 are presented for examination.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

Claims 14 and 20 are rejected under 35 U.S.C. 112(b) as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Regarding Claim 14, the term "typical" is a relative term which renders the claim indefinite. The term "typical" is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. See MPEP 2173.05(b).

Regarding Claim 20, the "Use of" claim terminology renders the claim ambiguous as to what type of claim is being pursued. Amending claim 20 to read "the method of claim 11, wherein the optical sensor array is incorporated into a vehicle and/or robot to perform a fully automated or autonomous operation" would overcome this rejection.

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

Claims 15-16 are rejected under 35 U.S.C. 112(a) as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, at the time the application was filed, had possession of the claimed invention.

Regarding Claims 15-16, both include the term "linear structures" (presumably linear crosstalk structures). However, there is no teaching or explanation in the claims or specification that makes clear what a linear crosstalk structure is, or how a potential infringer would understand the scope of the claim as currently drafted. Claims 15 and 16 also apply the terms "blurred" and "blurring increases" to the already undescribed term "linear crosstalk structures" as a way of ascertaining the presence of crosstalk. Searching the specification, there does not appear to be any language describing what a crosstalk structure is or how a linear crosstalk structure would be distinguished from an ordinary one. The only guidance outside the claim language appears to be that linear crosstalk structures would result from using a linear LIDAR scanner. Further, the drawings fail to identify crosstalk structures at all, let alone linear ones. The drawings also fail to show examples of blurring of a linear crosstalk structure. For at least the aforementioned reasons, the subject matter of Claims 15 and 16 is not described in sufficient detail for a person having ordinary skill in the art to reasonably conclude that the inventor had possession of the claimed invention.
Claim Rejections - 35 USC § 102

The following is a quotation of 35 U.S.C. 102(a)(2):

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 11-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by US 2020/0249354 (hereinafter Yeruhami).

Regarding Claim 11, Yeruhami teaches a method for detecting dirt in the signal path of an optical sensor array, comprising: objects are detected by using multiple photodetector elements in the sensor array to detect light signals reflected on the objects ([0266] and FIGS. 10D-10E show multiple pixels 1034 capturing an obstruction pattern); each object is classified according to its type, and when it is classified the object is assigned to an object class ([0268] describes classifying a detected obstruction based on its obstruction pattern to identify a type of obstruction) with a predetermined reflectivity ([0280] and [0282] describe obstruction object models as including transparency levels and/or opacity parameters); a distance to the object is determined ([0277] describes the use of timing, i.e., measured distance, for blockage detection); crosstalk in the detected light signals onto multiple photodetector elements is identified ([0266] and FIGS. 10D-10E show multiple pixels 1034 capturing an obstruction pattern); and a degree of dirtiness is determined ([0267]: when an obstruction is detected, analyzed signal information is used to estimate the ability of system 1000 to operate) based on the predetermined reflectivity ascertained during classification ([0280] and [0282] describe obstruction object models as including transparency levels and/or opacity parameters), the distance ([0277] discussion of timing, i.e., distance, for obstruction identification), and a magnitude of the crosstalk ([0277] discusses the use of signal intensity for obstruction identification).

Regarding Claim 12, Yeruhami teaches the method as in claim 11, wherein the degree of dirtiness is determined using at least one look-up table ([0289] describes the use of database 1104 storing reference obstruction patterns).

Regarding Claim 13, Yeruhami teaches the method as in claim 12, wherein the at least one look-up table is generated based on at least one reference measurement taken by the sensor array ([0265] describes storing an obstruction pattern associated with a detected obstruction).

Regarding Claim 14, Yeruhami teaches the method as in claim 11, wherein crosstalk is identified by testing an image detected by the sensor array for typical crosstalk structures (while the scope of the claim term "typical" is unclear as explained above, the comparison of the detected obstruction pattern to reference obstruction patterns described in [0269] amounts to a test, and reference obstruction patterns probably correspond to what the Applicant meant by typical crosstalk structures).

Regarding Claim 15, Yeruhami teaches the method as in claim 14, wherein linear structures are used as the structures and crosstalk is then determined to be present if the linear structures are blurred (given the lack of description of what a linear crosstalk structure should look like, the grid shown in FIG. 10F marked "Mud" is deemed to show linear structures; [0268] describes how obstructions are characterized based on a pixel-by-pixel analysis, and blurring of a linear structure would cause an enlarged size, increasing the likelihood of it being categorized as an obstruction).
Regarding Claim 16, Yeruhami teaches the method as in claim 15, wherein as the degree of blurring increases, a higher degree of crosstalk is identified ([0268] describes how obstructions are characterized based on a pixel-by-pixel analysis; increased blurring of a linear structure would consequently result in more pixel blockage and greater amounts of crosstalk occurring).

Regarding Claim 17, Yeruhami teaches the method as in claim 11, wherein crosstalk is identified as follows: dimensions of the detected object are compared to expected dimensions for such an object, and an increasing degree of crosstalk is identified with increasingly positive deviation of the dimensions for the detected object from the expected dimensions ([0282] describes how the reference obstruction pattern can include a size characteristic; when the obstruction is larger than the reference obstruction, a larger amount of crosstalk would result).

Regarding Claim 18, Yeruhami teaches the method as in claim 17, wherein the expected dimensions are determined from dimensions determined for an object class corresponding to the object based on at least one reference measurement taken by the sensor array ([0265] describes storing an obstruction pattern associated with a detected obstruction; [0282] describes how the reference obstruction pattern can include a size characteristic).

Regarding Claim 19, Yeruhami teaches the method as in claim 18, wherein the expected dimensions are derived from an object class corresponding to the object, wherein objects belonging to the object class have standardized dimensions ([0282] describes how the reference obstruction pattern can include a size characteristic).

Regarding Claim 20, Yeruhami teaches use of a method as in claim 11 in a vehicle and/or robot to perform a fully automated or autonomous operation ([0113] describes implementation of the invention in autonomous road vehicles).
Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. In particular, US 2020/0249354 at col 20, lines 2-15 describes the identification of crosstalk when multiple pixels are illuminated when illuminating only a single object. This reference also, at col 20, lines 16-17, describes storing predetermined reflectivities for objects to be detected in a data storage (analogous to a look-up table) in the form of a map.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BENJAMIN WIGGER, whose telephone number is (571) 272-4208. The examiner can normally be reached 9:30am to 7:00pm EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Yuqing Xiao, can be reached at (571) 270-3603. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/BENJAMIN DAVID WIGGER/Examiner, Art Unit 3645 /LUKE D RATCLIFFE/Primary Examiner, Art Unit 3645
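Read together, the mapped claims describe determining a degree of dirtiness from the classified object's predetermined reflectivity, its distance, and the magnitude of crosstalk, with claim 12 reading the result from a look-up table. The minimal Python sketch below illustrates that combination; the class names, the inverse-square normalization, and the look-up-table bins are all illustrative assumptions, since neither the claims nor the quoted rejection recites a specific formula:

```python
# Hedged sketch of the claim 11/12 logic. Every name, class, and threshold
# here is an illustrative assumption, not taken from the application.

# Claim 11: each object class carries a predetermined reflectivity (assumed values).
CLASS_REFLECTIVITY = {"traffic_sign": 0.8, "vehicle": 0.4, "pedestrian": 0.2}

# Claim 12: degree of dirtiness read from a look-up table keyed on a
# normalized crosstalk score (illustrative bins: upper bound -> label).
DIRT_LUT = [(0.2, "clean"), (0.5, "light"), (0.8, "moderate"), (1.1, "heavy")]

def crosstalk_score(measured_magnitude: float, reflectivity: float, distance_m: float) -> float:
    # Normalize the measured crosstalk magnitude by the return strength
    # expected for this class at this distance (inverse-square falloff assumed).
    expected_return = reflectivity / max(distance_m, 1.0) ** 2
    return min(measured_magnitude / expected_return, 1.0)

def degree_of_dirtiness(obj_class: str, distance_m: float, measured_crosstalk: float) -> str:
    score = crosstalk_score(measured_crosstalk, CLASS_REFLECTIVITY[obj_class], distance_m)
    for upper_bound, label in DIRT_LUT:
        if score < upper_bound:
            return label
    return "heavy"

print(degree_of_dirtiness("traffic_sign", 10.0, 0.004))  # -> "moderate"
```

The look-up table stands in for the claim 12 limitation; per claim 13 it could instead be populated from reference measurements taken by the sensor array itself.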

Prosecution Timeline

Jan 20, 2023
Application Filed
Dec 11, 2025
Non-Final Rejection — §102, §112
Mar 20, 2026
Response Filed
Mar 20, 2026
Response after Non-Final Action


Prosecution Projections

1-2
Expected OA Rounds
Favorable
Grant Probability
2y 12m
Median Time to Grant
Low
PTA Risk
Based on 0 resolved cases by this examiner. Grant probability derived from career allow rate.
