Prosecution Insights
Last updated: April 19, 2026
Application No. 18/703,725

RETINAL IMAGING

Non-Final OA: §102, §103, §112
Filed: Apr 23, 2024
Examiner: HO, WAI-GA DAVID
Art Unit: 2872
Tech Center: 2800 — Semiconductors & Electrical Systems
Assignee: AMS-OSRAM AG
OA Round: 1 (Non-Final)
Grant Probability: 25% (At Risk)
Expected OA Rounds: 1-2
Time to Grant: 3y 9m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 25% (1 granted / 4 resolved; -43.0% vs TC avg)
Interview Lift: +100.0% (resolved cases with interview)
Avg Prosecution: 3y 9m (52 currently pending)
Total Applications: 56 (across all art units)

Statute-Specific Performance

§101: 0.6% (-39.4% vs TC avg)
§103: 51.2% (+11.2% vs TC avg)
§102: 18.4% (-21.6% vs TC avg)
§112: 29.5% (-10.5% vs TC avg)
Tech Center averages are estimates. Based on career data from 4 resolved cases.
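The four deltas above are mutually consistent: subtracting each delta from the examiner's statute-specific rate yields the same implied Tech Center average, 40.0%, in every case. A minimal sketch of that check, assuming (the page does not state this) that each delta is computed as examiner rate minus TC average:

```python
# Statute-specific overcoming rates and reported deltas from the panel above.
# Assumption (not stated on the page): delta = examiner rate - TC average.
stats = {
    "101": (0.6, -39.4),
    "103": (51.2, +11.2),
    "102": (18.4, -21.6),
    "112": (29.5, -10.5),
}

# Recover the TC average each delta implies; round to absorb float noise.
implied = {s: round(rate - delta, 1) for s, (rate, delta) in stats.items()}
print(implied)  # every statute implies the same 40.0% TC average
```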

Office Action

Rejections under §102, §103, and §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Information Disclosure Statement

The information disclosure statements submitted on 4/23/2024 and 9/9/2025 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.

Response to Amendment

This office action is in response to the communication filed 4/23/2024. Amendments to the abstract, specification, and/or claim(s) 1-17, filed 4/23/2024, are acknowledged and accepted.

Specification

The disclosure is objected to because of the following informalities:
- On pg. 1, line 18, “while… exist” should be followed by a comma
- On pg. 1, line 23, “According to… disclosure” should be followed by a comma
- On pg. 3, line 7, “integrating in” should read “integrating into”
- On pg. 3, line 4, “typically the an” should have either “the” or “an” removed
- On pg. 4, line 14, “1A” should read “FIG. 1A”
- On pg. 4, lines 2-3, “allows…be placed” should read “allows… to be placed”
- On pg. 6, line 16, “each comprises” should read “each comprise”

Appropriate correction is required.

Claim Objections

Claims 1-17 are objected to because of the following informalities:
- In claim 1, lines 4-6, “determining”, “repeating”, and “combining” are missing indentation
- In claim 10, line 2, “integrating in” should read “integrating into”

Claims not specifically addressed in the objections above inherit the objection of the claim from which they depend. Appropriate correction is required.

Claim Rejections - 35 USC § 112(b)

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 5 and 10-17 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Regarding claim 5, line 2 recites "the detector", which lacks a proper antecedent basis. For examination purposes, the claim shall be read as depending on claim 3 (reciting “a detector” on line 3) rather than on claim 1.

Regarding claim 10, line 1 recites "the device", which lacks a proper antecedent basis. For examination purposes, the limitation shall be read as “the optical device”, corresponding to “An optical device” recited earlier on the same line.

Further regarding claim 10, lines 4-5 recite "a retina". However, “a retina” was already recited on line 1 of the same claim, overloading the phrase with multiple introductions and causing ambiguity as to whether each “retina” refers to a common object or distinct ones. For examination purposes, the limitation on lines 4-5 shall be read as “the retina”.

Regarding claim 15, line 2 recites "the detector", which lacks a proper antecedent basis. For examination purposes, the claim shall be read as depending on claim 11 (reciting “a detector” on line 5) rather than on claim 1.

Regarding claim 16, line 1 recites “A head mounted device comprising one or two optical devices”. However, “an optical device” and “a head mounted device” were already introduced on lines 1 and 2 of claim 10.
Thus, the antecedent and structural relationships between currently claimed elements and those recited prior are currently obfuscated by their reintroduction/reinstantiation, which ultimately renders the claim indefinite.

Regarding claim 17, lines 1-2 recite “the or each optical device”, which lacks a proper antecedent basis. For examination purposes, the limitation shall be read as corresponding to “the one or two optical devices” recited on line 1 of claim 16.

Further regarding claim 17, lines 2-3 recite "the optical element", which lacks a proper antecedent basis. For examination purposes, the limitation shall be read as “an optical element”.

Claims not specifically addressed in the rejections above inherit the indefiniteness of the claim from which they depend.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-6, 10-12, and 14-16 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Khan and Susanibar (US 20220151489 A1, hereinafter “Khan”).

Regarding claim 1, Khan discloses a method of imaging a retina of an eye, the method comprising (see ¶s 128-133, describing integrated eye-tracking and retinal imaging):
- determining a position of the eye (¶ 133: “Eye position can be recorded”);
- measuring light reflected or emitted from a point on the retina of the eye (¶ 130: “The fundus camera can be… used to provide narrow and wide field retinal imaging capabilities”);
- determining a location of the point on the retina based on the position of the eye (¶ 133: “Eye coordinate information is used to determine the portion of the retina captured”);
- repeating the steps of determining and measuring over time to provide multiple measurements of light reflected from points in different locations on the retina (¶ 130: “Data can be collected from a live video of the eye while the Fundus camera is taking a set of images”); and
- combining the measurements to form an image of the retina (¶ 130: “Eye-tracking and image processing can be used to combine images captured by the camera to generate an image that covers most or all of the retina.”).

(See also FIGs. 10-13 and accompanying ¶s 134-153, detailing multiple related embodiments that strongly overlap with one another and implement the relevant methods, as well as FIGs. 18(A-C), ¶s 167-175, further refining details applicable to these embodiments.)

Regarding claim 2, Khan discloses the method according to claim 1. Khan further discloses (see FIGs. 11(A-B), ¶s 141-148) wherein the step of determining the position comprises using an eye-tracking unit (eye-tracking cameras 1110).

Regarding claim 3, Khan discloses the method according to claim 1. Khan further discloses (see FIGs. 18(A-C), ¶s 167-175) wherein measuring comprises: illuminating the retina of the eye (1830) (¶ 170: “light will enter the eye of a user 1830 and illuminate at least part of the retina”); focusing light (1821) reflected from the point on the retina onto a detector (imaging camera 1828); and receiving the focused light (1821) with the detector (imaging camera 1828) (¶ 173: “Light 1821 reflected by the retina or other structure in the eye exits… lens 1826 focuses the light onto imaging camera 1826 [sic, read: 1828]”).

Regarding claim 4, Khan discloses the method according to claim 3. Khan further discloses (see FIGs. 18(A-C), ¶s 167-175) wherein the step of illuminating comprises emitting light (1805) with an emitter (light source 1804) and directing the emitted light (1805, 1817) onto the eye (1830) with an optical element (e.g. beam splitter 1810, mirrors 1814 and 1816).

Regarding claim 5, Khan discloses the method according to claim 3 (rather than to claim 1, see Claim Rejections - 35 USC § 112 above). Khan further discloses (see FIGs. 18(A-C), ¶s 167-175) wherein the detector (imaging camera 1828) comprises a photodiode (“sensor element”). (Consider the following: ¶ 173: “Imaging camera 1828 can include internal lenses that operate to focus incoming light onto a sensor element”; ¶ 174: “imaging camera 1820 [sic, read: 1828] can comprise a conventional high-resolution digital camera”; and note that virtually all “conventional” cameras have image sensors (CMOS, CCD) with photodiodes.)

Regarding claim 6, Khan discloses the method according to claim 1. Khan further discloses (see FIGs. 18(A-C), ¶s 167-175) wherein the step of measuring comprises determining one or more of an intensity, a phase, an auto-fluorescence, and a polarisation.
(Consider the following: ¶ 173: “Imaging camera 1828 can include internal lenses that operate to focus incoming light onto a sensor element”; ¶ 174: “imaging camera 1820 [sic, read: 1828] can comprise a conventional high-resolution digital camera”; and note that virtually all “conventional” cameras have image sensors (CMOS, CCD) with photodiodes. Photodiodes generate photocurrent directly related to incident light intensity, and they will therefore involve intensity determination as part of their core operating principle.)

Regarding claim 10, Khan discloses (see FIG. 2(A-B), ¶s 61-67; FIG. 6, ¶s 68-74) an optical device (FIG. 2(A-B)’s optics module 110, including FIG. 6’s module system 600) for imaging a retina of an eye (¶ 72: “Different modules 110 can be specially configured for… retinal imaging”), the device being suitable for integrating in a head mounted device (modular headset system 100), the optical device (optics module 110) comprising (see also ¶s 130-133, describing integrated eye-tracking and retinal imaging, as well as FIGs. 10-13 and accompanying ¶s 134-153, detailing multiple related embodiments that strongly overlap with one another and with those of preceding discussions, including optics module 110 and module system 600 cited above):
- an eye-tracking unit (e.g. FIG. 6’s/11’s eye tracking cameras 612/1110) configured to determine a position of an eye (¶ 133: “Eye position can be recorded”);
- a measuring unit (e.g. FIG. 10’s fundus camera modules 1008, FIG. 11-12’s fundus cameras 1108) configured to measure light reflected or emitted from a point on a retina of the eye (¶ 130: “The fundus camera can be… used to provide narrow and wide field retinal imaging capabilities”);
- a processing unit (e.g. FIG. 12’s (image processing and) analysis system 1206) configured to determine a location of the point on the retina based on the position of the eye (¶ 133: “Eye coordinate information is used to determine the portion of the retina captured”); and
- an imaging unit (e.g. FIG. 12’s (image processing and) analysis system 1206) configured to combine multiple measurements of reflected light to form an image of the retina (¶ 130: “Eye-tracking and image processing can be used to combine images captured by the camera to generate an image that covers most or all of the retina.”).

Regarding claim 11, Khan discloses the optical device according to claim 10. Khan further discloses (see FIGs. 18(A-C), ¶s 167-175, further refining details applicable to the above-cited embodiments) wherein the measuring unit (fundus camera 1800) comprises an emitter (light source 1804) for illuminating the retina of the eye (1830) (¶ 170: “light will enter the eye of a user 1830 and illuminate at least part of the retina”); and a detector (imaging camera 1828) for receiving the light (1821) reflected from the point on the retina (¶ 173: “Light 1821 reflected by the retina or other structure in the eye exits… lens 1826 focuses the light onto imaging camera 1826 [sic, read: 1828]”).

Regarding claim 12, Khan discloses the optical device according to claim 11. Khan further discloses (see ¶ 169) wherein the emitter (light source 1804) comprises a light emitting diode, LED, or laser diode.

Regarding claim 14, Khan discloses the optical device according to claim 11. Khan further discloses (see FIGs. 18(A-C), ¶s 167-175) wherein the measuring unit (fundus camera 1800) further comprises an optical element (e.g. beam splitter 1810, mirrors 1814 and 1816) for directing light (1805, 1817) from the emitter onto the eye (1830).

Regarding claim 15, Khan discloses the optical device according to claim 11 (rather than to claim 10, see Claim Rejections - 35 USC § 112 above). Khan further discloses (see FIGs. 18(A-C), ¶s 167-175) wherein the detector (imaging camera 1828) comprises a photodiode (“sensor element”) for measuring an intensity of light (1821) incident on the photodiode (“sensor element”). (Consider the following: ¶ 173: “Imaging camera 1828 can include internal lenses that operate to focus incoming light onto a sensor element”; ¶ 174: “imaging camera 1820 [sic, read: 1828] can comprise a conventional high-resolution digital camera”; and note that virtually all “conventional” cameras have image sensors (CMOS, CCD) with photodiodes. Photodiodes generate photocurrent directly related to incident light intensity, and they will therefore measure intensity as part of their core operating principle.)

Regarding claim 16, Khan discloses a head mounted device (modular headset system 100) comprising one or two optical devices (optics module 110) as claimed in claim 10 (see rejection above).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 7-9, 13, and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Khan, as applied to claims 1, 11, and 16 above, and further in view of Meyer et al. (US 20230122222 A1, hereinafter “Meyer”).

Regarding claim 7, Khan discloses the method according to claim 1.
Khan does not disclose wherein measuring comprises self-mixing interferometry, SMI, so that the light is emitted by an emitter and the reflected light is received by the same emitter and the output from or the input to the emitter is measured to determine a phase and/or amplitude of the reflected light.

Khan and Meyer are commonly related to ocular measurement apparatuses/methods for head-mounted device applications. Meyer discloses (see FIGs. 1-3, ¶s 51-63) wherein measuring comprises self-mixing interferometry, SMI, so that the light is emitted by an emitter (laser/photodiode unit 130) and the reflected light is received by the same emitter (laser/photodiode unit 130) and the output from or the input to the emitter is measured to determine a phase and/or amplitude of the reflected light. (See also ¶s 5-26’s basic description of self-mixing interference measurements; reflected light interferes with the primary beam, resulting in intensity fluctuations (encompassing phase and/or amplitude variations) from which information about the reflecting object, e.g. the retina, is obtained.) It would therefore have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the teachings of Khan and Meyer, in order to provide enhanced biometrics with spatial resolution, facilitating feature extraction and identification/authentication accuracy.

Regarding claim 8, modified Khan discloses the method according to claim 7. Khan further discloses (see FIGs. 18(A-C), ¶s 167-175) wherein a part of the light (1805) emitted by the emitter is directed to a detector (imaging camera 1828).

Regarding claim 9, Khan discloses the method according to claim 1. Khan further discloses wherein determining the position is repeated at a repetition rate greater than 60 Hz. (See ¶ 142: “The frame rate of cameras… [i.e. eye-tracking cameras for determining position] … at a range between 50 Hz to 100 Hz.”)

Khan does not disclose wherein measuring the reflected light is repeated at a repetition rate greater than 60 Hz. Khan and Meyer are commonly related to ocular measurement apparatuses/methods for head-mounted device applications. Meyer discloses wherein measuring the reflected light is repeated at a repetition rate greater than 60 Hz. (See ¶s 74-79; listed are eye “variables” determined by analyzing the reflected light, among them “saccades… with an occurrence frequency of 10 Hz to 100 Hz” (¶ 78). The upper value suggests a sampling rate of at least 200 Hz (= the Nyquist rate).) It would therefore have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the teachings of Khan and Meyer, in order to provide enhanced biometrics with spatial resolution, facilitating feature extraction and identification/authentication accuracy.

Regarding claim 13, Khan discloses the optical device according to claim 11. Khan does not disclose wherein the emitter comprises a vertical cavity surface emitting laser, VCSEL, configured to emit light having a wavelength in the range of 800 nm to 1400 nm.

Khan and Meyer are commonly related to ocular measurement apparatuses/methods for head-mounted device applications. Meyer discloses (see FIGs. 1-3, ¶s 51-63) wherein the emitter (laser/photodiode unit 130) comprises a vertical cavity surface emitting laser (¶ 53: “laser/photodiode unit 130 is… a ViP [= VCSEL-integrated Photodiode]”), VCSEL, configured to emit light having a wavelength in the range of 800 nm to 1400 nm.
(¶ 54: “wavelengths… in particular 780 nm to 1,040 nm, may be used”.) It would therefore have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the teachings of Khan and Meyer, in order to provide enhanced biometrics with spatial resolution, facilitating feature extraction and identification/authentication accuracy.

Regarding claim 17, Khan discloses the head mounted device according to claim 16. Khan does not disclose wherein the or each optical device is integrated in a stem of the head mounted device and the optical element is configured to at least reflect light having a wavelength substantially equal to the light of the emitter, whilst being substantially transparent to light in the visible spectrum.

Khan and Meyer are commonly related to ocular measurement apparatuses/methods for head-mounted device applications. Meyer discloses (see FIG. 3, ¶ 63) wherein the or each optical device (laser/photodiode unit 130) is integrated in a stem (temple 120) of the head mounted device (device 100) and the optical element (holographic optical element (HOE) 150) is configured to at least reflect light having a wavelength substantially equal to the light of the emitter (laser/photodiode unit 130), whilst being substantially transparent to light in the visible spectrum (HOEs are typically transparent to visible light and implemented in see-through displays; here HOE 150 is embedded in spectacle lens 110). It would therefore have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the teachings of Khan and Meyer, in order to provide enhanced biometrics with spatial resolution, facilitating feature extraction and identification/authentication accuracy.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to WAI-GA D. HO, whose telephone number is (571) 270-1624.
The examiner can normally be reached Monday through Friday, 10 AM - 6 PM ET.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Stephone Allen, can be reached at (571) 272-2434. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/W.D.H./
Examiner, Art Unit 2872

/STEPHONE B ALLEN/
Supervisory Patent Examiner, Art Unit 2872

Prosecution Timeline

Apr 23, 2024
Application Filed
Feb 21, 2026
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12493138
AIRGAP STRUCTURES FOR IMPROVED EYEPIECE EFFICIENCY
Granted Dec 09, 2025 (2y 5m to grant)
Study what changed to get past this examiner. Based on the 1 most recent grant.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 25%
With Interview: 99% (+100.0%)
Median Time to Grant: 3y 9m
PTA Risk: Low
Based on 4 resolved cases by this examiner. Grant probability derived from career allow rate.
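The headline figures reduce to simple arithmetic on the examiner's resolved cases: the 25% grant probability matches an allow rate of 1 granted out of 4 resolved, and a +100% interview lift means the with-interview allowance rate is double the without-interview rate. A hedged sketch of that reconstruction; the helper names and the 0.50/0.25 interview split below are illustrative assumptions, not values taken from the page:

```python
# Hypothetical reconstruction of the dashboard's derived metrics.
# Only "1 granted / 4 resolved" comes from the page; the with/without
# interview rates used for the lift calculation are illustrative.
def allow_rate(granted: int, resolved: int) -> float:
    return granted / resolved

def interview_lift(rate_with: float, rate_without: float) -> float:
    # Relative improvement of the allowance rate when an interview occurred.
    return (rate_with - rate_without) / rate_without

career = allow_rate(1, 4)          # 0.25, i.e. the 25% grant probability
lift = interview_lift(0.50, 0.25)  # 1.0, i.e. a +100% interview lift
print(f"allow rate {career:.0%}, interview lift {lift:+.0%}")
```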
