Prosecution Insights
Last updated: April 19, 2026
Application No. 19/210,713

EYE TRACKING USING ASPHERIC CORNEA MODEL

Non-Final OA: §102, §103
Filed
May 16, 2025
Examiner
CASTIAUX, BRENT D
Art Unit
2623
Tech Center
2600 — Communications
Assignee
Magic Leap Inc.
OA Round
1 (Non-Final)
83%
Grant Probability
Favorable
1-2
OA Rounds
2y 1m
To Grant
99%
With Interview

Examiner Intelligence

Grants 83% — above average
83%
Career Allow Rate
434 granted / 523 resolved
+21.0% vs TC avg
Strong +16% interview lift
+15.9%
Interview Lift
resolved cases with interview
Fast prosecutor
2y 1m
Avg Prosecution
23 currently pending
Career history
546
Total Applications
across all art units

Statute-Specific Performance

§101: 1.3% (-38.7% vs TC avg)
§103: 55.9% (+15.9% vs TC avg)
§102: 30.2% (-9.8% vs TC avg)
§112: 10.8% (-29.2% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 523 resolved cases
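Taken at face value, the per-statute deltas pin down the Tech Center baseline that the black line represents. A minimal sketch (my arithmetic reading of the dashboard figures above, not an official USPTO statistic; the dict keys are just labels):

```python
# Examiner's per-statute rates and the listed "vs TC avg" deltas,
# as shown in the table above (percent values).
examiner = {"101": 1.3, "103": 55.9, "102": 30.2, "112": 10.8}
delta = {"101": -38.7, "103": 15.9, "102": -9.8, "112": -29.2}

# Implied Tech Center baseline: examiner rate minus the delta.
tc_avg = {k: round(examiner[k] - delta[k], 1) for k in examiner}
```

Every statute's implied baseline works out to the same 40.0%, which suggests the dashboard compares against a single TC-wide estimate rather than per-statute averages.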

Office Action

§102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 5, 6, 8-11, 15, and 17 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by U.S. Pub. No. 2020/0257359 by Klingstrom (“Klingstrom”).

As to claim 1, Klingstrom discloses a method (Klingstrom, eye tracking methods, Figures 3-8, ¶ [0031]) of determining one or more parameters including a center of rotation of an eye of a user (Klingstrom, This type of eye tracking is often referred to as pupil center corneal reflection (PCCR). The PCCR scheme 700 may for example be based on corneal reflections 112 (also called glints) of multiple illuminators 301, and/or images of the eye 100 captured by multiple cameras 302. ¶ [0087])(Klingstrom, PCCR eye tracking typically includes finding out where the eye ball is located in space, and finding out a rotation (or direction) or the eye ball. This is usually done by finding the pupil center 103 and the center of the glint 112.
¶ [0089]), the one or more parameters usable for rendering virtual image content in a display system configured to display the virtual image content in a vision field of the user (Klingstrom, The display 303 may for example be a liquid-crystal display (LCD) or a LED display. However, other types of displays may also be envisaged. The display may 303 may for example be flat or curved. The display 303 may for example be a TV screen, a computer screen, or may be part of a head-mounted device (HMD) such as a virtual reality (VR) or augmented reality (AR) device. The display 303 may for example be placed in front of one of the user's eyes. In other words, separate displays 303 may be employed for the left and right eyes. Separate eye tracking equipment (such as illuminators 301 and cameras 302) may for example be employed for the left and right eyes. ¶ [0037]), the method comprising: receiving a plurality of images of an eye of the user, the plurality of images captured using a plurality of eye tracking cameras (Klingstrom, The system 300 comprises one or more illuminators 301 for illuminating the eye 100 and one or more cameras 302 for capturing images of the eye 100 while the eye 100 looks at a display 303. Figure 3, ¶ [0033]), the plurality of images comprising a plurality of glints that are formed by plurality of light emitters emitting light that is reflected off of the eye (Klingstrom, FIG. 5 is a flow chart of a scheme for estimating 403 the cornea radius 105 in the method 400 from FIG. 4, according to an embodiment. 
In the present embodiment, the step 403 of estimating an updated value of the cornea radius 105 comprises: obtaining 501 at least two images of the eye 100 captured by respective cameras 302 when the eye 100 is illuminated by at least two illuminators 301; estimating 502, based on the images, positions of reflections 112 of the illuminators 301 at the cornea 101 of the eye 100; Figures 4 and 5, ¶¶ [0061-0063]); and obtaining an estimate of the center of rotation of the eye based on the plurality of glints (Klingstrom, estimating 503 an updated value of the cornea radius 105 based on the positions of the reflections 112 at the cornea 101 and positions of the illuminators 301 relative to the cameras 302 by which the images were captured. Figures 4 and 5, ¶ [0064]), wherein obtaining the estimate of the center of rotation of the eye comprises: determining a plurality of estimates of the center of corneal curvature of the eye based on the plurality of glints (Klingstrom, estimating 503 an updated value of the cornea radius 105 based on the positions of the reflections 112 at the cornea 101 and positions of the illuminators 301 relative to the cameras 302 by which the images were captured. Figures 4 and 5, ¶ [0064])(Klingstrom, The cornea 101 is often modeled as a spherical surface with a center of curvature 104 which is simply referred to as the cornea center 104. In such a spherical cornea model, the cornea 101 has a radius of curvature referred to as the radius 105 of the cornea 101 or simply the cornea radius 105. ¶ [0032]); generating a three-dimensional surface based at least partly on the plurality of estimates of the center of the corneal curvature (Klingstrom, The cornea 101 is often modeled as a spherical surface with a center of curvature 104 which is simply referred to as the cornea center 104. In such a spherical cornea model, the cornea 101 has a radius of curvature referred to as the radius 105 of the cornea 101 or simply the cornea radius 105. 
¶ [0032]) (the spherical surface of Klingstrom being the three-dimensional surface as claimed); determining the estimate of the center of rotation of the eye based at least partly on the three-dimensional surface (Klingstrom, estimating 503 an updated value of the cornea radius 105 based on the positions of the reflections 112 at the cornea 101 and positions of the illuminators 301 relative to the cameras 302 by which the images were captured. Figures 4 and 5, ¶ [0064]) (Klingstrom, The cornea 101 is often modeled as a spherical surface with a center of curvature 104 which is simply referred to as the cornea center 104. In such a spherical cornea model, the cornea 101 has a radius of curvature referred to as the radius 105 of the cornea 101 or simply the cornea radius 105. ¶ [0032]).

As to claim 5, Klingstrom discloses the method wherein generating the three-dimensional surface based at least partly on the plurality of estimates of the center of the corneal curvature comprises fitting a surface to the plurality of estimates of the center of the corneal curvature (Klingstrom, The cornea 101 is often modeled as a spherical surface with a center of curvature 104 which is simply referred to as the cornea center 104. In such a spherical cornea model, the cornea 101 has a radius of curvature referred to as the radius 105 of the cornea 101 or simply the cornea radius 105. ¶ [0032])(Klingstrom, FIG. 5 is a flow chart of a scheme for estimating 403 the cornea radius 105 in the method 400 from FIG. 4, according to an embodiment.
In the present embodiment, the step 403 of estimating an updated value of the cornea radius 105 comprises: obtaining 501 at least two images of the eye 100 captured by respective cameras 302 when the eye 100 is illuminated by at least two illuminators 301; estimating 502, based on the images, positions of reflections 112 of the illuminators 301 at the cornea 101 of the eye 100; estimating 503 an updated value of the cornea radius 105 based on the positions of the reflections 112 at the cornea 101 and positions of the illuminators 301 relative to the cameras 302 by which the images were captured. Figures 4 and 5, ¶¶ [0061-0064]).

As to claim 6, Klingstrom discloses the method wherein generating the three-dimensional surface based at least partly on the plurality of estimates of the center of the corneal curvature comprises fitting a sphere to the plurality of estimates of the center of the corneal curvature (Klingstrom, The cornea 101 is often modeled as a spherical surface with a center of curvature 104 which is simply referred to as the cornea center 104. In such a spherical cornea model, the cornea 101 has a radius of curvature referred to as the radius 105 of the cornea 101 or simply the cornea radius 105. ¶ [0032]).

As to claim 8, Klingstrom discloses the method wherein the plurality of images of the eye comprise images associated with different gaze directions of the eye (Klingstrom, The system 300 also comprises processing circuitry 304 configured to estimate where the eye 100 is located and/or where the eye 100 looking. The processing circuitry 304 may for example estimate a gaze direction (or gaze vector) of the eye 100 (corresponding to a direction of the visual axis 107), or a gaze point 111 of the eye 100 at the display 303 (as shown in FIG. 2). ¶ [0033]).
As to claim 9, Klingstrom discloses the method further comprising mapping a cornea of the eye using a gaze target (Klingstrom, The processing circuitry 304 may also be communicatively connected to the display 303, for example for controlling (or triggering) the display 303 to show test stimulus points 305 for calibration of the eye tracking system 300. ¶ [0034]) (Klingstrom, estimating 503 an updated value of the cornea radius 105 based on the positions of the reflections 112 at the cornea 101 and positions of the illuminators 301 relative to the cameras 302 by which the images were captured. Figures 4 and 5, ¶ [0064]).

As to claim 10, Klingstrom discloses a display system configured to display virtual image content in a vision field of a user (Klingstrom, The display 303 may for example be a liquid-crystal display (LCD) or a LED display. However, other types of displays may also be envisaged. The display may 303 may for example be flat or curved. The display 303 may for example be a TV screen, a computer screen, or may be part of a head-mounted device (HMD) such as a virtual reality (VR) or augmented reality (AR) device. The display 303 may for example be placed in front of one of the user's eyes. In other words, separate displays 303 may be employed for the left and right eyes. Separate eye tracking equipment (such as illuminators 301 and cameras 302) may for example be employed for the left and right eyes. ¶ [0037]), the display system comprising: a head-mountable display configured to project light into an eye of the user to display the virtual image content (Klingstrom, The display 303 may for example be a liquid-crystal display (LCD) or a LED display. However, other types of displays may also be envisaged. The display may 303 may for example be flat or curved. The display 303 may for example be a TV screen, a computer screen, or may be part of a head-mounted device (HMD) such as a virtual reality (VR) or augmented reality (AR) device.
The display 303 may for example be placed in front of one of the user's eyes. In other words, separate displays 303 may be employed for the left and right eyes. Separate eye tracking equipment (such as illuminators 301 and cameras 302) may for example be employed for the left and right eyes. ¶ [0037]); first and second eye tracking cameras configured to image the eye (Klingstrom, Separate eye tracking equipment (such as illuminators 301 and cameras 302) may for example be employed for the left and right eyes. ¶ [0037]); and processing electronics in communication with the display and the first and second eye tracking cameras (Klingstrom, The processing circuitry 304 is communicatively connected to the illuminators 301 and the cameras 302, for example via a wired or wireless connection. The processing circuitry 304 may also be communicatively connected to the display 303, for example for controlling (or triggering) the display 303 to show test stimulus points 305 for calibration of the eye tracking system 300. ¶ [0034]), the processing electronics configured to: receive multiple pairs of captured images of the eye from the first and second eye tracking cameras (Klingstrom, FIG. 5 is a flow chart of a scheme for estimating 403 the cornea radius 105 in the method 400 from FIG. 4, according to an embodiment. In the present embodiment, the step 403 of estimating an updated value of the cornea radius 105 comprises: obtaining 501 at least two images of the eye 100 captured by respective cameras 302 when the eye 100 is illuminated by at least two illuminators 301; Figures 4 and 5, ¶¶ [0061-0062]); for each pair of captured images received from the first and second eye tracking cameras, determine an estimate of a center of corneal curvature of the eye based at least in part on the respective pair of captured images (Klingstrom, FIG. 5 is a flow chart of a scheme for estimating 403 the cornea radius 105 in the method 400 from FIG. 4, according to an embodiment.
In the present embodiment, the step 403 of estimating an updated value of the cornea radius 105 comprises: obtaining 501 at least two images of the eye 100 captured by respective cameras 302 when the eye 100 is illuminated by at least two illuminators 301; estimating 502, based on the images, positions of reflections 112 of the illuminators 301 at the cornea 101 of the eye 100; estimating 503 an updated value of the cornea radius 105 based on the positions of the reflections 112 at the cornea 101 and positions of the illuminators 301 relative to the cameras 302 by which the images were captured. Figures 4 and 5, ¶¶ [0061-0064]); determine a three-dimensional surface based on the estimated centers of corneal curvature of the eye determined based on the multiple pairs of captured images of the eye (Klingstrom, The cornea 101 is often modeled as a spherical surface with a center of curvature 104 which is simply referred to as the cornea center 104. In such a spherical cornea model, the cornea 101 has a radius of curvature referred to as the radius 105 of the cornea 101 or simply the cornea radius 105. ¶ [0032]) (Klingstrom, estimating 503 an updated value of the cornea radius 105 based on the positions of the reflections 112 at the cornea 101 and positions of the illuminators 301 relative to the cameras 302 by which the images were captured. Figures 4 and 5, ¶ [0064]) (the spherical surface of Klingstrom being the three-dimensional surface as claimed); identify a center of curvature of the three-dimensional surface (Klingstrom, estimating 503 an updated value of the cornea radius 105 based on the positions of the reflections 112 at the cornea 101 and positions of the illuminators 301 relative to the cameras 302 by which the images were captured. Figures 4 and 5, ¶ [0064]) (Klingstrom, The cornea 101 is often modeled as a spherical surface with a center of curvature 104 which is simply referred to as the cornea center 104.
In such a spherical cornea model, the cornea 101 has a radius of curvature referred to as the radius 105 of the cornea 101 or simply the cornea radius 105. ¶ [0032]); and based at least partly on the center of curvature of the three-dimensional surface, determine an estimate of a center of rotation of the eye (Klingstrom, estimating 503 an updated value of the cornea radius 105 based on the positions of the reflections 112 at the cornea 101 and positions of the illuminators 301 relative to the cameras 302 by which the images were captured. Figures 4 and 5, ¶ [0064]) (Klingstrom, The cornea 101 is often modeled as a spherical surface with a center of curvature 104 which is simply referred to as the cornea center 104. In such a spherical cornea model, the cornea 101 has a radius of curvature referred to as the radius 105 of the cornea 101 or simply the cornea radius 105. ¶ [0032]).

As to claim 11, Klingstrom discloses the display system wherein generating the three-dimensional surface includes fitting the three-dimensional surface to the estimated centers of corneal curvature of the eye (Klingstrom, estimating 503 an updated value of the cornea radius 105 based on the positions of the reflections 112 at the cornea 101 and positions of the illuminators 301 relative to the cameras 302 by which the images were captured. Figures 4 and 5, ¶ [0064]) (Klingstrom, The cornea 101 is often modeled as a spherical surface with a center of curvature 104 which is simply referred to as the cornea center 104. In such a spherical cornea model, the cornea 101 has a radius of curvature referred to as the radius 105 of the cornea 101 or simply the cornea radius 105. ¶ [0032]).
As to claim 15, Klingstrom discloses the display system wherein the processing electronics is configured to use a render camera to render the virtual image content to be presented to the eye of the user, the render camera having a position determined based at least partly on the estimated center of rotation of the eye (Klingstrom, FIG. 3 is a schematic overview of an eye tracking system 300, according to an embodiment. The system 300 comprises one or more illuminators 301 for illuminating the eye 100 and one or more cameras 302 for capturing images of the eye 100 while the eye 100 looks at a display 303. The system 300 also comprises processing circuitry 304 configured to estimate where the eye 100 is located and/or where the eye 100 looking. The processing circuitry 304 may for example estimate a gaze direction (or gaze vector) of the eye 100 (corresponding to a direction of the visual axis 107), or a gaze point 111 of the eye 100 at the display 303 (as shown in FIG. 2). Figure 3, ¶ [0033]) (Klingstrom, The processing circuitry 304 is communicatively connected to the illuminators 301 and the cameras 302, for example via a wired or wireless connection. The processing circuitry 304 may also be communicatively connected to the display 303, for example for controlling (or triggering) the display 303 to show test stimulus points 305 for calibration of the eye tracking system 300. ¶ [0034]).

As to claim 17, Klingstrom discloses the display system wherein at least a portion of the display is transparent and disposed at a location in front of the eye while the user is wearing the head-mountable display such that the transparent portion transmits light from a portion of the environment in front of the user and the head-mountable display to provide a view of the portion of the environment (Klingstrom, The display 303 may for example be a liquid-crystal display (LCD) or a LED display. However, other types of displays may also be envisaged.
The display may 303 may for example be flat or curved. The display 303 may for example be a TV screen, a computer screen, or may be part of a head-mounted device (HMD) such as a virtual reality (VR) or augmented reality (AR) device. The display 303 may for example be placed in front of one of the user's eyes. In other words, separate displays 303 may be employed for the left and right eyes. Separate eye tracking equipment (such as illuminators 301 and cameras 302) may for example be employed for the left and right eyes. ¶ [0037]). The virtual and augmented reality displays are transparent in order to place virtual images on the real-world environment.

Inventorship

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over U.S. Pub. No. 2020/0257359 by Klingstrom (“Klingstrom”) in view of U.S. Pub. No. 2015/0346495 by Welch et al. (“Welch”).

As to claim 16, Klingstrom does not expressly disclose the display system wherein the display is configured to project light into the eye to display the virtual image content to the vision field of the user at different amounts of at least one of divergence and collimation to cause the displayed virtual image content to appear to originate from different depths at different times.

Welch teaches a virtual and augmented reality system wherein the display is configured to project light into the eye to display the virtual image content to the vision field of the user at different amounts of at least one of divergence and collimation to cause the displayed virtual image content to appear to originate from different depths at different times (Welch, a portion of the desired image, comprising an image of the sky at optical infinity may be injected at time 1 and the diffraction grating retaining collimation of light may be utilized; then an image of a closer tree branch may be injected at time 2 and a DOE configured to create a depth plane 10 meters away may be utilized; then an image of a pen may be injected at time 3 and a DOE configured to create a depth plane 1 meter away may be utilized. This kind of paradigm can be repeated in rapid time sequential fashion such that the eye/brain perceives the input to be all part of the same image, and such that the multiple image planes/slices are perceived almost simultaneously by the user. ¶ [0143]).

At the time before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Klingstrom’s augmented reality display to include Welch’s augmented reality display system because such a modification is taught, suggested, or motivated by the art.
More specifically, the motivation to modify Klingstrom to include Welch is expressly provided by Welch, stating that this kind of paradigm can be repeated in rapid time sequential fashion such that the eye/brain perceives the input to be all part of the same image, and such that the multiple image planes/slices are perceived almost simultaneously by the user. (Welch, ¶ [0143]). Therefore, it would have been obvious to one of ordinary skill in the art at the time before the effective filing date of the invention to modify Klingstrom’s augmented reality display to include Welch’s augmented reality display system with the motivation of presenting images at varied depths. The person of ordinary skill in the art would have recognized the benefit of providing improved image display to the user. Thus, Klingstrom, as modified by Welch, teaches the presented images appearing to originate at different depths.

Allowable Subject Matter

Claims 2-4, 7, and 12-14 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. The following is a statement of reasons for the indication of allowable subject matter:

As to claim 2, Klingstrom teaches a gaze vector for the user’s eyes (Klingstrom, The processing circuitry 304 may for example estimate a gaze direction (or gaze vector) of the eye 100 (corresponding to a direction of the visual axis 107), or a gaze point 111 of the eye 100 at the display 303 (as shown in FIG. 2). ¶ [0033]).
Klingstrom does not expressly disclose the method wherein determining the plurality of estimates of the corneal curvature of the user's eye comprises: determining a first vector directed toward the center of corneal curvature based on the locations of at least a portion of the plurality of light emitters and the location of a first camera of the plurality of eye tracking cameras; determining a second vector directed toward the center of corneal curvature based on locations of at least a portion of the plurality of light emitters and the location of a second camera of the plurality of eye tracking cameras; and determining a region of convergence between the first vector and second vector to determine an estimate of the center of corneal curvature of the user's eye. In addition, no other prior art was found which teaches, alone or in combination, the cited limitations.

As to dependent claims 3 and 4, these claims are objected to for the same reasons as claim 2, as they depend upon objected-to claim 2.

As to claim 7, Klingstrom teaches a gaze vector for the user’s eyes (Klingstrom, The processing circuitry 304 may for example estimate a gaze direction (or gaze vector) of the eye 100 (corresponding to a direction of the visual axis 107), or a gaze point 111 of the eye 100 at the display 303 (as shown in FIG. 2). ¶ [0033]). Klingstrom does not expressly disclose the method wherein determining the estimate of the center of rotation of the eye comprises: determining two or more vectors normal to the three-dimensional surface; and determining a region of convergence of the two or more vectors normal to the three-dimensional surface, wherein the region of convergence comprises the estimate of the center of rotation of the eye. In addition, no other prior art was found which teaches, alone or in combination, the cited limitations.
As to claim 12, Klingstrom teaches a gaze vector for the user’s eyes (Klingstrom, The processing circuitry 304 may for example estimate a gaze direction (or gaze vector) of the eye 100 (corresponding to a direction of the visual axis 107), or a gaze point 111 of the eye 100 at the display 303 (as shown in FIG. 2). ¶ [0033]). Klingstrom does not expressly disclose the display system wherein determining the estimate of the center of corneal curvature of the eye based at least in part on the respective pair of captured images comprises: determining a first vector along which the center of corneal curvature of the eye is estimated to be located based on a first image received from the first eye tracking camera; determining a second vector along which the center of corneal curvature of the eye is estimated to be located based on a second image received from the second eye tracking camera, the first and second images included in the respective pair of images; and determining a region of convergence between paths extending in the directions of the first vector and the second vector to obtain the estimate of a center of corneal curvature of the eye. In addition, no other prior art was found which teaches, alone or in combination, the cited limitations.

As to dependent claims 13 and 14, these claims are objected to for the same reasons as claim 12, as they depend upon objected-to claim 12.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. U.S. Pub. No. 2017/0263007 by Cavin et al. teaches an eye tracking system which captures images of a user’s cornea to determine the radius of the eye’s corneal sphere. U.S. Pub. No. 2017/0017299 by Biedert et al. teaches a gaze tracking based on detecting the glints or reflections on user’s eye to determine the cornea position.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRENT D CASTIAUX whose telephone number is (571)272-5143. The examiner can normally be reached Mon-Fri 7:30 AM-4:00 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chanh Nguyen, can be reached at (571)272-7772. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/BRENT D CASTIAUX/
Primary Examiner, Art Unit 2623
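For orientation, the anticipation theory above maps the claims onto a sphere-fit pipeline: estimate a center of corneal curvature per image pair, fit a surface to those centers, and take the fitted sphere's center as the eye's center of rotation. A minimal sketch of that kind of fit (an illustrative least-squares construction on made-up synthetic data, not code from Klingstrom or the application):

```python
import numpy as np

def fit_sphere(points: np.ndarray) -> tuple[np.ndarray, float]:
    """Least-squares sphere fit to an (N, 3) array of points.

    |p - c|^2 = r^2 expands to 2 p.c + (r^2 - |c|^2) = |p|^2,
    which is linear in the unknowns (c, r^2 - |c|^2)."""
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = x[:3]
    radius = float(np.sqrt(x[3] + center @ center))
    return center, radius

# Synthetic stand-ins for per-gaze estimates of the center of corneal
# curvature: points near a sphere of radius 4.7 about the origin,
# with the origin playing the role of the center of rotation.
rng = np.random.default_rng(0)
dirs = rng.normal(size=(50, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
estimates = 4.7 * dirs + rng.normal(scale=0.01, size=(50, 3))

center, radius = fit_sphere(estimates)
```

Claim 6's "fitting a sphere" limitation is this kind of construction; by contrast, the claims indicated allowable (2-4, 7, 12-14) triangulate via convergence of vectors or surface normals, which is the distinction the statement of reasons above credits.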

Prosecution Timeline

May 16, 2025
Application Filed
Feb 27, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596253
VEHICULAR DISPLAY DEVICE
2y 5m to grant • Granted Apr 07, 2026
Patent 12578574
HEAD MOUNTED DISPLAY, AND NEAR-TO-EYE DISPLAY METHOD
2y 5m to grant • Granted Mar 17, 2026
Patent 12579956
METHODS AND DEVICES FOR PERCEPTION-BASED RENDERING
2y 5m to grant • Granted Mar 17, 2026
Patent 12578910
INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND RECORDING MEDIUM
2y 5m to grant • Granted Mar 17, 2026
Patent 12572224
SYSTEM AND METHOD OF ASSEMBLING AN ELECTROMAGNETIC SPRING BACK RAPID CLICK MOUSE
2y 5m to grant • Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
83%
Grant Probability
99%
With Interview (+15.9%)
2y 1m
Median Time to Grant
Low
PTA Risk
Based on 523 resolved cases by this examiner. Grant probability derived from career allow rate.
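The footnote's derivation is simple enough to check: grant probability is the career allow rate, and the interview figure adds the quoted lift on top (a reading of the dashboard's stated method using the raw counts from the Examiner Intelligence section; capping at 100% is my assumption):

```python
granted, resolved = 434, 523       # career counts shown above
allow_rate = granted / resolved    # 0.8298..., displayed as "83%"

interview_lift = 0.159             # the quoted +15.9-point lift
with_interview = min(allow_rate + interview_lift, 1.0)  # 0.9888..., "99%"
```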
