Prosecution Insights
Last updated: April 19, 2026
Application No. 19/197,641

REALTIME BACKGROUND EYE-TRACKING CALIBRATION

Non-Final OA: §102, §103
Filed: May 02, 2025
Examiner: LAM, VINH TANG
Art Unit: 2628
Tech Center: 2600 — Communications
Assignee: Google LLC
OA Round: 1 (Non-Final)
Grant Probability: 72% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 8m
Grant Probability With Interview: 81%

Examiner Intelligence

Career Allow Rate: 72% (471 granted / 655 resolved), +9.9% vs TC avg — above average
Interview Lift: +9.2% across resolved cases with interview (moderate lift)
Typical Timeline: 2y 8m average prosecution; 25 applications currently pending
Career History: 680 total applications across all art units
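
These figures hang together arithmetically. A quick Python check, under the assumptions that the allow rate is simply granted over resolved and that the "+9.9% vs TC avg" delta is the examiner's rate minus the Tech Center average (the tool's exact methodology is not stated):

```python
# Sanity-check the examiner stats above; the additive reading of the TC delta
# is an assumption, not the vendor's documented method.
granted, resolved, total = 471, 655, 680

allow_rate = granted / resolved   # 0.719 -> displayed as 72%
tc_avg = allow_rate - 0.099       # implied Tech Center average, about 62%
pending = total - resolved        # 25, matching "25 currently pending"

print(f"allow rate {allow_rate:.1%}, implied TC avg {tc_avg:.1%}, pending {pending}")
```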

Statute-Specific Performance

§101: 2.0% (-38.0% vs TC avg)
§103: 47.4% (+7.4% vs TC avg)
§102: 31.5% (-8.5% vs TC avg)
§112: 14.3% (-25.7% vs TC avg)
Deltas are measured against the Tech Center average estimate. Based on career data from 655 resolved cases.
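
Worth noting: under the same additive reading of the deltas, every statute implies the same baseline. A short sketch (assuming delta = examiner rate minus TC average):

```python
# Recover the implied TC baseline from each (rate, delta) pair above,
# assuming delta = examiner_rate - tc_average.
rates = {"101": (2.0, -38.0), "103": (47.4, +7.4),
         "102": (31.5, -8.5), "112": (14.3, -25.7)}

for statute, (rate, delta) in rates.items():
    print(f"§{statute}: implied TC avg {rate - delta:.1f}%")
# All four come out to 40.0%, which suggests the chart compares each statute
# against a single overall TC figure rather than per-statute averages.
```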

Office Action

Rejections: §102, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

2. Claim(s) 1-2, 4, 7-8, 11, 13-14, and 16-20 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Bigham et al. (US Patent/PGPub. No. 11106280).

Regarding Claim 1, Bigham et al. teach a method (Col. 4, Ln. 34-40, FIG. 2, i.e. method for on-the-fly calibration of eye tracking) comprising: detecting a first user interaction (Col. 5, Ln. 64-67, Col. 6, Ln. 1-3, FIG. 2, i.e. user's eye looking) with a user interface (Col. 5, Ln. 64-67, Col. 6, Ln. 1-3, FIG. 2, i.e. the stimulus mark) of a wearable display device (Col. 3, Ln. 57-64, FIG. 1, i.e. display 180 may be a … head mounted display) by a user (Col. 5, Ln. 23-26, FIG. 1, i.e. user); associating the first user interaction (i.e. please see above citation(s)) with information regarding a gaze direction (Col. 4, Ln. 50-58, FIG. 2, i.e. eye vectors and … a position and orientation of each eye) of the user to determine a gaze-target pair (Col. 2, Ln. 46-48, FIG. 1-2, i.e. calibration pair may be obtained, which includes an uncalibrated gaze point and a stimulus mark location associated with the gaze); and prior to detecting a second user interaction (Col. 6, Ln. 25-28, FIG. 1-2, i.e. "block 235 and … as described at block 200", meaning the process repeats a second time) with the user interface (i.e. please see above citation(s)), updating a calibration (Col. 6, Ln. 55-60, FIG. 1-2, i.e. block 260 the current calibration pair is incorporated into the calibration model) of the wearable display device based on the determined gaze-target pair (i.e. please see above citation(s)), the calibration correlating gaze directional data (Col. 4, Ln. 50-58, FIG. 2, i.e. eye vectors and … a position and orientation of each eye) with one or more calibrated gaze positions (Col. 6, Ln. 29-33, FIG. 2, i.e. 240 where the calibration module 155 obtains a calibration pair of an uncalibrated gaze point at a screen location of the stimulus mark) of the wearable display device (i.e. please see above citation(s)).

Regarding Claim 2, Bigham et al. teach the method of claim 1, further comprising generating the gaze directional data for a defined time period (Col. 5, Ln. 8-14, FIG. 1-2, i.e. dwell time) preceding the first user interaction by filtering raw gaze data (Col. 5, Ln. 8-14, FIG. 1-2, i.e. Kalman filter may be applied … Once the user is on target, selection of the target may be confirmed) from the defined time period using one or more of a minimum fixation duration (Col. 6, Ln. 8-14, FIG. 1-2, i.e. "230 … selection has not occurred, for example if the user shifts gaze away from the target area"; Col. 5, Ln. 8-14, FIG. 1-2, i.e. "dwell time"; all of which amount to a minimum dwell time that fails if the user does not gaze at a location for a predetermined period) or a spatial dispersion threshold.

Regarding Claim 4, Bigham et al. teach the method of claim 1, wherein updating the calibration comprises adjusting coefficients of a calibration matrix (Col. 4, Ln. 67, Col. 5, Ln. 1-4, FIG. 1-2, i.e. calculate a marker for the transformation matrix, with coefficients that best fit the points) based on a difference between a gaze direction (Col. 4, Ln. 50-58, FIG. 2, i.e. eye vectors and … a position and orientation of each eye) of a respective gaze-target pair (Col. 7, Ln. 22-31, FIG. 4, i.e. 415L … (for example, for a left eye), … 415R represents the … (for example, for a right eye)) and a user interface target (Col. 7, Ln. 19-24, FIG. 4, i.e. user interface 400) of the respective gaze-target pair (i.e. please see above citation(s)).

Regarding Claim 7, Bigham et al. teach the method of claim 1, further comprising determining to update the calibration (i.e. please see above citation(s)) based at least in part on one or more criteria selected from a group that includes a distance between the gaze direction and a target of the first user interaction (Col. 7, Ln. 40-45, FIG. 4, i.e. location of the screen gaze estimation is within a predetermined distance threshold 410 of the stimulus mark), a stability metric of the gaze direction, or occurrence of a correction event.

Regarding Claim 8, Bigham et al. teach the method of claim 1, further comprising evaluating cumulative calibration accuracy (Col. 2, Ln. 53-58, FIG. 1-2, i.e. calibration pair renders the calibration model more accurate than the calibration model) over a plurality of updates responsive to multiple determined gaze-target pairs (Col. 6, Ln. 25-28, FIG. 1-2, i.e. "block 235 and … as described at block 200", meaning the process repeats to refine and update the calibration pairs), and suspending one or more additional updates of the calibration based at least in part on the evaluating (Col. 6, Ln. 51-55, FIG. 2, i.e. block 255 and the current calibration pair is discarded).

Regarding Claim 11, Bigham et al. teach the method of claim 1, wherein the first user interaction comprises a selection of a graphical user interface element (Col. 6, Ln. 4-12, FIG. 2, i.e. selection of the user interface component associated with the stimulus mark).

Regarding Claim 13, Bigham et al. teach a wearable display device (Col. 3, Ln. 57-64, FIG. 1, i.e. display 180 may be a … head mounted display), comprising: one or more sensors (Col. 3, Ln. 11-56, FIG. 1, i.e. sensors 175) to provide gaze directional data (Col. 4, Ln. 50-58, FIG. 2, i.e. eye vectors and … a position and orientation of each eye); a memory (Col. 4, Ln. 9-16, FIG. 1, i.e. memory 140) to store a calibration matrix (Col. 4, Ln. 58-62, FIG. 1, i.e. transform matrix), wherein the calibration matrix correlates the gaze directional data (i.e. please see above citation(s)) with one or more calibrated gaze positions (Col. 4, Ln. 50-58, FIG. 2, i.e. eye vectors and … a position and orientation of each eye) of the wearable display device (i.e. please see above citation(s)); and one or more processors (Col. 4, Ln. 4-10, FIG. 1) to: detect a first user interaction (Col. 5, Ln. 64-67, Col. 6, Ln. 1-3, FIG. 2, i.e. user's eye looking) with a user interface (Col. 5, Ln. 64-67, Col. 6, Ln. 1-3, FIG. 2, i.e. the stimulus mark) of the wearable display device by a user (i.e. please see above citation(s)); associate the first user interaction (i.e. please see above citation(s)) with information regarding a gaze direction (Col. 4, Ln. 50-58, FIG. 2, i.e. eye vectors and … a position and orientation of each eye) of the user to determine a gaze-target pair (Col. 2, Ln. 46-48, FIG. 1-2, i.e. calibration pair may be obtained, which includes an uncalibrated gaze point and a stimulus mark location associated with the gaze); and prior to detection of a second user interaction (Col. 6, Ln. 25-28, FIG. 1-2, i.e. "block 235 and … as described at block 200", meaning the process repeats a second time) with the user interface (i.e. please see above citation(s)), update the calibration matrix (Col. 6, Ln. 55-60, FIG. 1-2, i.e. block 260 the current calibration pair is incorporated into the calibration model) based on the determined gaze-target pair (i.e. please see above citation(s)).

Regarding Claim 14, Bigham et al. teach the wearable display device of claim 13, wherein the one or more processors are further to filter the gaze directional data using one or more of a minimum fixation duration (Col. 6, Ln. 8-14, FIG. 1-2, i.e. "230 … selection has not occurred, for example if the user shifts gaze away from the target area"; Col. 5, Ln. 8-14, FIG. 1-2, i.e. "dwell time"; all of which amount to a minimum dwell time that fails if the user does not gaze at a location for a predetermined period), a spatial dispersion threshold, or a defined time period preceding the first user interaction.

Regarding Claim 16, Bigham et al. teach the wearable display device of claim 13, wherein to update the calibration matrix (i.e. please see above citation(s)) comprises adjusting coefficients of the calibration matrix (Col. 4, Ln. 67, Col. 5, Ln. 1-4, FIG. 1-2, i.e. calculate a marker for the transformation matrix, with coefficients that best fit the points) based on a difference between a gaze direction (Col. 4, Ln. 50-58, FIG. 2, i.e. eye vectors and … a position and orientation of each eye) of the identified gaze-target pair (Col. 7, Ln. 22-31, FIG. 4, i.e. 415L … (for example, for a left eye), … 415R represents the … (for example, for a right eye)) and a target (Col. 7, Ln. 19-24, FIG. 4, i.e. user interface 400) of the first user interaction for the identified gaze-target pair (i.e. please see above citation(s)).

Regarding Claim 17, Bigham et al. teach the wearable display device of claim 13, wherein the one or more processors are further to determine to update the calibration matrix (i.e. please see above citation(s)) based at least in part on one or more criteria selected from a group that includes a distance between the gaze direction and the user interface target (Col. 7, Ln. 40-45, FIG. 4, i.e. location of the screen gaze estimation is within a predetermined distance threshold 410 of the stimulus mark), a stability metric of the gaze direction, or occurrence of a correction event.

Regarding Claim 18, Bigham et al. teach the wearable display device of claim 13, wherein the one or more processors are further to evaluate cumulative calibration accuracy (Col. 2, Ln. 53-58, FIG. 1-2, i.e. calibration pair renders the calibration model more accurate than the calibration model) over a plurality of calibration matrix updates responsive to multiple determined gaze-target pairs (Col. 6, Ln. 25-28, FIG. 1-2, i.e. "block 235 and … as described at block 200", meaning the process repeats to refine and update the calibration pairs), and to suspend one or more additional updates of the calibration matrix based at least in part on the evaluation (Col. 6, Ln. 51-55, FIG. 2, i.e. block 255 and the current calibration pair is discarded).

Regarding Claim 19, Bigham et al. teach a non-transitory computer-readable medium (Col. 4, Ln. 9-16, FIG. 1, i.e. memory 140) storing instructions (Col. 4, Ln. 9-16, FIG. 1, i.e. store various programming modules) that, when executed by one or more processors (Col. 4, Ln. 4-10, FIG. 1), manipulate the one or more processors (i.e. please see above citation(s)) to: detect a first user interaction (Col. 5, Ln. 64-67, Col. 6, Ln. 1-3, FIG. 2, i.e. user's eye looking) with a user interface (Col. 5, Ln. 64-67, Col. 6, Ln. 1-3, FIG. 2, i.e. the stimulus mark) of a wearable display device (Col. 3, Ln. 57-64, FIG. 1, i.e. display 180 may be a … head mounted display) by a user (Col. 5, Ln. 23-26, FIG. 1, i.e. user); associate the first user interaction (i.e. please see above citation(s)) with information regarding a gaze direction (Col. 4, Ln. 50-58, FIG. 2, i.e. eye vectors and … a position and orientation of each eye) of the user to determine a gaze-target pair (Col. 2, Ln. 46-48, FIG. 1-2, i.e. calibration pair may be obtained, which includes an uncalibrated gaze point and a stimulus mark location associated with the gaze); and prior to detecting a second user interaction (Col. 6, Ln. 25-28, FIG. 1-2, i.e. "block 235 and … as described at block 200", meaning the process repeats a second time) with the user interface, update a calibration (Col. 6, Ln. 55-60, FIG. 1-2, i.e. block 260 the current calibration pair is incorporated into the calibration model) of the wearable display device based on the determined gaze-target pair (i.e. please see above citation(s)), the calibration correlating gaze directional data (Col. 4, Ln. 50-58, FIG. 2, i.e. eye vectors and … a position and orientation of each eye) with one or more calibrated gaze positions (Col. 6, Ln. 29-33, FIG. 2, i.e. 240 where the calibration module 155 obtains a calibration pair of an uncalibrated gaze point at a screen location of the stimulus mark) of the wearable display device (i.e. please see above citation(s)).

Regarding Claim 20, Bigham et al. teach the non-transitory computer-readable medium of claim 19, wherein the instructions further manipulate the one or more processors to generate the gaze directional data for a defined time period (Col. 5, Ln. 8-14, FIG. 1-2, i.e. dwell time) preceding (Col. 5, Ln. 21-29, FIG. 1-2, i.e. "block 205" before block 215) the first user interaction by filtering raw gaze data from the defined time period using one or more of a minimum fixation duration (Col. 6, Ln. 8-14, FIG. 1-2, i.e. "230 … selection has not occurred, for example if the user shifts gaze away from the target area"; Col. 5, Ln. 8-14, FIG. 1-2, i.e. "dwell time"; all of which amount to a minimum dwell time that fails if the user does not gaze at a location for a predetermined period) or a spatial dispersion threshold.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

3. Claim(s) 12 is/are rejected under 35 U.S.C. 103 as being unpatentable over Bigham et al. (US Patent/PGPub. No. 11106280) in view of Hudman (US Patent/PGPub. No. 20240353921).

Regarding Claim 12, Bigham et al. teach the method of claim 1. However, Bigham et al. do not explicitly teach that the method is performed without providing an indication of gaze calibration to the user. In the same field of endeavor, Hudman teaches that the method ([0066], FIG. 1, i.e. methods for tracking the eyes) is performed without providing an indication of gaze calibration to the user ([0013], FIG. 1, i.e. user interface elements are not indicated to the user as being calibration targets). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Bigham et al.'s method of calibrating a head-mounted device using a calibration pair of a gaze direction and a user interface location with Hudman's method of calibrating a head-mounted device without providing an indication of calibration to the user, to effectively enhance the user's experience with the device by tracking the user's eyes without the user knowing, while providing seamless activity and enjoyment (Hudman's [0013]).

Allowable Subject Matter

4. Claim(s) 3, 5-6, 9-10, and 15 is/are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

5. The following is an examiner's statement of reasons for allowance:

Bigham et al. (US Patent/PGPub. No. 11106280) teach that calibration of eye tracking is improved by collecting additional calibration pairs while a user is using apps with eye tracking. A user input component is presented on a display of an electronic device, a dwelling action for the user input component is detected, and in response to detecting the dwelling action, a calibration pair is obtained comprising an uncalibrated gaze point and a screen location of the user input component, wherein the uncalibrated gaze point is determined based on an eye pose during the dwelling action. A screen gaze estimation is determined based on the uncalibrated gaze point, and in response to determining that the calibration pair is a valid calibration pair, a calibration model is trained using the calibration pair.

Hudman (US Patent/PGPub. No. 20240353921) teaches calibrating an eye tracking system based at least in part on a user interface element(s) presented on a display panel(s) of a head-mounted display (HMD). A processor(s) may present a user interface element on a display panel(s) of an HMD, and may receive, from a handheld controller, user input data indicating that a user wearing the HMD has provided user input associated with the user interface element via the handheld controller. In response to the receiving of the user input data from the handheld controller, the processor(s) may receive, from an eye tracking sensor(s) of the HMD, eye data associated with one or more eyes of the user, and may calibrate the eye tracking system based at least in part on the eye data and location data indicating a location on the display panel(s) where the user interface element is presented.

The subject matter of the independent claims could either not be found or was not suggested in the prior art of record. The subject matter not found was a display device including "…associating the first user interaction with the information regarding the gaze direction comprises identifying a centroid for one or more samples in the gaze directional data that occur during the defined time period." (Claim 3; Claim 15 is similar), "…dynamically adjusting a forgetting factor based on a magnitude of the difference between the gaze direction and the user interface target." (Claim 5), "…adjusting the coefficients of the calibration matrix comprises applying a recursive least squares algorithm." (Claim 6), "…prioritizing one or more additional updates of the calibration corresponding to at least one determined gaze-target pair in a first region of the display device having fewer determined gaze-target pairs than one or more other regions of the display device." (Claim 9), in combination with the other elements (or steps) of the device or apparatus and method recited in the claims.

Any comments considered necessary by applicant must be submitted no later than the payment of the issue fee and, to avoid processing delays, should preferably accompany the issue fee. Such submissions should be clearly labeled "Comments on Statement of Reasons for Allowance."

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to VINH TANG LAM whose telephone number is (571) 270-3704. The examiner can normally be reached Monday to Friday, 8:00 AM to 5:00 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Nitin K Patel, can be reached at (571) 272-7677. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/VINH T LAM/
Primary Examiner, Art Unit 2628
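
The allowable dependent claims sketch a concrete update pipeline: filter the raw gaze samples preceding an interaction (minimum fixation duration, spatial dispersion threshold) down to a centroid (claims 3 and 15), then adjust the calibration matrix coefficients by recursive least squares with a dynamically adjusted forgetting factor (claims 5-6). A minimal Python sketch of that general technique follows; every name, dimension, and threshold here is an illustrative assumption, not the application's actual disclosure.

```python
# Illustrative-only sketch of the technique named in allowable claims 3, 5-6:
# fixation filtering to a centroid, then a recursive-least-squares (RLS) update
# of a calibration matrix with a dynamic forgetting factor.
import numpy as np

class GazeCalibrator:
    def __init__(self, dim_in=3, dim_out=2, base_lam=0.99):
        self.W = np.zeros((dim_out, dim_in))  # maps gaze direction -> screen position
        self.P = np.eye(dim_in) * 1e3         # RLS inverse-correlation estimate
        self.base_lam = base_lam              # nominal forgetting factor

    def fixation_centroid(self, samples, min_count=10, max_dispersion=0.05):
        """Reduce pre-interaction gaze samples to a centroid (cf. claims 2-3):
        require enough samples (a minimum-fixation-duration proxy) and a
        bounded spatial dispersion before trusting the data."""
        pts = np.asarray(samples, dtype=float)
        if len(pts) < min_count or np.max(pts.std(axis=0)) > max_dispersion:
            return None                       # fixation too short or too scattered
        return pts.mean(axis=0)               # centroid of the surviving samples

    def update(self, gaze_dir, target_xy, err_gate=0.2):
        """One RLS step per gaze-target pair, run before the next interaction."""
        x = np.asarray(gaze_dir, dtype=float)
        y = np.asarray(target_xy, dtype=float)
        err = y - self.W @ x                  # difference between gaze estimate and target
        # Shrink the forgetting factor when the error is large so the model
        # adapts faster -- a naive reading of claim 5's dynamic adjustment.
        lam = self.base_lam if np.linalg.norm(err) < err_gate else 0.95
        Px = self.P @ x
        k = Px / (lam + x @ Px)               # RLS gain vector
        self.W += np.outer(err, k)            # adjust calibration coefficients
        self.P = (self.P - np.outer(k, Px)) / lam
```

On each detected interaction, the trailing gaze buffer would go through fixation_centroid and, if a stable point survives, feed update together with the target's location; whether the application implements anything resembling this is not established by the excerpt above.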

Prosecution Timeline

May 02, 2025
Application Filed
Jan 27, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596512
CONTENT RENDERING METHOD AND APPARATUS, READABLE MEDIUM, AND ELECTRONIC DEVICE
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12592051
OPTIMIZATION OF EYE CAPTURE CONDITIONS FOR EACH USER AND USE CASE
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12579446
MACHINE-LEARNING TECHNIQUES FOR RISK ASSESSMENT BASED ON CLUSTERING
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12581829
DISPLAY DEVICE
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12566525
TOUCH DEVICE
Granted Mar 03, 2026 (2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 72%
With Interview: 81% (+9.2%)
Median Time to Grant: 2y 8m
PTA Risk: Low
Based on 655 resolved cases by this examiner. Grant probability derived from career allow rate.
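
The with-interview figure follows from the numbers above, assuming the lift adds in percentage points (a sketch, not the tool's documented formula):

```python
# Projection arithmetic, assuming an additive interview lift.
base, lift = 0.72, 0.092                      # career allow rate, interview lift
print(f"with interview: {base + lift:.0%}")   # 81%
```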
