Prosecution Insights
Last updated: April 19, 2026
Application No. 19/027,019

System For Tracking A Gaze Direction Of A User

Final Rejection §103
Filed: Jan 17, 2025
Examiner: BLANCHA, JONATHAN M
Art Unit: 2623
Tech Center: 2600 — Communications
Assignee: Sony Interactive Entertainment Inc.
OA Round: 2 (Final)

Grant Probability: 62% (Moderate)
Expected OA Rounds: 3-4
Median Time to Grant: 2y 7m
Grant Probability With Interview: 71%

Examiner Intelligence

Career Allow Rate: 62% (408 granted / 661 resolved; at TC average)
Interview Lift: +9.4% in resolved cases with interview (moderate lift)
Avg Prosecution: 2y 7m (typical timeline)
Career History: 678 total applications across all art units; 17 currently pending
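The headline figures above are simple arithmetic on the career counts (the page itself notes that grant probability is derived from the career allow rate). A minimal sketch, illustrative only and not the tool's actual model:

```python
# Illustrative reconstruction of the dashboard's headline figures.
# Assumptions: grant probability = career allow rate, rounded to a whole
# percent; the "with interview" figure adds the +9.4 point interview lift.

granted, resolved = 408, 661

allow_rate_pct = 100 * granted / resolved      # 61.72...%
interview_lift = 9.4                           # percentage points

grant_probability = round(allow_rate_pct)                   # 62
with_interview = round(allow_rate_pct + interview_lift)     # 71
```

Both rounded values match the cards shown above (62% and 71%).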

Statute-Specific Performance

§101: 0.3% (-39.7% vs TC avg)
§103: 69.4% (+29.4% vs TC avg)
§102: 23.2% (-16.8% vs TC avg)
§112: 4.9% (-35.1% vs TC avg)

Tech Center averages are estimates • Based on career data from 661 resolved cases
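The "vs TC avg" deltas imply a Tech Center baseline for each statute (examiner rate minus delta). A small sketch, illustrative arithmetic only, shows that the displayed figures all resolve to the same 40.0% baseline:

```python
# Recover the implied Tech Center baseline from the displayed numbers.
# (rate, delta) pairs copied from the statute table above.
stats = {
    "101": (0.3, -39.7),
    "103": (69.4, +29.4),
    "102": (23.2, -16.8),
    "112": (4.9, -35.1),
}

# baseline = examiner rate - delta, rounded to one decimal place
tc_avg = {k: round(rate - delta, 1) for k, (rate, delta) in stats.items()}
# every statute resolves to 40.0
```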

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The amendment filed on 12-02-25 has been entered and fully considered by the examiner.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 14 and 21-39 are rejected under 35 U.S.C. 103 as being unpatentable over Heideklang et al. (US 2024/0104967) in view of Chen et al. (US 2012/0281181).

Regarding claim 14, Heideklang (Fig. 2 and 5) discloses a computer implemented method comprising: receiving, from a photosensor (called a “camera” in [0026]), data encoding a representation of light that was reflected from an eye of a user (“a portion of the IR light is reflected off the eye and captured by an eye tracking camera” discussed in [0026]); determining whether the representation of the light that was reflected from the eye of the user indicates that the user is wearing a contact lens (“the system may detect that the user is wearing the device differently, or is wearing contacts” discussed in [0035]); and selecting, from among multiple gaze tracking algorithms (“two or more gaze correction functions may be generated and stored” discussed in [0036]), a particular gaze tracking algorithm to apply to subsequently received data from the photosensor, based at least on determining whether the representation of the light that was reflected from the eye of the user indicates that the user is wearing a contact lens (“an appropriate one of the stored gaze correction functions may be selected, for example based on… detection of a contact lens” discussed in [0036]).

However, Heideklang fails to teach or suggest a “contact lens that includes one or more reflective elements.” Chen (Fig. 1-3, 6, and 11-14) discloses a computer implemented method comprising: receiving, from a photosensor (1140), data encoding a representation of light that was reflected from an eye of a user (“detector of IR or NIR light (e.g., sensor 1140) may gather a plurality of incoherent and/or coherence rays 1150 reflected from reflective elements 1110” discussed in [0079]); a contact lens that includes one or more reflective elements (the reflective patterns seen in Fig. 12-14, and discussed in [0080], e.g., to “distinguish between individual eyes or individual users” as discussed in [0081]); and selecting, from among multiple gaze tracking algorithms (the different reflection patterns each correspond to different algorithms, see “reflective areas can be varied based on detection routines” discussed in [0081]), a particular gaze tracking algorithm to apply to subsequently received data from the photosensor (e.g., the tracking algorithm specific to the user, based on the specific reflection patterns of their contact lenses), based at least on determining whether the representation of the light that was reflected from the eye of the user (“contact lenses associated with a plurality of users' eyes are tracked using one or more sensors” as discussed in [0061]; however, different users are tracked by different reflection patterns, see “contact lenses of different users may further reflect different patterns” discussed in [0061], and so the detected reflected light for the user will have the corresponding detection routine selected).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Heideklang so the contact lens includes one or more reflective elements as taught by Chen, because this provides that “eye tracking accuracy is improved” (see [0081]).

Regarding claim 27, Heideklang (Fig. 2 and 5) discloses a system comprising: one or more processors (“various types of processors” discussed in [0041]), and one or more non-transitory computer readable media that store computer instructions (“memory storing program instructions” discussed in [0021]) which, when executed by the one or more processors (the processor “configured to execute instructions” discussed in [0048]), cause the one or more processors to perform operations (the instructions are “executable to implement the operation” discussed in [0021]), comprising: receiving, from a photosensor (called a “camera” in [0026]), data encoding a representation of light that was reflected from an eye of a user (“a portion of the IR light is reflected off the eye and captured by an eye tracking camera” discussed in [0026]); determining whether the representation of the light that was reflected from the eye of the user indicates that the user is wearing a contact lens (“the system may detect that the user is wearing the device differently, or is wearing contacts” discussed in [0035]); and selecting, from among multiple gaze tracking algorithms (“two or more gaze correction functions may be generated and stored” discussed in [0036]), a particular gaze tracking algorithm to apply to subsequently received data from the photosensor, based at least on determining whether the representation of the light that was reflected from the eye of the user indicates that the user is wearing a contact lens (“an appropriate one of the stored gaze correction functions may be selected, for example based on… detection of a contact lens” discussed in [0036]).

However, Heideklang fails to teach or suggest a “contact lens that includes one or more reflective elements.” Chen (Fig. 1-3, 6, 11-14, and 21) discloses a system comprising: one or more processors (2105), and one or more non-transitory computer readable media (2115) that store computer instructions (2140) which, when executed by the one or more processors (“central processing units (CPUs) 2105 can include hardware and/or software elements configured for executing logic or program code” discussed in [0101]), cause the one or more processors to perform operations (“one or more applications configured to execute, perform, or otherwise implement techniques disclosed herein. These applications may be embodied as contact lens eye tracking/controlling data and program code 2140” discussed in [0105]), comprising: receiving, from a photosensor (1140), data encoding a representation of light that was reflected from an eye of a user (“detector of IR or NIR light (e.g., sensor 1140) may gather a plurality of incoherent and/or coherence rays 1150 reflected from reflective elements 1110” discussed in [0079]); a contact lens that includes one or more reflective elements (the reflective patterns seen in Fig. 12-14, and discussed in [0080], e.g., to “distinguish between individual eyes or individual users” as discussed in [0081]); and selecting, from among multiple gaze tracking algorithms (the different reflection patterns each correspond to different algorithms, see “reflective areas can be varied based on detection routines” discussed in [0081]), a particular gaze tracking algorithm to apply to subsequently received data from the photosensor (e.g., the tracking algorithm specific to the user, based on the specific reflection patterns of their contact lenses), based at least on determining whether the representation of the light that was reflected from the eye of the user (“contact lenses associated with a plurality of users' eyes are tracked using one or more sensors” as discussed in [0061]; however, different users are tracked by different reflection patterns, see “contact lenses of different users may further reflect different patterns” discussed in [0061], and so the detected reflected light for the user will have the corresponding detection routine selected).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Heideklang so the contact lens includes one or more reflective elements as taught by Chen, because this provides that “eye tracking accuracy is improved” (see [0081]).

Regarding claim 34, Heideklang (Fig. 2 and 5) discloses a non-transitory computer readable medium that stores computer instructions (“memory storing program instructions” discussed in [0021]) which, when executed by the one or more processors (the processor “configured to execute instructions” discussed in [0048]), cause the one or more processors to perform operations (the instructions are “executable to implement the operation” discussed in [0021]), comprising: receiving, from a photosensor (called a “camera” in [0026]), data encoding a representation of light that was reflected from an eye of a user (“a portion of the IR light is reflected off the eye and captured by an eye tracking camera” discussed in [0026]); determining whether the representation of the light that was reflected from the eye of the user indicates that the user is wearing a contact lens (“the system may detect that the user is wearing the device differently, or is wearing contacts” discussed in [0035]); and selecting, from among multiple gaze tracking algorithms (“two or more gaze correction functions may be generated and stored” discussed in [0036]), a particular gaze tracking algorithm to apply to subsequently received data from the photosensor, based at least on determining whether the representation of the light that was reflected from the eye of the user indicates that the user is wearing a contact lens (“an appropriate one of the stored gaze correction functions may be selected, for example based on… detection of a contact lens” discussed in [0036]).

However, Heideklang fails to teach or suggest a “contact lens that includes one or more reflective elements.” Chen (Fig. 1-3, 6, 11-14, and 21) discloses a non-transitory computer readable medium (2115) that stores computer instructions (2140) which, when executed by the one or more processors (“central processing units (CPUs) 2105 can include hardware and/or software elements configured for executing logic or program code” discussed in [0101]), cause the one or more processors to perform operations (“one or more applications configured to execute, perform, or otherwise implement techniques disclosed herein. These applications may be embodied as contact lens eye tracking/controlling data and program code 2140” discussed in [0105]), comprising: receiving, from a photosensor (1140), data encoding a representation of light that was reflected from an eye of a user (“detector of IR or NIR light (e.g., sensor 1140) may gather a plurality of incoherent and/or coherence rays 1150 reflected from reflective elements 1110” discussed in [0079]); a contact lens that includes one or more reflective elements (the reflective patterns seen in Fig. 12-14, and discussed in [0080], e.g., to “distinguish between individual eyes or individual users” as discussed in [0081]); and selecting, from among multiple gaze tracking algorithms (the different reflection patterns each correspond to different algorithms, see “reflective areas can be varied based on detection routines” discussed in [0081]), a particular gaze tracking algorithm to apply to subsequently received data from the photosensor (e.g., the tracking algorithm specific to the user, based on the specific reflection patterns of their contact lenses), based at least on determining whether the representation of the light that was reflected from the eye of the user (“contact lenses associated with a plurality of users' eyes are tracked using one or more sensors” as discussed in [0061]; however, different users are tracked by different reflection patterns, see “contact lenses of different users may further reflect different patterns” discussed in [0061], and so the detected reflected light for the user will have the corresponding detection routine selected).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Heideklang so the contact lens includes one or more reflective elements as taught by Chen, because this provides that “eye tracking accuracy is improved” (see [0081]).

Regarding claim 21, Heideklang and Chen disclose a method as discussed above, and Chen further discloses selecting a particular gaze tracking algorithm that is associated with contact lenses with reflective elements (e.g., tracking each user differently based on the specific reflection pattern of the contact lens, as discussed in [0061]) based on determining that the representation of the light that was reflected from the eye of the user indicates that the user is wearing a contact lens that includes one or more reflective elements (e.g., as seen in Fig. 2 and 3, the reflected light from the contact lenses, 310 and 320, is “discerned by a computer” as discussed in [0047]). It would have been obvious to one of ordinary skill in the art to combine Heideklang and Chen for the same reasons as discussed above.

Regarding claim 22, Heideklang and Chen disclose a method as discussed above, and Heideklang further discloses selecting a pupil center corneal reflection (PCCR) gaze tracking algorithm (“method of gaze tracking may be referred to as PCCR” discussed in [0026]) based on determining that the representation of the light that was reflected from the eye of the user indicates that the user is not wearing a contact lens that includes one or more reflective elements (e.g., when a “portion of the IR light is reflected off the eye and captured by an eye tracking camera” as discussed in [0026], which is not possible with the contact lens in the way).

Regarding claim 23, Heideklang and Chen disclose a method as discussed above, and Chen further discloses wherein determining whether the representation of the light that was reflected from the eye of the user indicates that the user is wearing a contact lens that includes one or more reflective elements comprises determining whether the representation of the light indicates a particular pattern of reflective elements (as discussed above, differentiating each user can be accomplished by detecting different reflection patterns, e.g., see “contact lenses of different users may further reflect different patterns” as discussed in [0061]). It would have been obvious to one of ordinary skill in the art to combine Heideklang and Chen for the same reasons as discussed above.
Regarding claim 24, Heideklang and Chen disclose a method as discussed above, and Chen further discloses wherein determining whether the representation of the light that was reflected from the eye of the user indicates that the user is wearing a contact lens that includes one or more reflective elements comprises determining whether the representation of the light indicates a characteristic shape (as discussed above, differentiating each user can be accomplished by detecting different reflection patterns, e.g., see “contact lenses of different users may further reflect different patterns” as discussed in [0061], which includes specific shapes, such as circles or squares, as seen in Fig. 12-14). It would have been obvious to one of ordinary skill in the art to combine Heideklang and Chen for the same reasons as discussed above.

Regarding claim 25, Heideklang and Chen disclose a method as discussed above, and Heideklang further discloses wherein the photosensor is mounted on a head-mounted display (“cameras 1020 and 1050 may be integrated in or attached to the frame 1010” of the head-mounted display “HMD 1000” as discussed in [0040]).

Regarding claim 26, Heideklang and Chen disclose a method as discussed above, and Heideklang further discloses wherein the particular gaze tracking algorithm is selected (e.g., by implementing the method of Fig. 5) by one or more processors on a head-mounted display (“controller 1060 for the XR system may be implemented in the HMD 1000,” and “controller 1060 may include one or more of various types of processors” discussed in [0041]).

Claims 28-33 are directed to a system instead of a computer implemented method, and are dependent upon claim 27 instead of claim 14, but otherwise recite limitations substantially identical to those of claims 21-26, and so are rejected for the same reasons as discussed above.
Claims 35-39 are directed to a non-transitory computer readable medium instead of a computer implemented method, and are dependent upon claim 34 instead of claim 14, but otherwise recite limitations substantially identical to those of claims 21-25, and so are rejected for the same reasons as discussed above.

Response to Arguments

Applicant’s arguments with respect to claim 14 have been considered but are moot in view of the new grounds of rejection. In view of the amendments, the reference of Heideklang has been added for new grounds of rejection.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JONATHAN M BLANCHA whose telephone number is (571) 270-5890. The examiner can normally be reached Monday to Friday, 9-5. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chanh Nguyen, can be reached at 571-272-7772. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JONATHAN M BLANCHA/
Primary Examiner, Art Unit 2623
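The rejected independent claims all turn on the same logic: detect whether the reflected light indicates a reflective-element contact lens, then pick a gaze tracking algorithm accordingly (with a PCCR fallback per claim 22). A hypothetical sketch of that selection step, with all names invented for illustration and no code taken from the application or the cited references:

```python
# Hypothetical illustration of the claim-14 selection logic.
# `indicates_reflective_contact_lens` and the frame format are assumptions.

def indicates_reflective_contact_lens(frame):
    """Placeholder detection: e.g. match a known pattern of bright spots
    produced by reflective elements in the lens (cf. Chen's patterns)."""
    return frame.get("pattern") == "reflective_elements"

def select_gaze_algorithm(reflection_frame, algorithms):
    """Pick a gaze tracking algorithm based on the reflected-light data.

    reflection_frame: data from the photosensor encoding reflected light.
    algorithms: mapping from condition to gaze tracking algorithm.
    """
    if indicates_reflective_contact_lens(reflection_frame):
        # Reflective elements produce a distinctive reflection, so use
        # the algorithm tuned to lens patterns.
        return algorithms["reflective_lens"]
    # Otherwise fall back to a standard PCCR-style algorithm (claim 22).
    return algorithms["pccr"]

algos = {"reflective_lens": "lens-pattern tracker", "pccr": "PCCR tracker"}
print(select_gaze_algorithm({"pattern": "reflective_elements"}, algos))
# -> lens-pattern tracker
```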

Prosecution Timeline

Jan 17, 2025 • Application Filed
Sep 05, 2025 • Non-Final Rejection (§103)
Dec 02, 2025 • Response Filed
Feb 05, 2026 • Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603033 • SCANNING IMAGE DATA TO AN ARRAY OF PIXELS AT AN INTERMEDIATE SCAN RATE DURING A TRANSITION BETWEEN DIFFERENT REFRESH RATES
2y 5m to grant • Granted Apr 14, 2026

Patent 12603060 • Display Device
2y 5m to grant • Granted Apr 14, 2026

Patent 12598285 • OPTICAL DISPLAY, IMAGE CAPTURING DEVICE AND METHODS WITH VARIABLE DEPTH OF FIELD
2y 5m to grant • Granted Apr 07, 2026

Patent 12585121 • NEAR-EYE DISPLAY HAVING OVERLAPPING PROJECTOR ASSEMBLIES
2y 5m to grant • Granted Mar 24, 2026

Patent 12578801 • METHOD AND DEVICE FOR DETECTING AND RESPONDING TO USER INPUT
2y 5m to grant • Granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 62%
With Interview: 71% (+9.4%)
Median Time to Grant: 2y 7m
PTA Risk: Moderate
Based on 661 resolved cases by this examiner. Grant probability derived from career allow rate.
