DETAILED ACTION
1. This Office Action is responsive to the claims filed in Application No. 18/735,634 on August 19, 2025. Claims 1-20 are pending.
Notice of Pre-AIA or AIA Status
2. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
3. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
4. Claim 20 is rejected under 35 U.S.C. 102(a)(2) as being anticipated by Gousev et al. (US 2018/0173933 A1).
Gousev teaches in Claim 20:
A non-transitory computer-readable medium encoded with a plurality of computer executable instructions that, when executed by at least one processor ( [0273] discloses a non-transitory computer-readable medium storing instructions which can be executed by one or more processing units ), is configured to:
identify a first subregion of a plurality of pixel cells in a sensor that provide image information related to a specific part of a user's eye at a first time ( Figure 15, [0163], [0115] discloses a bounding box 1504 for the user’s face, including eyes, a first subregion of the overall visual image 1502 );
after identifying the first subregion, receive, from a sensor with DVS capability to generate events, generated events indicating changes in IR radiation reflected from a specific part of the user's eye as detected at pixel cells in the first subregion of the plurality of pixel cells, wherein each pixel cell of the plurality of pixel cells of the sensor generates an event in response to detecting a change in IR light sensed by the pixel cell greater than a threshold ( [0073] discloses the DVS sensor array can save electrical power by only reading out values for elements in the array that have changed since the previous read-out. This is accomplished by determining when a sensor reading reaches a certain threshold and/or changes by a certain threshold, a feature of a smart array );
adjust the threshold for pixel cells in the first subregion of the plurality of pixel cells based on a rate of events generated by the sensor to a first threshold, wherein the first threshold is less than a second threshold of pixel cells in a second subregion of the plurality of pixel cells outside the first subregion ( [0134] discloses aspects of a CD (change detection) operation, i.e. related to the amount and location of motion changes. The sensitivity when detecting motion events can set the threshold. [0132] and [0135] disclose further details of adjusting a threshold intensity level and/or switching to a predetermined threshold that requires a stronger indication of motion, an example of sensitivity. To clarify, the threshold of a certain area, where there is increased sampling rate or motion, can be different from that of another part of the visual image, i.e. two thresholds ); and
compute user gaze at a second time subsequent to the first time, wherein the computing is at least in part by processing the events generated by the pixel cells in the first subregion, wherein the processing is limited to events generated by pixel cells within the first subregion. ( [0235] discloses capturing subsequent images of the eye of the user. Furthermore, it is clear that when the threshold is adjusted to change sensitivity, this is accounted for in future detections of the user's eye )
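For illustration only, the threshold-adjustment and gaze-computation scheme mapped above can be sketched as follows. This is a hypothetical sketch, not code drawn from Gousev or from the claims as filed; all names, array shapes, and constants are assumptions.

```python
import numpy as np

# Hypothetical sketch only; shapes, names, and constants are assumptions.
H, W = 480, 640                       # assumed sensor resolution
SECOND_THRESHOLD = 0.30               # per-pixel change threshold outside the subregion

thresholds = np.full((H, W), SECOND_THRESHOLD)

def adjust_subregion_threshold(thresholds, subregion, event_rate,
                               target_rate=5000.0, floor=0.02):
    """Set a first threshold inside the eye subregion from the observed event
    rate, keeping it below the second threshold used elsewhere."""
    y0, y1, x0, x1 = subregion
    # More events than targeted -> raise the first threshold toward (but never
    # to) the second threshold; fewer events -> lower it to stay sensitive.
    first = float(np.clip(SECOND_THRESHOLD * (event_rate / target_rate),
                          floor, SECOND_THRESHOLD * 0.9))
    thresholds[y0:y1, x0:x1] = first
    return thresholds

def compute_gaze(events, subregion):
    """Estimate gaze from the centroid of events inside the first subregion;
    events outside it are discarded, limiting processing to that subregion."""
    y0, y1, x0, x1 = subregion
    inside = [(x, y) for (x, y, polarity) in events
              if x0 <= x < x1 and y0 <= y < y1]
    if not inside:
        return None
    xs, ys = zip(*inside)
    return (sum(xs) / len(xs), sum(ys) / len(ys))  # crude pupil-centroid proxy
```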
Claim Rejections - 35 USC § 103
5. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
6. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
7. Claims 1-19 are rejected under 35 U.S.C. 103 as being unpatentable over Gousev et al. (US 2018/0173933 A1) in view of Canberk et al. (US 11,269,402 B1).
Gousev teaches in Claim 1:
A method of operating a cross reality system comprising an IR radiation source and sensor with DVS capability to generate events indicating changes in IR radiation ( [0074] discloses a smart array having a dynamic vision sensor (DVS). Figure 13, [0154] discloses an IR sensor array which also comprises a plurality of pixels ), the sensor comprising a plurality of pixel cells ( Figure 2A, [0069] discloses a pixel array ), the method comprising:
illuminating a user's eye with the IR radiation source ( Figures 12, 14 and 15, [0159] discloses infrared aspects to focus on a user’s eyes. Please note the infrared light source 1204, as shown in Figure 12, [0147] );
identifying a first subregion of the plurality of pixel cells that provide image information related to a specific part of the user's eye at a first time ( Figure 15, [0163], [0115] discloses a bounding box 1504 for the user’s face, including eyes, a first subregion of the overall visual image 1502 );
after identifying the first subregion, receiving from the sensor, events indicating changes in IR radiation reflected from the specific part of the user's eye as detected at pixel cells in the first subregion of the plurality of pixel cells, wherein each pixel cell of the plurality of pixel cells of the sensor generates an event in response to detecting a change in IR light sensed by the pixel cell greater than a threshold ( [0073] discloses the DVS sensor array can save electrical power by only reading out values for elements in the array that have changed since the previous read-out. This is accomplished by determining when a sensor reading reaches a certain threshold and/or changes by a certain threshold, a feature of a smart array );
adjusting the threshold for pixel cells in the first subregion of the plurality of pixel cells based on a rate of events generated by the sensor to a first threshold, wherein the first threshold is less than a second threshold of pixel cells in a second subregion of the plurality of pixel cells outside the first subregion ( [0134] discloses aspects of a CD (change detection) operation, i.e. related to the amount and location of motion changes. The sensitivity when detecting motion events can set the threshold. [0132] and [0135] disclose further details of adjusting a threshold intensity level and/or switching to a predetermined threshold that requires a stronger indication of motion, an example of sensitivity. To clarify, the threshold of a certain area, where there is increased sampling rate or motion, can be different from that of another part of the visual image, i.e. two thresholds ); and
computing user gaze at a second time subsequent to the first time, wherein the computing is at least in part by processing the events generated by the pixel cells in the first subregion, wherein the processing is limited to events generated by pixel cells within the first subregion ( [0235] discloses capturing subsequent images of the eye of the user. Furthermore, it is clear that when the threshold is adjusted to change sensitivity, this is accounted for in future detections of the user's eye ); but
Gousev does not explicitly teach wherein the device is “worn by a user and comprising a processor configured to process image information”.
However, in the same field of endeavor, eye tracking devices, Canberk teaches: Figure 1A, Column 10, Lines 15-34 disclose an eyewear device worn by the user with a processor to process captured images of the eye to track eye movement. As for the processor, Gousev teaches a microprocessor 216 in Figures 2A/2B, and Canberk likewise teaches, in Column 4, Lines 30-41, an image processor which can digitize sensor data. As combined, Gousev's eye tracking can be implemented in a head-worn device.
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to implement the head-worn device, as taught by Canberk, with the motivation that both references teach eye tracking and the combination can be used in an immersive reality setting ( Canberk, Column 4, Lines 50-61 ).
Gousev teaches in Claim 2:
The method of claim 1, wherein generating events indicating changes in IR radiation reflected from the user's eye comprises:
storing, associated with a pixel cell of the plurality of pixel cells, an indication of IR radiation detected at the pixel cell at the first time; detecting, at the second time, that the change in IR radiation relative to the IR radiation at the first time exceeds the threshold; and in response to the detected change, outputting the event from the sensor. ( Figure 18B, [0171] discloses the infrared light source and sensor array aspects. Respectfully, the IR light source is lit and the sensor array picks up the reflected light. [0073] discloses details of the sampling rate and/or changes by a certain threshold, where the changes are differences between two time stamps, i.e. a first and a second time )
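For illustration only, the store/compare/emit mechanism recited above can be sketched as follows; the function and variable names are hypothetical assumptions, not an implementation from Gousev.

```python
# Hypothetical per-pixel DVS event generation: store a reference IR reading at
# the first time, emit an event when the change at the second time exceeds the
# pixel's threshold, then re-arm the pixel. Names are illustrative assumptions.

def generate_events(reference, current, thresholds):
    """reference/current: 2-D lists of IR readings at the first and second
    times; thresholds: per-pixel change thresholds. Returns (x, y, polarity)
    events and updates the stored reference for pixels that fired."""
    events = []
    for y, row in enumerate(current):
        for x, reading in enumerate(row):
            delta = reading - reference[y][x]
            if abs(delta) > thresholds[y][x]:
                polarity = 1 if delta > 0 else -1   # IR increased or decreased
                events.append((x, y, polarity))
                reference[y][x] = reading           # new reference for this pixel
    return events
```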
Gousev teaches in Claim 3:
The method of claim 2, wherein:
outputting the event in response to the detected change exceeding the threshold comprises outputting the event in response to the IR radiation detected at the pixel cell decreasing by more than the threshold. ( [0073] discloses changes by a certain threshold, which naturally encompasses both increases and decreases )
Gousev teaches in Claim 4:
The method of claim 1, wherein:
computing user gaze comprises tracking a position of a user's pupil based on the events. ( Figure 20A, [0215] discloses details of a pupil region of the eye of the user. The changes in threshold are applied to the eye of the user in general )
Gousev and Canberk teach in Claim 5:
The method of claim 1, further comprising:
rendering a virtual object on a display device adjacent the user's eye at a location determined based on the computed user gaze. ( Canberk, Column 4, Lines 37-61 disclose a virtual three-dimensional experience based on the camera mapping the user’s eye gaze. The field of view is adjusted, as described above )
Canberk teaches in Claim 6:
The method of claim 5, further comprising:
repeatedly updating the rendered location of the virtual object based on the generated events. ( Respectfully, adjusting the field of view in light of the user's gaze determination necessarily involves updating the displayed content, i.e. virtual objects )
As per Claim 7:
Canberk does not explicitly teach wherein “the rendered location of the virtual object is updated at an average rate of at least 10 times per second.”
However, respectfully, Canberk teaches storing initial and successive images of the eye to update an eye database from which a determination of gaze can be made ( Column 10, Lines 34-47 ). In light of the successive images, the field of view can be updated accordingly. The rate of update, such as the frame rate, or at least a rate related to the frame rate, is a matter of design choice, and oftentimes the greater the rate, the smoother the experience. Respectfully, updating more than ten times per second is a design choice given that Canberk teaches capturing multiple, successive images.
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to select the rate at which these images are captured, and thus the rate at which displayed content is updated, with the motivation that the exact rate of update is a design choice ( Column 10, Lines 34-47 ).
As per Claim 8:
Canberk does not explicitly teach wherein “the rendered location of the virtual object is updated at an average rate of at least 20 times per second.”
However, respectfully, Canberk teaches storing initial and successive images of the eye to update an eye database from which a determination of gaze can be made ( Column 10, Lines 34-47 ). In light of the successive images, the field of view can be updated accordingly. The rate of update, such as the frame rate, or at least a rate related to the frame rate, is a matter of design choice, and oftentimes the greater the rate, the smoother the experience. Respectfully, updating more than twenty times per second is a design choice given that Canberk teaches capturing multiple, successive images.
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to select the rate at which these images are captured, and thus the rate at which displayed content is updated, with the motivation that the exact rate of update is a design choice ( Column 10, Lines 34-47 ).
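For illustration only, the update-rate design choice discussed for claims 7 and 8 can be sketched as a simple rate parameter in a render loop; the loop structure and names are hypothetical assumptions, not taken from either reference.

```python
import time

# Hypothetical render loop: the virtual object's location is re-rendered at a
# configurable average rate (e.g. 10 Hz for claim 7, 20 Hz for claim 8). The
# rate is a tunable parameter, consistent with the design-choice rationale.

def run_render_loop(get_gaze, render_at, updates_per_second=20, duration=1.0):
    interval = 1.0 / updates_per_second        # 0.05 s per update at 20 Hz
    deadline = time.monotonic() + duration
    while time.monotonic() < deadline:
        start = time.monotonic()
        gaze = get_gaze()                      # gaze computed from recent events
        if gaze is not None:
            render_at(gaze)                    # re-place the virtual object
        # Sleep off the rest of the frame budget to hold the target rate.
        time.sleep(max(0.0, interval - (time.monotonic() - start)))
```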
Gousev teaches in Claim 9:
The method of claim 1, wherein the specific part of the user's eye is a pupil of the user's eye. ( Figure 20A, [0215] discloses details of a pupil region of the eye of the user. The changes in threshold are applied to the eye of the user in general )
Gousev and Canberk teach in Claim 10:
The method of claim 1, further comprising:
updating a location of the first subregion in the plurality of pixel cells based on projected motion of the specific part of the user's eye. ( Gousev, [0117] discloses a reference motion threshold and [0126] discloses a direction of motion as well. Canberk, Column 9, Lines 45-56 disclose analyzing the pupil aspects to determine the eye gazing direction. Also, please note the combination with Gousev with respect to the first subregion aspects. Respectfully, it is clear the combination teaches eye tracking/location and updating the subregion accordingly )
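For illustration only, updating the first subregion from projected motion can be sketched with a constant-velocity model; the model and all names are hypothetical assumptions, not taken from Gousev or Canberk.

```python
# Hypothetical: shift the first subregion's bounding box ahead of the pupil
# using the velocity estimated from the two most recent pupil centroids.

def project_subregion(subregion, prev_center, curr_center, dt, lead_time):
    """subregion: (y0, y1, x0, x1); prev_center/curr_center: (x, y) pupil
    estimates at the two most recent times; dt: seconds between them;
    lead_time: how far ahead to project. Returns the shifted subregion."""
    vx = (curr_center[0] - prev_center[0]) / dt    # px/s, constant velocity
    vy = (curr_center[1] - prev_center[1]) / dt
    dx, dy = round(vx * lead_time), round(vy * lead_time)
    y0, y1, x0, x1 = subregion
    return (y0 + dy, y1 + dy, x0 + dx, x1 + dx)
```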
As per Claim 11:
Gousev and Canberk do not explicitly teach “wherein the second threshold is at least three times larger than the first threshold.”
However, given that Gousev teaches multiple thresholds and/or sensitivities to better sample the areas which are undergoing changes, the difference in thresholds between the interpreted subregions is a matter of design choice. It is clear that the thresholds relate to processing power requirements, and adjusting the thresholds in general is a trade-off between processing power and accuracy. One of ordinary skill in the art would be able to optimize a reasonable balance.
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to implement the relative sizing of the thresholds, with the motivation that the ratio is a design choice and not a patentable distinction.
Gousev teaches in Claim 12:
A cross reality system configured to be worn by a user, the cross reality system ( While this limitation is in the preamble, please note the combination below ) comprising:
an IR radiation source configured to illuminate an eye of the user ( Figures 12, 14 and 15, [0159] discloses infrared aspects to focus on a user’s eyes. Please note the infrared light source 1204, as shown in Figure 12, [0147] );
a sensor comprising a plurality of pixel cells with DVS capability ( [0074] discloses a smart array having a dynamic vision sensor (DVS). Figure 13, [0154] discloses an IR sensor array which also comprises a plurality of pixels ) configured to generate events indicating changes in IR radiation greater than a threshold change, wherein the sensor is positioned to receive IR radiation reflected from the eye of the user ( [0073] discloses the DVS sensor array can save electrical power by only reading out values for elements in the array that have changed since the previous read-out. This is accomplished by determining when a sensor reading reaches a certain threshold and/or changes by a certain threshold, a feature of a smart array ); and
a processor ( [0146] discloses a microprocessor ) configured to:
identify a first subregion of the plurality of pixel cells that provide image information related to a specific part of the user's eye at a first time ( Figure 15, [0163], [0115] discloses a bounding box 1504 for the user’s face, including eyes, a first subregion of the overall visual image 1502 ),
adjust the threshold for pixel cells in the first subregion of the plurality of pixel cells based on a rate of events generated by the sensor, wherein the threshold is adjusted to a first threshold that is less than a second threshold of pixel cells in a second subregion of the plurality of pixel cells outside the first subregion ( [0134] discloses aspects of a CD (change detection) operation, i.e. related to the amount and location of motion changes. The sensitivity when detecting motion events can set the threshold. [0132] and [0135] disclose further details of adjusting a threshold intensity level and/or switching to a predetermined threshold that requires a stronger indication of motion, an example of sensitivity. To clarify, the threshold of a certain area, where there is increased sampling rate or motion, can be different from that of another part of the visual image, i.e. two thresholds ), and
compute, at a second time subsequent to the first time, user gaze at least in part by processing events generated by the pixel cells in the first subregion, wherein the processing is limited to events generated by pixel cells within the first subregion ( [0235] discloses capturing subsequent images of the eye of the user. Furthermore, it is clear that when the threshold is adjusted to change sensitivity, this is accounted for in future detections of the user's eye ); but
Gousev does not explicitly teach “a cross reality system configured to be worn by a user, the cross reality system”.
However, in the same field of endeavor, eye tracking devices, Canberk teaches: Figure 1A, Column 10, Lines 15-34 disclose an eyewear device worn by the user with a processor to process captured images of the eye to track eye movement. As for the processor, Gousev teaches a microprocessor 216 in Figures 2A/2B, and Canberk likewise teaches, in Column 4, Lines 30-41, an image processor which can digitize sensor data. As combined, Gousev's eye tracking can be implemented in a head-worn device.
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to implement the head-worn device, as taught by Canberk, with the motivation that both references teach eye tracking and the combination can be used in an immersive reality setting ( Canberk, Column 4, Lines 50-61 ).
Gousev teaches in Claim 13:
The system of claim 12, wherein:
the sensor is further configured to, for each pixel cell of the plurality of pixel cells:
store an indication of IR radiation detected at the first time; and detect at the second time after the first time that a change in IR radiation relative to the IR radiation at the first time exceeds the threshold for the pixel cell; and the sensor is further configured to output the event in response to detecting, for a pixel cell of the plurality of pixel cells, that the change exceeds the threshold. ( Figure 18B, [0171] discloses the infrared light source and sensor array aspects. Respectfully, the IR light source is lit and the sensor array picks up the reflected light. [0073] discloses details of the sampling rate and/or changes by a certain threshold, where the changes are differences between two time stamps, i.e. a first and a second time )
Gousev teaches in Claim 14:
The system of claim 13, wherein:
the sensor is configured to output the event in response to the IR radiation detected at the pixel cell decreasing by more than the threshold. ( [0073] discloses changes by a certain threshold, which naturally encompasses both increases and decreases )
Gousev teaches in Claim 15:
The system of claim 12, wherein:
computing user gaze comprises tracking a position of the user's pupil based on the events. ( Figure 20A, [0215] discloses details of a pupil region of the eye of the user. The changes in threshold are applied to the eye of the user in general )
Gousev and Canberk teach in Claim 16:
The system of claim 12, wherein the processor is further configured to:
render a virtual object on a display device adjacent the user's eye at a location determined based on the computed user gaze. ( Canberk, Column 4, Lines 37-61 disclose a virtual three-dimensional experience based on the camera mapping the user’s eye gaze. The field of view is adjusted, as described above )
Canberk teaches in Claim 17:
The system of claim 16, wherein the processor is further configured to:
repeatedly update the rendered location of the virtual object based on the generated events. ( Respectfully, adjusting the field of view in light of the user's gaze determination necessarily involves updating the displayed content, i.e. virtual objects )
Gousev teaches in Claim 18:
The system of claim 12, wherein the specific part of the user's eye is a pupil of the user's eye. ( Figure 20A, [0215] discloses details of a pupil region of the eye of the user. The changes in threshold are applied to the eye of the user in general )
Gousev and Canberk teach in Claim 19:
The system of claim 12, wherein the processor is further configured to:
update a location of the first subregion in the plurality of pixel cells based on projected motion of the specific part of the user's eye. ( Gousev, [0117] discloses a reference motion threshold and [0126] discloses a direction of motion as well. Canberk, Column 9, Lines 45-56 disclose analyzing the pupil aspects to determine the eye gazing direction. Also, please note the combination with Gousev with respect to the first subregion aspects. Respectfully, it is clear the combination teaches eye tracking/location and updating the subregion accordingly )
Response to Arguments
8. Applicant’s arguments have been considered but are respectfully moot in view of the new grounds of rejection.
Applicant’s representative, Attorney Ghane, is thanked for her time discussing the application in an interview held on October 27, 2025.
Please note the updated rejection citing Gousev. Because Gousev is a newly cited reference, Applicant’s arguments are moot at this time.
Conclusion
9. Any inquiry concerning this communication or earlier communications from the examiner should be directed to DENNIS P JOSEPH whose telephone number is (571)270-1459. The examiner can normally be reached Monday - Friday 5:30 - 3:30 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Amr Awad can be reached at 571-272-7764. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DENNIS P JOSEPH/Primary Examiner, Art Unit 2621