Prosecution Insights
Last updated: April 19, 2026
Application No. 19/078,575

AUGMENTED REALITY DEVICE INCLUDING VARIABLE FOCUS LENSES AND OPERATING METHOD THEREOF

Final Rejection (§103, §DP)
Filed: Mar 13, 2025
Examiner: PATEL, SANJIV D
Art Unit: 2625
Tech Center: 2600 — Communications
Assignee: Samsung Electronics Co., Ltd.
OA Round: 2 (Final)
Grant Probability: 78% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 1m
Grant Probability with Interview: 82%

Examiner Intelligence

Career Allow Rate: 78% (749 granted / 964 resolved), above average, +15.7% vs TC avg
Interview Lift: +4.3% (minimal), based on resolved cases with interview
Avg Prosecution: 2y 1m (fast prosecutor)
Career History: 991 total applications across all art units; 27 currently pending
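As a quick sanity check, the headline figures above are reproducible with simple arithmetic. This sketch assumes the dashboard rounds the 749/964 ratio and reports the interview lift as a percentage-point difference; the tool's exact methodology is not stated here.

```python
# Reproduce the examiner stats above; assumes simple ratios and
# percentage-point differences (the dashboard's exact method is not stated).
granted, resolved = 749, 964
allow_rate = 100 * granted / resolved
print(f"Career allow rate: {allow_rate:.1f}%")  # 77.7%, displayed as 78%

# The predicted grant probability rises from 78% to 82% with an interview,
# consistent in direction with the +4.3% lift computed from raw resolved cases.
print(f"Interview lift (rounded figures): +{82 - 78} points")
```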

Statute-Specific Performance

§101: 3.9% (-36.1% vs TC avg)
§103: 56.5% (+16.5% vs TC avg)
§102: 15.7% (-24.3% vs TC avg)
§112: 11.3% (-28.7% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 964 resolved cases
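Read as percentage-point deltas, the four statute rows are mutually consistent: each backs out to the same Tech Center baseline. A minimal sketch, assuming each delta is (examiner rate minus TC average):

```python
# Back out the implied Tech Center average from each statute row,
# assuming the listed delta is (examiner rate - TC average) in points.
rows = {"§101": (3.9, -36.1), "§103": (56.5, 16.5),
        "§102": (15.7, -24.3), "§112": (11.3, -28.7)}
for statute, (rate, delta) in rows.items():
    print(f"{statute}: implied TC average = {rate - delta:.1f}%")
# All four rows back out to the same ~40.0% baseline estimate.
```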

Office Action

§103 §DP
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Claims 1, 10, and 18 have been amended as per Applicant's amendment filed on February 26, 2026. No claims have been canceled. Claims 1-18 are pending.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq.
for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 1, 10, and 18 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 4, 9, 12, and 15 of U.S. Patent No. 11924536 in view of Rani (US 11,675,215 B1, filed April 21, 2020).

Present Application | U.S. Patent No. 11924536

1.
An augmented reality device comprising: a variable focus lens; an eye tracking sensor configured to emit infrared (IR) light to an eye of a user and receive IR light reflected by the eye of the user; and at least one processor configured to: detect, by using the eye tracking sensor, a pupil feature point from the reflected IR light, measure a radius of rotation of the detected pupil feature point, wherein the radius of rotation is a distance between a first pupil feature point when the eye of the user is moved by a preset rotation angle in a first direction and a second pupil feature point when the eye of the user is moved by the preset rotation angle in a second direction which is opposite to the first direction, determine eye relief, which is a distance between the eye of the user and the variable focus lens, based on the measured radius of rotation, obtain, by using the eye tracking sensor, a gaze point at which gaze direction of a left eye of the user and gaze direction of a right eye of the user converge, obtain, by using the eye tracking sensor, an interpupillary distance which is a distance between a pupil of the left eye and a pupil of the right eye, and determine a position of a focal region of the variable focus lens based on the eye relief, the gaze point, and the interpupillary distance.

1. An augmented reality device comprising: a variable focus lens; an eye tracking sensor comprising an infrared (IR) light source and an IR camera; and at least one processor configured to: control the IR light source of the eye tracking sensor to emit IR light to the eye of the user and obtain an image by controlling the IR camera of the eye tracking sensor to photograph the IR light reflected by the eye of the user, [Claim 4]

4.
The augmented reality device of claim 1, wherein the at least one processor is further configured to: obtain images by photographing an eye of the user moving by a preset rotation angle, by the eye tracking sensor; detect a pupil feature point by analyzing the images; [Claim 4] measure a radius of rotation of the detected pupil feature point; and obtain the eye relief based on the measured radius of rotation.

detect a plurality of glint feature points from the obtained image, determine eye relief, which is a distance between the eye of the user and the variable focus lens, based on a distance between the plurality of detected glint feature points, obtain information with respect to a gaze point at which gaze direction of a left eye of the user and gaze direction of a right eye of the user converge, and an interpupillary distance which is a distance between a pupil of the left eye and a pupil of the right eye, based on the plurality of glint feature points, and determine a position of a focal region of the variable focus lens based on the information with respect to the eye relief, the gaze point, and the interpupillary distance.

7. The augmented reality device of claim 1, wherein the eye tracking sensor comprises a first eye tracking sensor configured to obtain a left-eye image by photographing the left eye of the user and detect a left-eye pupil from the left-eye image and a second eye tracking sensor configured to obtain a right-eye image by photographing the right eye of the user and detect a right-eye pupil from the right-eye image, and wherein the at least one processor is further configured to: obtain three-dimensional coordinates of the left-eye pupil and the right-eye pupil based on a positional relationship between the first eye tracking sensor and the second eye tracking sensor, and camera attribute information, and obtain the interpupillary distance based on the three-dimensional coordinates of the left-eye pupil and the right-eye pupil.

5.
The augmented reality device of claim 1, wherein the at least one processor is further configured to: detect the pupil of the left eye from a left-eye image obtained by a first eye tracking sensor; detect the pupil of the right eye from a right-eye image obtained by a second eye tracking sensor; obtain three-dimensional coordinates of the pupil of the left eye and the pupil of the right eye based on a positional relationship between the first eye tracking sensor and the second eye tracking sensor, and camera attribute information; and obtain the interpupillary distance based on the three-dimensional coordinates of the pupil of the left eye and the pupil of the right eye.

9. The augmented reality device of claim 1, wherein the at least one processor is further configured to adjust refractive power of the focal region by applying a control voltage to the variable focus lens to generate a phase modulation profile for a position corresponding to the focal region.

8. The augmented reality device of claim 1, wherein the at least one processor is further configured to adjust refractive power of the focal region by applying a control voltage to the variable focus lens to generate a phase modulation profile for a position corresponding to the focal region.

10.
An operating method of augmented reality device, the operating method comprising: emitting, through an eye tracking sensor of the augmented reality device, an infrared (IR) light to an eye of a user; receiving, through the eye tracking sensor, the IR light reflected by the eye of the user; detecting a pupil feature point from the reflected IR light; measuring a radius of rotation of the detected pupil feature point, wherein the radius of rotation is a distance between a first pupil feature point when the eye of the user is moved by a preset rotation angle in a first direction and a second pupil feature point when the eye of the user is moved by the preset rotation angle in a second direction which is opposite to the first direction; determining eye relief, which is a distance between the eye of the user and a variable focus lens of the augmented reality device, based on the measured radius of rotation, obtaining, through the eye tracking sensor, a gaze point at which gaze direction of a left eye of the user and gaze direction of a right eye of the user converge, obtaining, through the eye tracking sensor, an interpupillary distance which is a distance between a pupil of the left eye and a pupil of the right eye, and determining a position of a focal region of the variable focus lens based on the eye relief, the gaze point, and the interpupillary distance.

9. An operating method of an augmented reality device, the operating method comprising: emitting infrared (IR) light, by using an IR light source of an eye tracking sensor, to an eye of a user; obtaining, by an IR camera of the eye tracking sensor, an image by photographing the IR light reflected by the eye of the user; detecting a plurality of glint feature points from the image obtained by the IR camera; [Claim 12]

12.
The operating method of claim 9, wherein the detecting of the plurality of glint feature points comprises obtaining images by photographing an eye of the user moving by a preset rotation angle, by the eye tracking sensor, and detecting a pupil feature point by analyzing the images, and [Claim 12] wherein the obtaining of the information with respect to eye relief comprises measuring a radius of rotation of the detected pupil feature point, and obtaining the eye relief based on the measured radius of rotation.

determining eye relief, which is a distance between the eye of the user and a variable focus lens of the augmented reality device, based on a distance between the plurality of detected glint feature points; obtaining information with respect to a gaze point at which gaze direction of a left eye of the user and gaze direction of a right eye of the user converge, and an interpupillary distance which is a distance between a pupil of the left eye and a pupil of the right eye of the user, based on the plurality of glint feature points; and determining a position of a focal region of the variable focus lens based on the information about the eye relief, the gaze point, and the interpupillary distance.

18.
A computer program product comprising a computer-readable storage medium, the computer-readable storage medium comprising instructions readable by an augmented reality device to perform: emitting, through an eye tracking sensor of the augmented reality device, an infrared (IR) light to an eye of a user; receiving, through the eye tracking sensor, the IR light reflected by the eye of the user; detecting a pupil feature point from the reflected IR light; measuring a radius of rotation of the detected pupil feature point, wherein the radius of rotation is a distance between a first pupil feature point when the eye of the user is moved by a preset rotation angle in a first direction and a second pupil feature point when the eye of the user is moved by the preset rotation angle in a second direction which is opposite to the first direction; determining eye relief, which is a distance between the eye of the user and a variable focus lens of the augmented reality device, based on the measured radius of rotation, obtaining, through the eye tracking sensor, a gaze point at which gaze direction of a left eye of the user and gaze direction of a right eye of the user converge, obtaining, through the eye tracking sensor, an interpupillary distance which is a distance between a pupil of the left eye and a pupil of the right eye, and determining a position of a focal region of the variable focus lens based on the eye relief, the gaze point, and the interpupillary distance.

15.
A computer program product comprising a computer-readable storage medium, the computer-readable storage medium comprising instructions readable by an augmented reality device to perform: emitting infrared (IR) light, by using an IR light source of an eye tracking sensor, to an eye of a user; obtaining, by an IR camera of the eye tracking sensor, an image by photographing the IR light reflected by the eye of the user; detecting a plurality of glint feature points from the image obtained by the IR camera; determining eye relief, which is a distance between the eye of the user and a variable focus lens of the augmented reality device, based on a distance between the plurality of detected glint feature points; obtaining information with respect to a gaze point at which gaze direction of a left eye of the user and gaze direction of a right eye of the user converge, and an interpupillary distance which is a distance between a pupil of the left eye and a pupil of the right eye of the user, based on the plurality of glint feature points; and determining a position of a focal region of the variable focus lens based on the information about the eye relief, the gaze point, and the interpupillary distance.

2. The augmented reality device of claim 1, wherein the eye tracking sensor comprises an IR light source configured to emit the IR light to the eye of the user and an IR camera configured to obtain a plurality of images by receiving, through an image sensor of the IR camera, the reflected IR light from the eye of the user, and wherein the at least one processor is further configured to: identify at least one pixel representing the pupil from among a plurality of pixels of each of the plurality of images obtained by the IR camera, and detect the pupil feature point based on the identified at least one pixel.

3.
The augmented reality device of claim 2, wherein the at least one processor is further configured to: obtain coordinate information of the at least one pixel from among the plurality of pixels of each of the plurality of images obtained by the IR camera, and measure the radius of rotation of the pupil feature point based on the coordinate information.

4. The augmented reality device of claim 1, wherein the eye tracking sensor comprises an IR scanner configured to emit the IR light in a form of point light or line light toward a light reflector, such that the emitted IR light is reflected by the light reflector to be directed to the eye of the user and an IR detector configured to detect the IR light reflected by the eye of the user, and wherein the at least one processor is further configured to identify a position of the pupil feature point by analyzing the IR light detected by the IR detector.

5. The electronic device of claim 4, wherein the at least one processor is further configured to measure the radius of rotation of the pupil feature point based on a change of the position of the pupil feature point.

6. The augmented reality device of claim 1, wherein the eye relief is inversely proportional to the radius of rotation of the pupil feature point.

8. The augmented reality device of claim 1, wherein the at least one processor is further configured to: obtain coordinates of a center focus based on the eye relief, a distance between the gaze point and the eyes of the user, and the interpupillary distance; and determine, as the focal region, a region of a preset size around the center focus.

11.
The operating method of claim 10, wherein the emitting of the IR light comprises emitting, by using an IR light source of the eye tracking sensor, to the eye of the user, wherein the detecting of the IR light comprises obtaining, by using an IR camera of the eye tracking sensor, a plurality of images by receiving, through an image sensor, the reflected IR light from the eye of the user, and wherein detecting of the pupil feature point comprises: identifying at least one pixel representing the pupil from among a plurality of pixels of each of the plurality of images obtained by the IR camera; and detecting the pupil feature point based on the identified at least one pixel.

12. The operating method of claim 10, wherein the measuring of the radius of rotation of the detected pupil feature point comprises: obtaining coordinate information of the at least one pixel from among the plurality of pixels of each of the plurality of images obtained by the IR camera; and measuring the radius of rotation of the pupil feature point based on the coordinate information.

13. The operating method of claim 10, wherein the measuring of the radius of rotation of the detected pupil feature point comprises: wherein the emitting of the IR light comprises emitting the IR light in a form of point light or line light toward a light reflector, such that the emitted IR light is reflected by the light reflector to be directed to the eye of the user, wherein the detecting of the IR light comprises detecting, by using an IR detector of the eye tracking sensor, the IR light reflected by the eye of the user, wherein the detecting of the pupil feature point comprises identifying a position of the pupil feature point by analyzing the IR light detected by the IR detector.

14.
The operating method of claim 13, wherein the measuring of the radius of rotation of the detected pupil feature point comprises measuring the radius of rotation of the pupil feature point based on a change of the position of the pupil feature point.

15. The operating method of claim 10, wherein the eye tracking sensor comprises a first eye tracking sensor configured to obtain a left-eye image by photographing the left eye of the user and detect a left-eye pupil from the left-eye image and a second eye tracking sensor configured to obtain a right-eye image by photographing the right eye of the user and detect a right-eye pupil from the right-eye image, and wherein the obtaining of the interpupillary distance comprises: obtaining three-dimensional coordinates of the left-eye pupil and the right-eye pupil based on a positional relationship between the first eye tracking sensor and the second eye tracking sensor, and camera attribute information; and obtaining the interpupillary distance based on the three-dimensional coordinates of the left-eye pupil and the right-eye pupil.

16. The operating method of claim 10, wherein the determining of the position of a focal region of the variable focus lens comprises: obtaining coordinates of a center focus based on the eye relief, a distance between the gaze point and the eyes of the user, and the interpupillary distance; and determining, as the focal region, a region of a preset size around the center focus.

17. The operating method of claim 10, further comprises: adjusting refractive power of the focal region by applying a control voltage to the variable focus lens to generate a phase modulation profile for a position corresponding to the focal region.

2. The augmented reality device of claim 1 wherein the at least one processor is further configured to: determine the eye relief based on a size of a region of a glint pattern, which is a combination of the plurality of detected glint feature points.

3.
The augmented reality device of claim 2, wherein the IR light source comprises a plurality of IR light-emitting diodes (LEDs) provided on a lens frame of the augmented reality device and spaced apart from each other by a preset distance, and wherein the at least one processor is further configured to obtain the eye relief based on at least one of the size of the region of the glint pattern, a positional relationship between the plurality of IR LEDs, or coordinates of each pixel of the IR camera.

6. The augmented reality device of claim 1, wherein the at least one processor is further configured to: determine, as a first focal region, a region having a preset size around a first center focus on a first variable focus lens at which a virtual straight line representing a first gaze direction of the left eye toward the gaze point meets the first variable focus lens; and determine, as a second focal region, a region having a preset size around a second center focus on a second variable focus lens at which a virtual straight line representing a second gaze direction of the right eye toward the gaze point meets the second variable focus lens.

7. The augmented reality device of claim 1, wherein the at least one processor is further configured to: obtain coordinates of a center focus based on the eye relief, a distance between the gaze point and the eye of the user, and the interpupillary distance; and determine, as the focal region, a region of a preset size around the center focus.

10. The operating method of claim 9 wherein the obtaining of the information about the eye relief comprises determining the eye relief based on a size of a region of a glint pattern, which is a combination of the plurality of detected glint feature point.

11.
The operating method of claim 10, wherein the IR light source comprises a plurality of IR light-emitting diodes (LEDs) provided on a lens frame of the augmented reality device to be spaced apart from each other by a preset distance, and wherein the determining of the eye relief comprises obtaining the eye relief based on at least one of the size of the region of the glint pattern, a positional relationship between the plurality of IR LEDs, or coordinates of each pixel of the IR camera.

13. The operating method of claim 9, wherein the determining of the position of the focal region of the variable focus lens comprises: determining, as a first focal region, a region having a preset size around a first center focus on a first variable focus lens at which a virtual straight line representing a first gaze direction of the left eye toward the gaze point meets the first variable focus lens; and determining, as a second focal region, a region having a preset size around a second center focus on a second variable focus lens at which a virtual straight line representing a second gaze direction of the right eye toward the gaze point meets the second variable focus lens.

14. The operating method of claim 9, wherein the determining of the position of the focal region of the variable focus lens comprises obtaining coordinates of a center focus based on the eye relief, a distance between the gaze point and the eye of the user, and the interpupillary distance, and determining, as the focal region, a region of a preset size around the center focus.

Although the claims at issue are not identical, they are not patentably distinct from each other because the scope of claims 1, 10, 18 of the present application overlap and encompass the scope of claims 1, 4, 9, 12, 15 of U.S. Patent No.
11924536, and vice-versa, with the exception of the claimed aspect of: wherein the radius of rotation is a distance between a first pupil feature point when the eye of the user is moved by a preset rotation angle in a first direction and a second pupil feature point when the eye of the user is moved by the preset rotation angle in a second direction which is opposite to the first direction.

However, Rani does disclose wherein the radius of rotation is a distance between a first pupil feature point when the eye of the user is moved by a preset rotation angle in a first direction and a second pupil feature point when the eye of the user is moved by the preset rotation angle in a second direction which is opposite to the first direction (Rani at Figs. 5-6, second distance 512 or fourth distance 516; col. 13, ll. 11-17 discloses "After identifying the eyes and the pupils, the measurement component 328 may determine a first distance 510 between a first edge 506(1) of the right eye 502(1) and a center of the right pupil 504(1), as well as a second distance 512 between the first edge 506(1) of the right eye 502(1) and a second edge 508(1) of the right eye 502(1).").

U.S. Patent No. 11924536 discloses a base eyeglass-based measurement system upon which the claimed invention is an improvement. Rani discloses a comparable eyeglass-based measurement system which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art before the effective filing date to modify or add to U.S. Patent No. 11924536 the teachings of Rani for the predictable result of determining the distance between the user's eyes and the object upon which they gaze (Rani at col. 4, ll. 43-45).

Claims 1, 8-10, 17, and 18 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 7, 8, 9, 16, and 17 of U.S. Patent No.
12279033 in view of Lee (US 2020/0379214 A1, published December 3, 2020) and Rani (US 11,675,215 B1, filed April 21, 2020).

Present Application | U.S. Patent No. 12279033

1. An augmented reality device comprising: a variable focus lens; an eye tracking sensor configured to emit infrared (IR) light to an eye of a user and receive IR light reflected by the eye of the user; and at least one processor configured to: detect, by using the eye tracking sensor, a pupil feature point from the reflected IR light, measure a radius of rotation of the detected pupil feature point, wherein the radius of rotation is a distance between a first pupil feature point when the eye of the user is moved by a preset rotation angle in a first direction and a second pupil feature point when the eye of the user is moved by the preset rotation angle in a second direction which is opposite to the first direction, determine eye relief, which is a distance between the eye of the user and the variable focus lens, based on the measured radius of rotation, obtain, by using the eye tracking sensor, a gaze point at which gaze direction of a left eye of the user and gaze direction of a right eye of the user converge, obtain, by using the eye tracking sensor, an interpupillary distance which is a distance between a pupil of the left eye and a pupil of the right eye, and determine a position of a focal region of the variable focus lens based on the eye relief, the gaze point, and the interpupillary distance.

1.
An augmented reality device comprising: a variable focus lens; an eye tracking sensor comprising an infrared (IR) scanner and an IR detector, and at least one processor configured to: control the IR scanner of the eye tracking sensor to emit IR light to an eye of an user, detect the IR light reflected by the eye of the user by using the IR detector, detect a plurality of glint feature points from the reflected IR light, obtain distances between the plurality of detected glint feature points, determine eye relief, which is a distance between the eye of the user and the variable focus lens, based on the obtained distances between the plurality of glint feature points, obtain information with respect to a gaze point at which gaze direction of a left eye of the user and gaze direction of a right eye of the user converge, and an interpupillary distance which is a distance between a pupil of the left eye and a pupil of the right eye, based on the plurality of glint feature points, and determine a position of a focal region of the variable focus lens based on the information with respect to the eye relief, the gaze point, and the interpupillary distance.

8. The augmented reality device of claim 1, wherein the at least one processor is further configured to: obtain coordinates of a center focus based on the eye relief, a distance between the gaze point and the eyes of the user, and the interpupillary distance; and determine, as the focal region, a region of a preset size around the center focus.

7. The augmented reality device of claim 1, wherein the at least one processor is further configured to: obtain coordinates of a center focus based on the eye relief, a distance between the gaze point and the eyes of the user, and the interpupillary distance; and determine, as the focal region, a region of a preset size around the center focus.

9.
The augmented reality device of claim 1, wherein the at least one processor is further configured to adjust refractive power of the focal region by applying a control voltage to the variable focus lens to generate a phase modulation profile for a position corresponding to the focal region.

8. The augmented reality device of claim 1, wherein the at least one processor is further configured to adjust refractive power of the focal region by applying a control voltage to the variable focus lens to generate a phase modulation profile for a position corresponding to the focal region.

10. An operating method of augmented reality device, the operating method comprising: emitting, through an eye tracking sensor of the augmented reality device, an infrared (IR) light to an eye of a user; receiving, through the eye tracking sensor, the IR light reflected by the eye of the user; detecting a pupil feature point from the reflected IR light; measuring a radius of rotation of the detected pupil feature point, wherein the radius of rotation is a distance between a first pupil feature point when the eye of the user is moved by a preset rotation angle in a first direction and a second pupil feature point when the eye of the user is moved by the preset rotation angle in a second direction which is opposite to the first direction; determining eye relief, which is a distance between the eye of the user and a variable focus lens of the augmented reality device, based on the measured radius of rotation, obtaining, through the eye tracking sensor, a gaze point at which gaze direction of a left eye of the user and gaze direction of a right eye of the user converge, obtaining, through the eye tracking sensor, an interpupillary distance which is a distance between a pupil of the left eye and a pupil of the right eye, and determining a position of a focal region of the variable focus lens based on the eye relief, the gaze point, and the interpupillary distance.

9.
An operating method of an augmented reality device, the operating method comprising: emitting, by an infrared (IR) scanner comprised in an eye tracking sensor, an IR light to an eye of a user; detecting, by an IR detector comprised in the eye tracking sensor, the IR light reflected by the eye of the user; detecting a plurality of glint feature points from the reflected IR light; obtaining distances between the plurality of detected glint feature points; obtaining information with respect to eye relief, which is a distance between the eye of the user and a variable focus lens of the augmented reality device, based on the obtained distances between the plurality of glint feature points; obtaining information with respect to a gaze point at which gaze direction of a left eye of the user and gaze direction of a right eye of the user converge, and an interpupillary distance which is a distance between a pupil of the left eye and a pupil of the right eye, based on the plurality of glint feature points; and determining a position of a focal region of the variable focus lens based on the information about the eye relief, the gaze point, and the interpupillary distance.

17. The operating method of claim 10, further comprising: adjusting refractive power of the focal region by applying a control voltage to the variable focus lens to generate a phase modulation profile for a position corresponding to the focal region.

16. The operating method of claim 9, further comprising adjusting refractive power of the focal region by applying a control voltage to the variable focus lens to generate a phase modulation profile for a position corresponding to the focal region.

18.
A computer program product comprising a computer-readable storage medium, the computer-readable storage medium comprising instructions readable by an augmented reality device to perform: emitting, through an eye tracking sensor of the augmented reality device, an infrared (IR) light to an eye of a user; receiving, through the eye tracking sensor, the IR light reflected by the eye of the user; detecting a pupil feature point from the reflected IR light; measuring a radius of rotation of the detected pupil feature point, wherein the radius of rotation is a distance between a first pupil feature point when the eye of the user is moved by a preset rotation angle in a first direction and a second pupil feature point when the eye of the user is moved by the preset rotation angle in a second direction which is opposite to the first direction; determining eye relief, which is a distance between the eye of the user and a variable focus lens of the augmented reality device, based on the measured radius of rotation, obtaining, through the eye tracking sensor, a gaze point at which gaze direction of a left eye of the user and gaze direction of a right eye of the user converge, obtaining, through the eye tracking sensor, an interpupillary distance which is a distance between a pupil of the left eye and a pupil of the right eye, and determining a position of a focal region of the variable focus lens based on the eye relief, the gaze point, and the interpupillary distance.

17.
A computer program product comprising a computer-readable storage medium, the computer-readable storage medium comprising instructions readable by an augmented reality device to perform: emitting, by an infrared (IR) scanner comprised in an eye tracking sensor, an IR light to an eye of a user; detecting, by an IR detector comprised in the eye tracking sensor, the IR light reflected by the eye of the user; detecting a plurality of glint feature points from the reflected IR light; obtaining distances between the plurality of detected glint feature points; obtaining information with respect to eye relief, which is a distance between the eye of the user and a variable focus lens of the augmented reality device, based on the obtained distances between the plurality of glint feature points; obtaining information with respect to a gaze point at which gaze direction of a left eye of the user and gaze direction of a right eye of the user converge, and an interpupillary distance which is a distance between a pupil of the left eye and a pupil of the right eye, based on the plurality of glint feature points; and determining a position of a focal region of the variable focus lens based on the information about the eye relief, the gaze point, and the interpupillary distance.

2. The augmented reality device of claim 1, wherein the eye tracking sensor comprises an IR light source configured to emit the IR light to the eye of the user and an IR camera configured to obtain a plurality of images by receiving, through an image sensor of the IR camera, the reflected IR light from the eye of the user, and wherein the at least one processor is further configured to: identify at least one pixel representing the pupil from among a plurality of pixels of each of the plurality of images obtained by the IR camera, and detect the pupil feature point based on the identified at least one pixel.

3.
The augmented reality device of claim 2, wherein the at least one processor is further configured to: obtain coordinate information of the at least one pixel from among the plurality of pixels of each of the plurality of images obtained by the IR camera, and measure the radius of rotation of the pupil feature point based on the coordinate information.

4. The augmented reality device of claim 1, wherein the eye tracking sensor comprises an IR scanner configured to emit the IR light in a form of point light or line light toward a light reflector, such that the emitted IR light is reflected by the light reflector to be directed to the eye of the user and an IR detector configured to detect the IR light reflected by the eye of the user, and wherein the at least one processor is further configured to identify a position of the pupil feature point by analyzing the IR light detected by the IR detector.

5. The augmented reality device of claim 4, wherein the at least one processor is further configured to measure the radius of rotation of the pupil feature point based on a change of the position of the pupil feature point.

6. The augmented reality device of claim 1, wherein the eye relief is inversely proportional to the radius of rotation of the pupil feature point.

7.
The augmented reality device of claim 1, wherein the eye tracking sensor comprises a first eye tracking sensor configured to obtain a left-eye image by photographing the left eye of the user and detect a left-eye pupil from the left-eye image and a second eye tracking sensor configured to obtain a right-eye image by photographing the right eye of the user and detect a right-eye pupil from the right-eye image, and wherein the at least one processor is further configured to: obtain three-dimensional coordinates of the left-eye pupil and the right-eye pupil based on a positional relationship between the first eye tracking sensor and the second eye tracking sensor, and camera attribute information, and obtain the interpupillary distance based on the three-dimensional coordinates of the left-eye pupil and the right-eye pupil.

11. The operating method of claim 10, wherein the emitting of the IR light comprises emitting, by using an IR light source of the eye tracking sensor, the IR light to the eye of the user, wherein the detecting of the IR light comprises obtaining, by using an IR camera of the eye tracking sensor, a plurality of images by receiving, through an image sensor, the reflected IR light from the eye of the user, and wherein the detecting of the pupil feature point comprises: identifying at least one pixel representing the pupil from among a plurality of pixels of each of the plurality of images obtained by the IR camera; and detecting the pupil feature point based on the identified at least one pixel.

12. The operating method of claim 10, wherein the measuring of the radius of rotation of the detected pupil feature point comprises: obtaining coordinate information of the at least one pixel from among the plurality of pixels of each of the plurality of images obtained by the IR camera; and measuring the radius of rotation of the pupil feature point based on the coordinate information.

13.
The operating method of claim 10, wherein the emitting of the IR light comprises emitting the IR light in a form of point light or line light toward a light reflector, such that the emitted IR light is reflected by the light reflector to be directed to the eye of the user, wherein the detecting of the IR light comprises detecting, by using an IR detector of the eye tracking sensor, the IR light reflected by the eye of the user, and wherein the detecting of the pupil feature point comprises identifying a position of the pupil feature point by analyzing the IR light detected by the IR detector.

14. The operating method of claim 13, wherein the measuring of the radius of rotation of the detected pupil feature point comprises measuring the radius of rotation of the pupil feature point based on a change of the position of the pupil feature point.

15. The operating method of claim 10, wherein the eye tracking sensor comprises a first eye tracking sensor configured to obtain a left-eye image by photographing the left eye of the user and detect a left-eye pupil from the left-eye image and a second eye tracking sensor configured to obtain a right-eye image by photographing the right eye of the user and detect a right-eye pupil from the right-eye image, and wherein the obtaining of the interpupillary distance comprises: obtaining three-dimensional coordinates of the left-eye pupil and the right-eye pupil based on a positional relationship between the first eye tracking sensor and the second eye tracking sensor, and camera attribute information; and obtaining the interpupillary distance based on the three-dimensional coordinates of the left-eye pupil and the right-eye pupil.

16.
The operating method of claim 10, wherein the determining of the position of a focal region of the variable focus lens comprises: obtaining coordinates of a center focus based on the eye relief, a distance between the gaze point and the eyes of the user, and the interpupillary distance; and determining, as the focal region, a region of a preset size around the center focus.

2. The augmented reality device of claim 1, wherein the at least one processor is further configured to: sequentially emit, by using a point light source or a line light source included in the IR scanner, the IR light to be incident on an entire region in which the eye of the user is located, and sequentially receive the IR light reflected by the eye of the user by using a plurality of light detectors included in the IR detector.

3. The augmented reality device of claim 2, wherein the at least one processor is further configured to: detect the plurality of glint feature points by analyzing an array of rays of the IR light sequentially received through the plurality of light detectors.

4. The augmented reality device of claim 1, wherein the IR scanner comprises a micro-electro mechanical systems (MEMS) scanner configured to emit the IR light in a form of point light or line light toward a light reflector, such that the emitted IR light is reflected by the light reflector to be directed to the eye of the user.

5. The augmented reality device of claim 1, wherein the IR detector comprises a plurality of photodiodes disposed on a lens frame of the augmented reality device and spaced apart from each other by a preset distance, and wherein the at least one processor is further configured to calculate the eye relief based on the distances between the plurality of glint feature points and a positional relationship between the plurality of photodiodes.

6.
The augmented reality device of claim 1, wherein the at least one processor is further configured to: determine, as a first focal region, a region having a preset size around a first center focus on a first variable focus lens at which a virtual straight line representing a first gaze direction of the left eye toward the gaze point meets the first variable focus lens; and determine, as a second focal region, a region having a preset size around a second center focus on a second variable focus lens at which a virtual straight line representing a second gaze direction of the right eye toward the gaze point meets the second variable focus lens.

10. The operating method of claim 9, wherein the emitting of the IR light comprises sequentially emitting, by using a point light source or a line light source included in the IR scanner, the IR light to be incident on an entire region in which the eye of the user is located, and wherein the detecting of the IR light comprises sequentially receiving the IR light reflected by the eye of the user by using a plurality of light detectors included in the IR detector.

11. The operating method of claim 10, wherein the detecting of the plurality of glint feature points comprises: detecting the plurality of glint feature points by analyzing an array of rays of the IR light sequentially received through the plurality of light detectors.

12. The operating method of claim 9, wherein the IR scanner comprises a micro-electro mechanical systems (MEMS) scanner configured to emit the IR light in a form of point light or line light toward a light reflector, such that the emitted IR light is reflected by the light reflector to be directed to the eye of the user.

13.
The operating method of claim 9, wherein the IR detector comprises a plurality of photodiodes disposed on a lens frame of the augmented reality device and spaced apart from each other by a preset distance, and wherein the obtaining of the information with respect to the eye relief comprises calculating the eye relief based on the distances between the plurality of glint feature points and a positional relationship between the plurality of photodiodes.

14. The operating method of claim 9, wherein the determining of the position of the focal region of the variable focus lens comprises: determining, as a first focal region, a region having a preset size around a first center focus on a first variable focus lens at which a virtual straight line representing a first gaze direction of the left eye toward the gaze point meets the first variable focus lens; and determining, as a second focal region, a region having a preset size around a second center focus on a second variable focus lens at which a virtual straight line representing a second gaze direction of the right eye toward the gaze point meets the second variable focus lens.

15. The operating method of claim 9, wherein the determining of the position of the focal region of the variable focus lens comprises obtaining coordinates of a center focus based on the eye relief, a distance between the gaze point and the eyes of the user, and the interpupillary distance, and determining, as the focal region, a region of a preset size around the center focus.

Although the claims at issue are not identical, they are not patentably distinct from each other because the scope of claims 1, 8-10, 17, and 18 of the present application overlaps and encompasses the scope of claims 1, 7, 8, 9, 16, and 17 of U.S. Patent No. 12279033, and vice versa, with the exception that claims 1, 7, 8, 9, 16, 17 of U.S. Patent No.
12279033 do not disclose the claimed aspect of: “measure a radius of rotation of the detected pupil feature point, determine eye relief,… based on the measured radius of rotation.”

However, Lee does disclose measuring a radius of rotation of the detected pupil feature point and determining eye relief, which is a distance between the eye of the user and the variable focus lens, based on the measured radius of rotation (Lee at Figs. 5B-5C; ¶¶ [0164]-[0166]. Examiner takes official notice that the Pythagorean theorem is well known in the art. In view of the officially noticed facts, it would have been obvious to a person of ordinary skill to use the Pythagorean theorem for the well-known purpose of determining a radial distance from eye 33 to screen 530).

U.S. Patent No. 12279033 discloses a base augmented reality device upon which the claimed invention is an improvement. Lee discloses a comparable augmented reality device which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art at the time of filing to modify or add to U.S. Patent No. 12279033 the teachings of Lee for the predictable result of preventing dizziness or motion sickness in a user watching a virtual image for a long time (Lee at ¶ [0005]).

The combination of U.S. Patent No. 12279033 and Lee does not disclose: wherein the radius of rotation is a distance between a first pupil feature point when the eye of the user is moved by a preset rotation angle in a first direction and a second pupil feature point when the eye of the user is moved by the preset rotation angle in a second direction which is opposite to the first direction.
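For readers of this summary, the geometry behind the examiner's official notice, and behind claim 6's recitation that eye relief is inversely proportional to the measured radius of rotation, can be illustrated numerically. The sketch below is editorial commentary only, not part of the prosecution record; the calibration constant `k` and all coordinate values are hypothetical.

```python
import math

def radial_distance(dx: float, dy: float) -> float:
    """Pythagorean theorem: straight-line distance across an offset of
    (dx, dy) -- the 'well-known purpose' the official notice invokes."""
    return math.sqrt(dx * dx + dy * dy)

def eye_relief_from_rotation_radius(radius_of_rotation: float,
                                    k: float = 300.0) -> float:
    """Claim 6 recites that eye relief is inversely proportional to the
    measured radius of rotation of the pupil feature point. Model that
    as eye_relief = k / radius, with k a hypothetical calibration
    constant (units and value are illustrative only)."""
    if radius_of_rotation <= 0:
        raise ValueError("radius of rotation must be positive")
    return k / radius_of_rotation

# Radius of rotation per application claim 10: the distance between the
# pupil feature point after rotating the eye by +theta and after
# rotating it by -theta (opposite direction, same preset angle).
first_point, second_point = (12.0, 5.0), (24.0, 10.0)
radius = radial_distance(second_point[0] - first_point[0],
                         second_point[1] - first_point[1])
print(radius)                                   # 13.0 (a 5-12-13 triangle)
print(eye_relief_from_rotation_radius(radius))  # larger radius -> smaller relief
```

The inverse relationship reflects that an eye sitting closer to the lens sweeps a larger apparent arc on the sensor for the same preset rotation angle.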
However, Rani does disclose wherein the radius of rotation is a distance between a first pupil feature point when the eye of the user is moved by a preset rotation angle in a first direction and a second pupil feature point when the eye of the user is moved by the preset rotation angle in a second direction which is opposite to the first direction (Rani at Figs. 5-6, second distance 512 or fourth distance 516; col. 13, ll. 11-17 discloses “After identifying the eyes and the pupils, the measurement component 328 may determine a first distance 510 between a first edge 506(1) of the right eye 502(1) and a center of the right pupil 504(1), as well as a second distance 512 between the first edge 506(1) of the right eye 502(1) and a second edge 508(1) of the right eye 502(1).”).

The combination of U.S. Patent No. 12279033 and Lee discloses a base eyeglass-based measurement system upon which the claimed invention is an improvement. Rani discloses a comparable eyeglass-based measurement system which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art before the effective filing date to modify or add to the combination of U.S. Patent No. 12279033 and Lee the teachings of Rani for the predictable result of determining the distance between the user’s eyes and the object upon which they gaze (Rani at col. 4, ll. 43-45).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 2, 6-11, and 15-18 are rejected under 35 U.S.C. 103 as being unpatentable over Lin (US 2021/0037232 A1, published February 4, 2021) in view of Lee (US 2020/0379214 A1, published December 3, 2020) and Rani (US 11,675,215 B1, filed April 21, 2020).

As to claim 1, Lin discloses an augmented reality device comprising: a… lens (Lin at Fig. 1, lenses 130a, 130b); an eye tracking sensor configured to emit infrared (IR) light to an eye of a user and receive IR light reflected by the eye of the user (Lin at Fig. 1, light sources 160 and image capture units 140a, 140b; Fig. 7, step S201; ¶¶ [0023], [0042] disclose “The plurality of left-eye features comprises but not limited to, features of the left eyelid, features of the left iris, and/or the left eye pupil 12a.”); and at least one processor configured to: detect, by using the eye tracking sensor, a pupil feature point from the reflected IR light,… determine eye relief, which is a distance between the eye of the user and the variable focus lens (Lin at Figs. 2-7, steps S702-S703, S705; ¶ [0031] discloses “[t]he term ‘eye relief’ in this disclosure is defined as a distance from the eye of the user to an outer surface (or the center of the outer surface) of a corresponding lens. For example, the first eye relief is a distance from the left eye 10a to the outer surface of the first lens 130a.
As another example, the second eye relief is a distance from the right eye 10b to the outer surface of the second lens 130b.” ¶¶ [0042]-[0043]. Claim 1 discloses “calculating a second eye relief according to at least one right-eye feature in the two right-eye images; calculating an interpupillary distance (IPD) according to the first eye relief and the second eye relief; and adjusting, according to the IPD, a distance between a first lens and a second lens of the HMD”),… obtain, by using the eye tracking sensor, an interpupillary distance which is a distance between a pupil of the left eye and a pupil of the right eye (Lin at Fig. 7, step S708; Claim 1 discloses “calculating an interpupillary distance (IPD) according to the first eye relief and second eye relief”), and determine a position of a focal region of the variable focus lens based on the eye relief, the gaze point, and the interpupillary distance (Lin at ¶ [0052] discloses “In some embodiments that the IPD has been determined, the HMD 100 may further use the first image capture unit 140a to capture a plurality of left-eye images, and use the second image capture unit 140b to capture a plurality of right-eye images. Then, the HMD 100 recognizes the plurality of left-eye features from the plurality of left-eye images and the plurality of right-eye features from the plurality of right-eye images to dynamically determine a point of gaze of the user.” Claim 9 including claims 1, 5, 6).

Lin does not disclose measuring a radius of rotation of the detected pupil feature point and determining eye relief, which is a distance between the eye of the user and the variable focus lens, based on the measured radius of rotation. Lin does not disclose obtaining, by using the eye tracking sensor, a gaze point at which the gaze direction of a left eye of the user and the gaze direction of a right eye of the user converge. Lin does not disclose that the lens is a variable focus lens.
However, Lee does disclose measuring a radius of rotation of the detected pupil feature point and determining eye relief, which is a distance between the eye of the user and the variable focus lens, based on the measured radius of rotation (Lee at Figs. 5B-5C; ¶¶ [0164]-[0166]. Examiner takes official notice that the Pythagorean theorem is well known in the art. In view of the officially noticed facts, it would have been obvious to a person of ordinary skill to use the Pythagorean theorem for the well-known purpose of determining a radial distance from eye 33 to screen 530). Lee does disclose obtaining, by using the eye tracking sensor, a gaze point at which the gaze direction of a left eye of the user and the gaze direction of a right eye of the user converge (Lee at Fig. 4, steps S410-S430). Lee also discloses a variable focus lens (Lee at ¶¶ [0006], [0063]).

Lin discloses a base augmented reality device upon which the claimed invention is an improvement. Lee discloses a comparable augmented reality device which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art at the time of filing to modify or add to Lin the teachings of Lee for the predictable result of preventing dizziness or motion sickness in a user watching a virtual image for a long time (Lee at ¶ [0005]).

The combination of Lin and Lee does not disclose: wherein the radius of rotation is a distance between a first pupil feature point when the eye of the user is moved by a preset rotation angle in a first direction and a second pupil feature point when the eye of the user is moved by the preset rotation angle in a second direction which is opposite to the first direction.
However, Rani does disclose wherein the radius of rotation is a distance between a first pupil feature point when the eye of the user is moved by a preset rotation angle in a first direction and a second pupil feature point when the eye of the user is moved by the preset rotation angle in a second direction which is opposite to the first direction (Rani at Figs. 5-6, second distance 512 or fourth distance 516; col. 13, ll. 11-17 discloses “After identifying the eyes and the pupils, the measurement component 328 may determine a first distance 510 between a first edge 506(1) of the right eye 502(1) and a center of the right pupil 504(1), as well as a second distance 512 between the first edge 506(1) of the right eye 502(1) and a second edge 508(1) of the right eye 502(1).”).

The combination of Lin and Lee discloses a base eyeglass-based measurement system upon which the claimed invention is an improvement. Rani discloses a comparable eyeglass-based measurement system which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art before the effective filing date to modify or add to the combination of Lin and Lee the teachings of Rani for the predictable result of determining the distance between the user’s eyes and the object upon which they gaze (Rani at col. 4, ll. 43-45).

As to claim 2, the combination of Lin, Lee and Rani discloses the augmented reality device of claim 1, wherein the eye tracking sensor comprises an IR light source configured to emit the IR light to the eye of the user and an IR camera configured to obtain a plurality of images by receiving, through an image sensor of the IR camera, the reflected IR light from the eye of the user (Lin at Fig.
1), and wherein the at least one processor is further configured to: identify at least one pixel representing the pupil from among a plurality of pixels of each of the plurality of images obtained by the IR camera, and detect the pupil feature point based on the identified at least one pixel (Lee at Figs. 5B-5C; ¶¶ [0164]-[0166]). Lin discloses a base augmented reality device upon which the claimed invention is an improvement. Lee discloses a comparable augmented reality device which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art at the time of filing to modify or add to Lin the teachings of Lee for the predictable result of preventing dizziness or motion sickness in a user watching a virtual image for a long time (Lee at ¶ [0005]).

As to claim 6, the combination of Lin, Lee and Rani discloses the augmented reality device of claim 1, wherein the eye relief is inversely proportional to the radius of rotation of the pupil feature point (Lee at Fig. 5C). Lin discloses a base augmented reality device upon which the claimed invention is an improvement. Lee discloses a comparable augmented reality device which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art at the time of filing to modify or add to Lin the teachings of Lee for the predictable result of preventing dizziness or motion sickness in a user watching a virtual image for a long time (Lee at ¶ [0005]).
As to claim 7, the combination of Lin, Lee and Rani discloses the augmented reality device of claim 1, wherein the eye tracking sensor comprises a first eye tracking sensor configured to obtain a left-eye image by photographing the left eye of the user and detect a left-eye pupil from the left-eye image and a second eye tracking sensor configured to obtain a right-eye image by photographing the right eye of the user and detect a right-eye pupil from the right-eye image, and wherein the at least one processor is further configured to: obtain three-dimensional coordinates of the left-eye pupil and the right-eye pupil based on a positional relationship between the first eye tracking sensor and the second eye tracking sensor, and camera attribute information, and obtain the interpupillary distance based on the three-dimensional coordinates of the left-eye pupil and the right-eye pupil (Lin at Figs. 6, 9, in particular; ¶¶ [0051]-[0052]).

As to claim 8, the combination of Lin, Lee and Rani discloses the augmented reality device of claim 1, wherein the at least one processor is further configured to: obtain coordinates of a center focus based on the eye relief, a distance between the gaze point and the eyes of the user, and the interpupillary distance (Lee at Fig. 5C; Lin at Figs. 6, 9); and determine, as the focal region, a region of a preset size around the center focus (Lee at Figs. 1, 13, 16, focus adjustment regions 112 to 126). Lin discloses a base augmented reality device upon which the claimed invention is an improvement. Lee discloses a comparable augmented reality device which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art at the time of filing to modify or add to Lin the teachings of Lee for the predictable result of preventing dizziness or motion sickness in a user watching a virtual image for a long time (Lee at ¶ [0005]).
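Claim 8's computation of center-focus coordinates from the eye relief, the gaze distance, and the interpupillary distance is, geometrically, a similar-triangles problem: the gaze line from each pupil toward the convergence point crosses the lens plane at the eye-relief depth. The sketch below is an editorial illustration under assumed geometry (pupils at ±IPD/2 on a common baseline, gaze point on the midline); it is not taken from Lin, Lee, or the claims.

```python
def center_focus_x(eye_relief_mm: float, gaze_distance_mm: float,
                   ipd_mm: float, left_eye: bool = True) -> float:
    """Horizontal coordinate (mm; 0 = midline between the eyes) where a
    gaze line aimed at a midline gaze point crosses the lens plane.
    Pupils sit at x = -/+ ipd/2; the lens plane is eye_relief_mm in
    front of the pupils; the gaze point is gaze_distance_mm away."""
    half_ipd = ipd_mm / 2.0
    sign = -1.0 if left_eye else 1.0
    # Similar triangles: the gaze line closes half_ipd of horizontal
    # offset over gaze_distance, so at depth eye_relief it has closed
    # half_ipd * (eye_relief / gaze_distance).
    return sign * half_ipd * (1.0 - eye_relief_mm / gaze_distance_mm)

def focal_region(center_x: float, center_y: float, size_mm: float):
    """Claim 8: a region of preset size around the center focus,
    returned here as an (x_min, y_min, x_max, y_max) square."""
    half = size_mm / 2.0
    return (center_x - half, center_y - half, center_x + half, center_y + half)

# Hypothetical values: IPD 64 mm, gaze point 500 mm away, eye relief 20 mm.
cx = center_focus_x(20.0, 500.0, 64.0, left_eye=True)
print(round(cx, 2))              # -30.72: slightly inside the left pupil at -32
print(focal_region(cx, 0.0, 10.0))
```

The same construction, applied per eye, yields the first and second focal regions recited in the patent's claim 6 (gaze line meeting each variable focus lens).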
As to claim 9, the combination of Lin, Lee and Rani discloses the augmented reality device of claim 1, wherein the at least one processor is further configured to adjust refractive power of the focal region by applying a control voltage to the variable focus lens to generate a phase modulation profile for a position corresponding to the focal region (Lee at Figs. 6-8, in particular; ¶¶ [0180]-[0181]). Lin discloses a base augmented reality device upon which the claimed invention is an improvement. Lee discloses a comparable augmented reality device which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art at the time of filing to modify or add to Lin the teachings of Lee for the predictable result of preventing dizziness or motion sickness in a user watching a virtual image for a long time (Lee at ¶ [0005]).

As to claim 10, Lin discloses an operating method of an augmented reality device, the operating method comprising: emitting, through an eye tracking sensor of the augmented reality device, an infrared (IR) light to an eye of a user; receiving, through the eye tracking sensor, the IR light reflected by the eye of the user (Lin at Fig. 1, light sources 160 and image capture units 140a, 140b; ¶ [0023]); detecting a pupil feature point from the reflected IR light (Lin at Fig. 1, light sources 160 and image capture units 140a, 140b; Fig. 7, step S201; ¶¶ [0023], [0042] disclose “The plurality of left-eye features comprises but not limited to, features of the left eyelid, features of the left iris, and/or the left eye pupil 12a.”); determining eye relief, which is a distance between the eye of the user and a… focus lens of the augmented reality device,… (Lin at Figs. 2-7, steps S702-S703, S705; ¶ [0031] discloses “[t]he term ‘eye relief’ in this disclosure is defined as a distance from the eye of the user to an outer surface (or the center of the outer surface) of a corresponding lens.
For example, the first eye relief is a distance from the left eye 10a to the outer surface of the first lens 130a. As another example, the second eye relief is a distance from the right eye 10b to the outer surface of the second lens 130b.” ¶¶ [0042]-[0043]. Claim 1 discloses “calculating a second eye relief according to at least one right-eye feature in the two right-eye images; calculating an interpupillary distance (IPD) according to the first eye relief and the second eye relief; and adjusting, according to the IPD, a distance between a first lens and a second lens of the HMD”),… obtaining, through the eye tracking sensor, an interpupillary distance which is a distance between a pupil of the left eye and a pupil of the right eye (Lin at Fig. 7, step S708; Claim 1 discloses “calculating an interpupillary distance (IPD) according to the first eye relief and second eye relief”), and determining a position of a focal region of the variable focus lens based on the eye relief, the gaze point, and the interpupillary distance (Lin at ¶ [0052] discloses “In some embodiments that the IPD has been determined, the HMD 100 may further use the first image capture unit 140a to capture a plurality of left-eye images, and use the second image capture unit 140b to capture a plurality of right-eye images. Then, the HMD 100 recognizes the plurality of left-eye features from the plurality of left-eye images and the plurality of right-eye features from the plurality of right-eye images to dynamically determine a point of gaze of the user.” Claim 9 including claims 1, 5, 6).

Lin does not disclose obtaining, through the eye tracking sensor, a gaze point at which the gaze direction of a left eye of the user and the gaze direction of a right eye of the user converge. Lin does not disclose that the lens is a variable focus lens. Lin does not disclose measuring a radius of rotation of the detected pupil feature point and determining eye relief based on the measured radius of rotation.
However, Lee does disclose obtaining, through the eye tracking sensor, a gaze point at which a gaze direction of a left eye of the user and a gaze direction of a right eye of the user converge (Lee at Fig. 2; Fig. 4, steps S410-S430). Lee also discloses a variable focus lens (Lee at ¶ [0006], [0063]). Lee discloses measuring a radius of rotation of the detected pupil feature point and determining eye relief based on the measured radius of rotation (Lee at Figs. 5B-5C; ¶ [0164]-[0166]. Examiner takes official notice that the Pythagorean theorem is well known in the art. In view of the officially noticed facts, it would have been obvious to a person of ordinary skill to use the Pythagorean theorem for the well-known purpose of determining a radial distance from eye 33 to screen 530). Lin discloses a base augmented reality device upon which the claimed invention is an improvement. Lee discloses a comparable augmented reality device which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art at the time of filing to modify or add to Lin the teachings of Lee for the predictable result of preventing dizziness or motion sickness to a user watching a virtual image for a long time (Lee at ¶ [0005]).

The combination of Lin and Lee does not disclose: wherein the radius of rotation is a distance between a first pupil feature point when the eye of the user is moved by a preset rotation angle in a first direction and a second pupil feature point when the eye of the user is moved by the preset rotation angle in a second direction which is opposite to the first direction. 
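The eye-relief geometry mapped to Lee above (a radius of rotation recovered from two pupil feature points, plus the officially noticed Pythagorean theorem) can be sketched as follows. The chord relation, function names, and millimeter units are illustrative assumptions for exposition only; they are not taken from the cited references:

```python
import math

def chord_to_rotation_radius(chord_mm: float, half_angle_deg: float) -> float:
    # Hypothetical recovery of the eyeball rotation radius r from the chord
    # swept between the two pupil feature points when the eye rotates by a
    # preset angle of +/- half_angle_deg, via chord = 2 * r * sin(theta).
    return chord_mm / (2.0 * math.sin(math.radians(half_angle_deg)))

def radial_distance(lateral_mm: float, axial_mm: float) -> float:
    # The Pythagorean step of which official notice is taken: the
    # straight-line distance to a point offset both laterally and axially
    # from the eye.
    return math.hypot(lateral_mm, axial_mm)
```

For instance, a 15° preset angle sweeping a chord of about 6.2 mm would correspond to roughly a 12 mm rotation radius under this assumed chord relation.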
However, Rani does disclose wherein the radius of rotation is a distance between a first pupil feature point when the eye of the user is moved by a preset rotation angle in a first direction and a second pupil feature point when the eye of the user is moved by the preset rotation angle in a second direction which is opposite to the first direction (Rani at Figs. 5-6, second distance 512 or fourth distance 516; col. 13, ll. 11-17 discloses “After identifying the eyes and the pupils, the measurement component 328 may determine a first distance 510 between a first edge 506(1) of the right eye 502(1) and a center of the right pupil 504(1), as well as a second distance 512 between the first edge 506(1) of the right eye 502(1) and a second edge 508(1) of the right eye 502(1).”). The combination of Lin and Lee discloses a base eyeglass-based measurement system upon which the claimed invention is an improvement. Rani discloses a comparable eyeglass-based measurement system which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art before the effective filing date to modify or add to the combination of Lin and Lee the teachings of Rani for the predictable result of determining the distance between the user’s eyes and the object upon which they gaze (Rani at col. 4, ll. 43-45).

As to claim 11, the combination of Lin, Lee, and Rani discloses the operating method of claim 10, wherein the emitting of the IR light comprises emitting, by using an IR light source of the eye tracking sensor, the IR light to the eye of the user, wherein the detecting of the IR light comprises obtaining, by using an IR camera of the eye tracking sensor, a plurality of images by receiving, through an image sensor, the reflected IR light from the eye of the user (Lin at Fig. 
1), and wherein the detecting of the pupil feature point comprises: identifying at least one pixel representing the pupil from among a plurality of pixels of each of the plurality of images obtained by the IR camera; and detecting the pupil feature point based on the identified at least one pixel (Lee at Figs. 5B-5C; ¶ [0164]-[0166]). Lin discloses a base augmented reality device upon which the claimed invention is an improvement. Lee discloses a comparable augmented reality device which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art at the time of filing to modify or add to Lin the teachings of Lee for the predictable result of preventing dizziness or motion sickness to a user watching a virtual image for a long time (Lee at ¶ [0005]).

As to claim 15, the combination of Lin, Lee, and Rani discloses the operating method of claim 10, wherein the eye tracking sensor comprises a first eye tracking sensor configured to obtain a left-eye image by photographing the left eye of the user and detect a left-eye pupil from the left-eye image, and a second eye tracking sensor configured to obtain a right-eye image by photographing the right eye of the user and detect a right-eye pupil from the right-eye image, and wherein the obtaining of the interpupillary distance comprises: obtaining three-dimensional coordinates of the left-eye pupil and the right-eye pupil based on a positional relationship between the first eye tracking sensor and the second eye tracking sensor, and camera attribute information; and obtaining the interpupillary distance based on the three-dimensional coordinates of the left-eye pupil and the right-eye pupil (Lin at Figs. 6, 9, in particular; ¶ [0051]-[0052]). 
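The final step of claim 15, obtaining the interpupillary distance from the three-dimensional pupil coordinates, reduces to a Euclidean distance. A minimal sketch follows; the coordinate frame and units are assumed, and deriving the 3-D coordinates themselves from the two sensors' positional relationship and camera attribute information is outside its scope:

```python
import math

def interpupillary_distance(left_pupil_xyz, right_pupil_xyz):
    # Euclidean distance between the two pupils' 3-D coordinates,
    # e.g. expressed in millimeters in a shared device frame.
    return math.dist(left_pupil_xyz, right_pupil_xyz)
```

For pupils located at (0, 0, 0) and (63, 0, 0) in millimeters, this yields an IPD of 63 mm.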
As to claim 16, the combination of Lin, Lee, and Rani discloses the operating method of claim 10, wherein the determining of the position of the focal region of the variable focus lens comprises: obtaining coordinates of a center focus based on the eye relief, a distance between the gaze point and the eyes of the user, and the interpupillary distance (Lee at Fig. 5C; Lin at Figs. 6, 9); and determining, as the focal region, a region of a preset size around the center focus (Lee at Figs. 1, 13, 16, focus adjustment regions 112 to 126). Lin discloses a base augmented reality device upon which the claimed invention is an improvement. Lee discloses a comparable augmented reality device which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art at the time of filing to modify or add to Lin the teachings of Lee for the predictable result of preventing dizziness or motion sickness to a user watching a virtual image for a long time (Lee at ¶ [0005]).

As to claim 17, the combination of Lin, Lee, and Rani discloses the operating method of claim 10, further comprising: adjusting refractive power of the focal region by applying a control voltage to the variable focus lens to generate a phase modulation profile for a position corresponding to the focal region (Lee at Figs. 6-8, in particular; ¶ [0180]-[0181]). Lin discloses a base augmented reality device upon which the claimed invention is an improvement. Lee discloses a comparable augmented reality device which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art at the time of filing to modify or add to Lin the teachings of Lee for the predictable result of preventing dizziness or motion sickness to a user watching a virtual image for a long time (Lee at ¶ [0005]). 
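The center focus of claim 16 can be read as the point where the gaze ray crosses the lens plane, which explains why eye relief, the distance to the gaze point, and the pupil positions suffice. The similar-triangles construction below is purely illustrative and is not taken from Lin or Lee; all names and the planar geometry are assumptions:

```python
def center_focus_on_lens(pupil_xy, gaze_xy, eye_relief_mm, gaze_distance_mm):
    # The gaze ray from the pupil toward the gaze point crosses the lens
    # plane after a fraction t = eye_relief / gaze_distance of its lateral
    # travel (similar triangles), giving the center focus coordinates.
    t = eye_relief_mm / gaze_distance_mm
    return (pupil_xy[0] + t * (gaze_xy[0] - pupil_xy[0]),
            pupil_xy[1] + t * (gaze_xy[1] - pupil_xy[1]))
```

The focal region would then be a preset-size neighborhood around the returned point.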
As to claim 18, Lin discloses a computer program product comprising a computer-readable storage medium, the computer-readable storage medium comprising instructions readable by an augmented reality device (Lin at Fig. 1, processing circuit 170, memory module 172, and computer program 174) to perform: emitting, through an eye tracking sensor of the augmented reality device, an infrared (IR) light to an eye of a user; receiving, through the eye tracking sensor, the IR light reflected by the eye of the user (Lin at Fig. 1, light sources 160 and image capture units 140a, 140b; ¶ [0023]); detecting a pupil feature point from the reflected IR light; measuring a radius of rotation of the detected pupil feature point (Lin at Fig. 1, light sources 160 and image capture units 140a, 140b; Fig. 7, step S201; ¶ [0023], [0042] discloses “The plurality of left-eye features comprises but not limited to, features of the left eyelid, features of the left iris, and/or the left eye pupil 12a.”); determining eye relief, which is a distance between the eye of the user and a… focus lens of the augmented reality device,… (Lin at Figs. 2-7, steps S702-S703, S705; ¶ [0031] discloses “[t]he term “eye relief” in this disclosure is defined as a distance from the eye of the user to an outer surface (or the center of the outer surface) of a corresponding lens. For example, the first eye relief is a distance from the left eye 10a to the outer surface of the first lens 130a. As another example, the second eye relief is a distance from the right eye 10b to the outer surface of the second lens 130b.” ¶ [0042]-[0043]. 
Claim 1 discloses “calculating a second eye relief according to at least one right-eye feature in the two right-eye images; calculating an interpupillary distance (IPD) according to the first eye relief and the second eye relief; and adjusting, according to the IPD, a distance between a first lens and a second lens of the HMD”)… obtaining, through the eye tracking sensor, an interpupillary distance which is a distance between a pupil of the left eye and a pupil of the right eye (Lin at Fig. 7, step 708; Claim 1 discloses “calculating an interpupillary distance (IPD) according to the first eye relief and second eye relief”), and determining a position of a focal region of the variable focus lens based on the eye relief, the gaze point, and the interpupillary distance (Lin at ¶ [0052] discloses “In some embodiments that the IPD has been determined, the HMD 100 may further use the first image capture unit 140a to capture a plurality of left-eye images, and use the second image capture unit 140b to capture a plurality of right-eye images. Then, the HMD 100 recognizes the plurality of left-eye features from the plurality of left-eye images and the plurality of right-eye features from the plurality of right-eye images to dynamically determine a point of gaze of the user.”; Claim 9, incorporating claims 1, 5, and 6).

Lin does not disclose obtaining, through the eye tracking sensor, a gaze point at which a gaze direction of a left eye of the user and a gaze direction of a right eye of the user converge. Lin does not disclose that the lens is a variable focus lens. Lin does not disclose measuring a radius of rotation of the detected pupil feature point, or determining eye relief based on the measured radius of rotation.

However, Lee does disclose obtaining, through the eye tracking sensor, a gaze point at which a gaze direction of a left eye of the user and a gaze direction of a right eye of the user converge (Lee at Fig. 4, steps S410-S430). 
Lee also discloses a variable focus lens (Lee at ¶ [0006], [0063]). Lee discloses measuring a radius of rotation of the detected pupil feature point and determining eye relief based on the measured radius of rotation (Lee at Figs. 5B-5C; ¶ [0164]-[0166]. Examiner takes official notice that the Pythagorean theorem is well known in the art. In view of the officially noticed facts, it would have been obvious to a person of ordinary skill to use the Pythagorean theorem for the well-known purpose of determining a radial distance from eye 33 to screen 530). Lin discloses a base augmented reality device upon which the claimed invention is an improvement. Lee discloses a comparable augmented reality device which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art at the time of filing to modify or add to Lin the teachings of Lee for the predictable result of preventing dizziness or motion sickness to a user watching a virtual image for a long time (Lee at ¶ [0005]).

The combination of Lin and Lee does not disclose: wherein the radius of rotation is a distance between a first pupil feature point when the eye of the user is moved by a preset rotation angle in a first direction and a second pupil feature point when the eye of the user is moved by the preset rotation angle in a second direction which is opposite to the first direction.

However, Rani does disclose wherein the radius of rotation is a distance between a first pupil feature point when the eye of the user is moved by a preset rotation angle in a first direction and a second pupil feature point when the eye of the user is moved by the preset rotation angle in a second direction which is opposite to the first direction (Rani at Figs. 5-6, second distance 512 or fourth distance 516; col. 13, ll. 
11-17 discloses “After identifying the eyes and the pupils, the measurement component 328 may determine a first distance 510 between a first edge 506(1) of the right eye 502(1) and a center of the right pupil 504(1), as well as a second distance 512 between the first edge 506(1) of the right eye 502(1) and a second edge 508(1) of the right eye 502(1).”). The combination of Lin and Lee discloses a base eyeglass-based measurement system upon which the claimed invention is an improvement. Rani discloses a comparable eyeglass-based measurement system which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art before the effective filing date to modify or add to the combination of Lin and Lee the teachings of Rani for the predictable result of determining the distance between the user’s eyes and the object upon which they gaze (Rani at col. 4, ll. 43-45).

Claims 3, 4, and 5 are rejected under 35 U.S.C. 103 as being unpatentable over Lin, Lee, and Rani as applied to claims 2 or 1, respectively, above, and further in view of Koo (US 2022/0229490 A1, filed January 21, 2022).

As to claim 3, the combination of Lin, Lee, and Rani discloses the augmented reality device of claim 2. The combination does not expressly disclose that the at least one processor is further configured to: obtain coordinate information of the at least one pixel from among the plurality of pixels of each of the plurality of images obtained by the IR camera, and measure the radius of rotation of the pupil feature point based on the coordinate information. However, Koo does disclose that the at least one processor is further configured to: obtain coordinate information of the at least one pixel from among the plurality of pixels of each of the plurality of images obtained by the IR camera, and measure the radius of rotation of the pupil feature point based on the coordinate information (Koo at ¶ [0102]). 
The combination of Lin, Lee, and Rani discloses a base augmented reality device upon which the claimed invention is an improvement. Koo discloses a comparable augmented reality device which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art at the time of filing to modify or add to the combination of Lin, Lee, and Rani the teachings of Koo for the predictable result of increasing user comfort (Koo at ¶ [0003]).

As to claim 4, the combination of Lin, Lee, and Rani discloses the augmented reality device of claim 1. The combination does not expressly disclose that the eye tracking sensor comprises an IR scanner configured to emit the IR light in a form of point light or line light toward a light reflector, such that the emitted IR light is reflected by the light reflector to be directed to the eye of the user, and an IR detector configured to detect the IR light reflected by the eye of the user, and wherein the at least one processor is further configured to identify a position of the pupil feature point by analyzing the IR light detected by the IR detector. However, Koo does disclose that the eye tracking sensor comprises an IR scanner configured to emit the IR light in a form of point light or line light toward a light reflector, such that the emitted IR light is reflected by the light reflector to be directed to the eye of the user, and an IR detector configured to detect the IR light reflected by the eye of the user, and wherein the at least one processor is further configured to identify a position of the pupil feature point by analyzing the IR light detected by the IR detector (Koo at Figs. 2, 4; ¶ [0020]). The combination of Lin, Lee, and Rani discloses a base augmented reality device upon which the claimed invention is an improvement. Koo discloses a comparable augmented reality device which has been improved in the same way as the claimed invention. 
Hence, it would have been obvious to a person having ordinary skill in the art at the time of filing to modify or add to the combination of Lin, Lee, and Rani the teachings of Koo for the predictable result of increasing user comfort (Koo at ¶ [0003]).

As to claim 5, the combination of Lin, Lee, Rani, and Koo discloses the electronic device of claim 4, wherein the at least one processor is further configured to measure the radius of rotation of the pupil feature point based on a change of the position of the pupil feature point (Lee at Figs. 5B-5C; ¶ [0164]-[0166]).

Claims 12, 13, and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Lin, Lee, and Rani as applied to claim 10 above, and further in view of Koo (US 2022/0229490 A1, filed January 21, 2022).

As to claim 12, the combination of Lin, Lee, and Rani discloses the operating method of claim 10. The combination does not expressly disclose that the measuring of the radius of rotation of the detected pupil feature point comprises: obtaining coordinate information of the at least one pixel from among the plurality of pixels of each of the plurality of images obtained by the IR camera; and measuring the radius of rotation of the pupil feature point based on the coordinate information. However, Koo does disclose that the measuring of the radius of rotation of the detected pupil feature point comprises: obtaining coordinate information of the at least one pixel from among the plurality of pixels of each of the plurality of images obtained by the IR camera; and measuring the radius of rotation of the pupil feature point based on the coordinate information (Koo at ¶ [0102]). The combination of Lin, Lee, and Rani discloses a base augmented reality device upon which the claimed invention is an improvement. Koo discloses a comparable augmented reality device which has been improved in the same way as the claimed invention. 
Hence, it would have been obvious to a person having ordinary skill in the art at the time of filing to modify or add to the combination of Lin, Lee, and Rani the teachings of Koo for the predictable result of increasing user comfort (Koo at ¶ [0003]).

As to claim 13, the combination of Lin, Lee, and Rani discloses the operating method of claim 10. The combination does not expressly disclose wherein the emitting of the IR light comprises emitting the IR light in a form of point light or line light toward a light reflector, such that the emitted IR light is reflected by the light reflector to be directed to the eye of the user, wherein the detecting of the IR light comprises detecting, by using an IR detector of the eye tracking sensor, the IR light reflected by the eye of the user, and wherein the detecting of the pupil feature point comprises identifying a position of the pupil feature point by analyzing the IR light detected by the IR detector. However, Koo does disclose wherein the emitting of the IR light comprises emitting the IR light in a form of point light or line light toward a light reflector, such that the emitted IR light is reflected by the light reflector to be directed to the eye of the user, wherein the detecting of the IR light comprises detecting, by using an IR detector of the eye tracking sensor, the IR light reflected by the eye of the user, and wherein the detecting of the pupil feature point comprises identifying a position of the pupil feature point by analyzing the IR light detected by the IR detector (Koo at Figs. 2, 4; ¶ [0020]). The combination of Lin, Lee, and Rani discloses a base augmented reality device upon which the claimed invention is an improvement. 
Koo discloses a comparable augmented reality device which has been improved in the same way as the claimed invention. Hence, it would have been obvious to a person having ordinary skill in the art at the time of filing to modify or add to the combination of Lin, Lee, and Rani the teachings of Koo for the predictable result of increasing user comfort (Koo at ¶ [0003]).

As to claim 14, the combination of Lin, Lee, Rani, and Koo discloses the operating method of claim 13, wherein the measuring of the radius of rotation of the detected pupil feature point comprises measuring the radius of rotation of the pupil feature point based on a change of the position of the pupil feature point (Lee at Figs. 5B-5C; ¶ [0164]-[0166]).

Response to Arguments

Applicant’s arguments with respect to claims 1-18 have been considered but are believed to be addressed above and are therefore moot in view of the new grounds of rejection.

Conclusion

Applicant’s amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. 
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Sanjiv D. Patel, whose telephone number is (571) 270-5731. The examiner can normally be reached Monday - Friday, 9:00 am - 5:00 pm.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, William Boddie, can be reached at 571-272-0666. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Sanjiv D. Patel/
Primary Examiner, Art Unit 2625
03/18/2026

Prosecution Timeline

Mar 13, 2025
Application Filed
Nov 12, 2025
Non-Final Rejection — §103, §DP
Feb 26, 2026
Response Filed
Mar 18, 2026
Final Rejection — §103, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602124
DISPLAY DEVICE INCLUDING A TOUCH SENSOR AND MANUFACTURING METHOD THEREOF
2y 5m to grant • Granted Apr 14, 2026
Patent 12603054
DISPLAY SUBSTRATE AND DISPLAY DEVICE
2y 5m to grant • Granted Apr 14, 2026
Patent 12596194
Apparatus for Optically Measuring the Distance to a Scattering Target Object or a Reflecting Target Object
2y 5m to grant • Granted Apr 07, 2026
Patent 12596448
DISPLAY DEVICE
2y 5m to grant • Granted Apr 07, 2026
Patent 12591300
LIDAR-BASED IMMERSIVE 3D REALITY CAPTURE SYSTEMS, AND RELATED METHODS AND APPARATUS
2y 5m to grant • Granted Mar 31, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
78%
Grant Probability
82%
With Interview (+4.3%)
2y 1m
Median Time to Grant
Moderate
PTA Risk
Based on 964 resolved cases by this examiner. Grant probability derived from career allow rate.
