Prosecution Insights
Last updated: April 19, 2026
Application No. 16/345,406

METHOD OF DETERMINING AN EYE PARAMETER OF A USER OF A DISPLAY DEVICE

Status: Non-Final Office Action (§103)
Filed: Apr 26, 2019
Examiner: PICHLER, MARIN
Art Unit: 2872
Tech Center: 2800 — Semiconductors & Electrical Systems
Assignee: Essilor International
OA Round: 11 (Non-Final)

Grant Probability: 63% (Moderate)
Projected OA Rounds: 11-12
Estimated Time to Grant: 3y 0m
Grant Probability with Interview: 72%

Examiner Intelligence

Career Allow Rate: 63% (grants 63% of resolved cases; 411 granted / 650 resolved; -4.8% vs Tech Center average)
Interview Lift: +8.7% for resolved cases with an interview (characterized as a moderate, roughly +9% lift)
Typical Timeline: 3y 0m average prosecution; 61 applications currently pending
Career History: 711 total applications across all art units

Statute-Specific Performance

§101: 0.2% (-39.8% vs TC avg)
§103: 41.1% (+1.1% vs TC avg)
§102: 26.9% (-13.1% vs TC avg)
§112: 25.0% (-15.0% vs TC avg)

Based on career data from 650 resolved cases (Tech Center averages are estimates).
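The headline examiner metrics above are simple derived ratios. As a quick sketch of how they follow from the raw counts (only the counts 411 and 650 and the -4.8% delta come from the report; the implied Tech Center average is back-calculated and the variable names are illustrative):

```python
granted, resolved = 411, 650            # examiner's career totals from the report
allow_rate = granted / resolved * 100   # career allow rate, percent
print(f"{allow_rate:.1f}%")             # 63.2%

# Implied Tech Center average, assuming the reported -4.8% delta is exact:
tc_avg = 68.0
print(f"{allow_rate - tc_avg:+.1f}%")   # -4.8% vs TC avg
```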

Office Action (§103)
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

Response to Amendment

The amendment and the Request for Continued Examination filed on 07/14/2025 have been entered. Claims 1, 5-11, 17-18, 21 and 22 are now pending in the application. Claim 1 has been amended and claim 26 has been canceled by the Applicant. The previous rejection of claim 26 under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, has been withdrawn in light of Applicant's cancellation of the claim.

Examiner Notes

The Examiner cites particular columns and line numbers in the references as applied to the claims below for the convenience of the applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claims, other passages and figures may apply as well. It is respectfully requested that, in preparing responses, the applicant fully consider each reference in its entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the examiner.

Priority

As required by M.P.E.P. § 201.14(c), acknowledgement is made of applicant's claim for priority based on national stage entry of application PCT/IB2016/001705, international filing date 10/28/2016. Receipt is acknowledged of papers submitted under 35 U.S.C. 119(a)-(d), which papers have been placed of record in the file. However, to overcome a prior art rejection, applicant(s) must submit a translation of the foreign priority papers in order to perfect the claimed foreign priority, because said papers have not been made of record in accordance with 37 CFR 1.55. See MPEP § 201.15.

Drawings

The drawings submitted by the applicant are acceptable for examination purposes.
Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 5-11, 17-18 and 21-22 are rejected under 35 U.S.C. 103 as being unpatentable over Samec et al. (hereafter Samec, of record, see Information Disclosure Statement dated 04/26/2019) US 20170000324 A1 in view of Border et al. (hereafter Border, of record) US 20160131912 A1 and further in view of Qi (of record) US 20140168607 A1.

In regard to independent claim 1, Samec teaches (see e.g. Figs. 2-15) a method for determining an eye parameter of a user of a binocular display device including a left display and a right display, the eye parameter relating to a dioptric parameter of an ophthalmic lens to be provided to the user (i.e. as wearable AR device and method e.g.
62, 100, 1400, 1500 of ophthalmic system functioning as a phoropter or refractor to determine a suitable refraction to correct/improve vision of a wearer/patient, and includes adaptive optics (VFE) including refractive lenses for left and right eye, paragraphs [03, 1416-18, 1443, 1473-1487, 1492, 1496-97, 1510-11, 1685-96, 1701-02, 711-1717], see e.g. Figs. 3-5, 10A-12, 14-15), the method (e.g. 1400, 1500) comprising: displaying a left image and a right image to the user when using the binocular display device (i.e. as image displayed, projected to the user eyes, paragraphs [1700-01, 1487, 1510-11, 1685-91], see e.g. Figs. 5, 10-12, 14-15), wherein the binocular display device comprises left and right moveable lenses through which the user sees the displayed left and right images, respectively (i.e. as images displayed/projected with adaptive optics (VFE) including refractive lenses for left and right eye with moving elements, deforming moving parts, paragraphs [1487, 1496-97, 1506, 1615, 1685-94, 1701-02]), and a content of the right image is different from a content of the left image (i.e. as VFE provides images with different content, e.g. paragraphs [1648, 1663-64, 1680]); modifying at least one parameter of the binocular display device (i.e. image displayed/projected with incremental modification(s) at different virtual distances and changes with adaptive optics and display, paragraphs [1487, 1702-03, 1648, 1496, 1506, 1615, 1685-94, 1710-11], Figs. 10-12, 14-15) by (1) moving the left movable lens and the right movable lens along an axis toward or away from the left image and the right image, respectively (i.e. as adaptable optics include moving lens elements, e.g. to vary/alter focus of projected images, paragraphs [1487, 1496, 1506, 1615], Figs.
10-12, 14-15), (2) adjusting a separation distance between a right reference point of the right image seen by the right eye of the user and a left reference point of the left image seen by the left eye of the user (i.e. due to left/right eye testing, i.e. as system is projecting images to left/right eye, at depth planes e.g. near, intermediate, far as system adjusts for inter-pupillary distance (IPD), and using focused most collimated images given pupil size, constriction state, i.e. the vergence, dynamic accommodation, and optical alignment to user's eyes, as described in e.g. paragraphs [1713-1716, 1439, 1687]), and (3) scaling the right and left images so that both images are seen with a same angular size by the user (i.e. by treating both eyes together with binocular system 1400, by presenting varying sizes of images and at varying depth planes, e.g. paragraphs [1684, 1691, 1694, 1698-1703], Figs. 5, 10A, 14), wherein the modifying step is repeated until image subjective quality of a perceived image is perceived by the user as optimal (i.e. as until user/wearer can view the image comfortably e.g. through biofeedback, see paragraphs [376, 1703-04, 1710-16, 1682-92, 1526-28, 1571, 1604, 1633, 1707-1713], as clearly depicted by dashed arrows in Figs. 10A, 11, 12, 15); and determining the eye parameter based on the modified at least one parameter of the binocular display device (i.e. as determined eye prescription of the user/wearer, see paragraphs [1475, 1487, 1685-91, 1699, 1713-17], see e.g. Figs. 5, 10A, 14-15, and including treating both eyes together or individually, e.g. paragraphs [1699-1703, 1713-16, 1807-08], Figs. 5, 14), wherein the displaying step and the modifying step are implemented in binocular vision (i.e. by treating both eyes together using binocular system 1400, e.g. paragraphs [1699-1703, 1713-16, 1807-08], Figs.
5, 12, 14), and wherein the method further comprises a calibration step including determining a correlation between the at least one parameter of the binocular display device modified during the modifying step and a virtual display distance, the calibration step comprising using a device incorporating a camera (i.e. as system 1400 performing method 1500 includes calibration for determining proper diopter and/or account for the proper diopter correction, based at least in part on the configuration of the display platform of 1400, 62, and given that during calibration, the iris of the wearer is analyzed, as camera e.g. inward facing camera of the device is used, paragraphs [1721, 1708, 1705]), and the step of determining the eye parameter further comprises determining the eye parameter of the user using said correlation (i.e. as calibration is used for proper diopter correction and prescription (diopter) for corrective lens for wearer/user, see paragraphs [1721, 1687-92, 1699-1703, 1710-13]), and wherein the moving of the left and right lenses varies the virtual display distance of the perceived image by changing an effective vergence of rays entering the eyes (i.e. as adaptable optic moving lens elements vary and alter focus of projected images, changing the vergence of the wearer's eyes when the focus/diopter is changed, see paragraphs [1487, 1496, 1506, 1588, 1615, 1707-1712], Figs. 10-12, 14-15), the variation in the virtual display distance being continuous and being achieved by displacement of the lenses so as to simulate different focal distances (i.e. as adaptable optic moving lens elements, e.g. to vary/alter focus (dioptric power) of projected images as continuous change of focus/diopter, paragraphs [1487, 1496, 1506, 1615, 1712], Figs.
10-12, 14-15). Samec does not explicitly disclose adjusting so that the right reference point and the left reference point provide parallel beams to the user, where the adjustment step comprises performing ray tracing for a first horizontal ray from the right eye through the right lens to the right reference point on the right image, performing ray tracing for a second horizontal ray from the left eye through the left lens to the left reference point on the left image, and separating the left and right images on the left and right displays so that a distance between the left and right images corresponds to the separation distance between the left and right reference points (i.e. it is noted that Samec includes left/right eye testing by projecting images to left/right eye through each lens e.g. 1024, see at least paragraphs [1439, 1472, 1687, 1692, 1694, 1705-07, 1713-1716]). However, Border teaches in the same field of invention of head-worn computing (HWC) systems and related methods (see Figs. 1-2, 14, 112-117, 165-171, 186-192, Title, Abstract, paragraphs [226-236, 740-762]) and further teaches adjusting (i.e. as adjusting lens elements e.g. 14624 for the head-worn display for flexible interpupillary spacing, e.g. paragraphs [647, 667], and as displayed images to left and right eye can be shifted laterally through digital shifting, thus changing the distance of a point, e.g. center point, on the left and right images presented to left and right eye of the user and the user's convergence distance, see paragraphs [647, 667, 760-762], Figs. 186-192) so that the right reference point and left reference point provide parallel beams to the user (i.e. as L, R images laterally shifted such that images from a point (e.g. center point) provide parallel lines of sight to each eye to provide for differences in interpupillary distance of different users, see paragraphs [760-762], Figs.
187-191), also that during the adjustment step, the distance between the right reference point and the left reference point is adjusted based on a virtual convergence distance (i.e. as lateral shift of L, R images to L and R eye of the user is based on convergence distance due to presented L, R images through the HWC system, as depicted in Figs. 187-192, thus providing desired convergence distance and adjustment for different interpupillary distances of users, and also allowing the focus distance to be the same as the convergence distance so that the focus cue associated with the focus distance is the same as the convergence cue and the user thereby is presented with a stereo image that has consistent stereo cues for a more comfortable viewing experience, see paragraphs [760-761]), and where the adjustment step comprises performing ray tracing for a first horizontal ray from the right eye through the right lens to the right reference point on the right image, performing ray tracing for a second horizontal ray from the left eye through the left lens to the left reference point on the left image (i.e. as lines of sight of the user are traced from each R, L eye of the user through lenses of the optical assembly to each point (e.g. center point) in each R, L image of the R, L display so that displayed images are shifted laterally through digital shifting of the image on the image source 110 that can change the convergence distance in viewing of stereo images and change the perceived distance to the displayed image, see Border, paragraphs [760-762]), and separating the left and right images on the left and right displays so that a distance between the left and right images corresponds to the separation distance between the left and right reference points (i.e. as images are separated corresponding to e.g.
center points for parallel lines of sight to each eye of the user, Border paragraphs [760-762], thus providing desired convergence distance and adjustment for different interpupillary distances of users, and also allowing the focus distance to be the same as the convergence distance so that the focus cue associated with the focus distance is the same as the convergence cue and the user thereby is presented with a stereo image that has consistent stereo cues for a more comfortable viewing experience, see paragraphs [760-761]). Therefore it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt the wearable AR device and methods of Samec to include adjustment of corrective lenses, and so that displayed images to the left and right eye can be shifted laterally through digital shifting, thus changing the distance of a point, e.g. center point, on the left and right images presented to the left and right eye of the user and the user's convergence distance, including infinity, where L, R reference points provide parallel beams to the user's L, R eyes based on the convergence distance due to presented L, R images through the HWC system, using ray tracing, i.e. lines of sight of the user are traced from each R, L eye of the user through lenses of the optical assembly to each point (e.g. center point) in each R, L image of the R, L display, as images are separated corresponding to e.g. center points for parallel lines of sight to each eye of the user, in order to provide the desired convergence distance and adjustment for different interpupillary distances of users, and also to allow the focus distance to be the same as the convergence distance so that the focus cue associated with the focus distance is the same as the convergence cue and the user thereby is presented with a stereo image that has consistent stereo cues for a more comfortable viewing experience (see paragraphs [760-761]).
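The lateral-shift geometry Border is cited for can be illustrated with a simple worked example. This is a pinhole sketch that ignores the lenses; the symmetric eye model, function name, and numbers are assumptions for illustration, not taken from the references. With the eyes at x = ±IPD/2 and a display plane at distance D, steering both lines of sight at a convergence point at distance d places the reference points at separation s = IPD·(1 - D/d), which tends to the full interpupillary distance as d goes to infinity, i.e. parallel lines of sight.

```python
import math

def reference_point_separation(ipd_m: float, display_m: float, converge_m: float) -> float:
    """Separation (m) between L/R reference points on a display plane at
    distance display_m, for eyes ipd_m apart converging at converge_m.
    Each line of sight runs from an eye at x = +/- ipd/2 to the convergence
    point at (0, converge_m) and crosses the display plane at
    x = +/- (ipd/2) * (1 - display_m / converge_m)."""
    return ipd_m * (1.0 - display_m / converge_m)

ipd = 0.064  # assumed 64 mm interpupillary distance
print(reference_point_separation(ipd, 2.0, math.inf))  # parallel beams: 0.064
print(reference_point_separation(ipd, 2.0, 4.0))       # converging at 4 m: 0.032
```

In this model, "parallel beams to the user" corresponds exactly to setting the reference-point separation equal to the interpupillary distance.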
Additionally, in the alternative that Samec alone does not disclose scaling the right and left images so that both images are seen with a same angular size by the user, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt the wearable AR device and methods of Samec to include scaling the right and left images so that both images are seen with a same angular size by the user, i.e. that the images are presented having the same angular size (as disclosed in Border, e.g. the stereo image presented to the user can be scaled over a portion or the entire image to adjust disparity and/or stereo depth over the entire image, see paragraphs [667, 741-744, 752]), according to the teaching of Border, in order to adjust disparity of the presented stereo image and present the user with adjusted stereo depth over the entire image viewed by the user, and provide comfortable viewing conditions for the user (see Border, paragraphs [741-744, 752]). Further, under the narrower reading of the term "performing ray tracing" as involving numerical calculations, which the Samec-Border combination does not explicitly mention (i.e. in the combination Border teaches that lines of sight of the user are traced from each R, L eye of the user through lenses of the optical assembly to each point (e.g. center point) in each R, L image of the R, L display so that displayed images are shifted laterally through digital shifting of the image on the image source 110 that can change the convergence distance in viewing of stereo images and change the perceived distance to the displayed image, see Border, paragraphs [760-762]), Qi teaches in the same field of invention of an eyeglasses-wearing simulation method, device and system (see Figs. 1-10, Title, Abstract, paragraphs [02, 6-18, 32-42, 50-54], where images are presented separately through L, R lenses to L, R eyes of a patient, e.g. Figs. 3-6) and further explicitly teaches performing ray tracing (i.e.
as the simulation program performs ray tracing of binocular vision for all the intersection points of the sample points with respect to the axes, which enables calculations of distortion and blur for each intersection point, paragraphs [68-72, 80]). Therefore it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to apply the numerical ray-tracing calculation method of Qi to the adjusting step for the displayed images and adjustable lenses of the Samec-Border combination, in order to additionally enable calculations of distortion and blur for each intersection point in the visual field (see paragraphs [68-72, 80]). Regarding claim 5, the Samec-Border-Qi combination teaches the invention and the method as set forth above, and Samec teaches (see e.g. Figs. 2-15) that the display device (e.g. 1400, Figs. 5, 10B-E, 14) comprises a light field display and during the modifying step at least one parameter of the light field display is modified so as to modify the virtual display distance of the perceived image (i.e. as images displayed/projected with incremental modification(s) with adaptive optics (VFE) light modulators, paragraphs [699, 1457, 1484, 1497, 1685-90]). Regarding claim 6, the Samec-Border-Qi combination teaches the invention and the method as set forth above, and Samec teaches (see e.g. Figs. 2-15) that the binocular display device (e.g. 1400, Figs. 5, 10B-E, 14) includes means for selecting the virtual display distance among a set of predetermined virtual distances (i.e. as 1400 with interface features 1404 is used to project image light at different focal planes and distances, see paragraphs [1691-96, 1700-04, 1710-12, 1503]) and the virtual display distance of the perceived image is obtained from use of at least one of the predetermined virtual distances (i.e. as 1400 used in 1500 to project image light at different focal planes and distances, see paragraphs [1691-96, 1700-04, 1710-12, 1503]).
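As context for the "numerical calculations" reading of ray tracing discussed above, a minimal paraxial (small-angle) ray trace through a thin lens can be sketched as follows. This is an illustrative textbook model only, not Qi's actual simulation; the function names and values are assumptions.

```python
def propagate(y: float, u: float, t: float) -> tuple[float, float]:
    """Free-space transfer: ray height y advances by slope u over distance t."""
    return y + u * t, u

def thin_lens(y: float, u: float, f: float) -> tuple[float, float]:
    """Thin-lens refraction: slope changes by -y/f, height is unchanged."""
    return y, u - y / f

# Trace a horizontal ray (slope 0) at height 5 mm through a lens with
# f = 50 mm, then propagate one focal length: it crosses the optical axis,
# the paraxial analogue of tracing a horizontal ray from the eye to a
# reference point on the image.
y, u = thin_lens(5.0, 0.0, 50.0)   # after the lens: y = 5.0, u = -0.1
y, u = propagate(y, u, 50.0)       # at the focal plane: y = 0.0
print(y, u)
```

Repeating such transfer/refraction steps per sample ray is what allows numerical quantities such as distortion and blur to be computed at each intersection point.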
Regarding claim 7, the Samec-Border-Qi combination teaches the invention and the method as set forth above, and Samec teaches (see e.g. Figs. 2-15) that the binocular display device (e.g. 1400, Figs. 5, 10B-E, 14) further includes an optical element having adjusting means to adjust the power of the optical element (i.e. as images displayed/projected with incremental modification(s) with adaptive optics (VFE) with adjustable/variable optical power/focus, paragraphs [1481, 87, 1496-97, 1685-86, 93, 98, 1710]). Regarding claim 8, the Samec-Border-Qi combination teaches the invention and the method as set forth above, and Samec teaches (see e.g. Figs. 2-15) that the binocular display device is a head mounted binocular display device (i.e. as e.g. 1400, 62 is a head mounted binocular device, depicted in e.g. Figs. 5, 14, paragraphs [1509-10, 1496-97, 1685-93, 1713]). Regarding claim 9, the Samec-Border-Qi combination teaches the invention and the method as set forth above, and Samec teaches (see e.g. Figs. 2-15) that the image subjective quality relates to sharpness of the perceived image (i.e. as image is focused on retina of wearer/user and is clear, paragraphs [1476, 1687, 1695-97, 1702-03, 1710]). Regarding claim 10, the Samec-Border-Qi combination teaches the invention and the method as set forth above, and Samec teaches (see e.g. Figs. 2-15) that the method further comprises an astigmatism determining step (i.e. as 1400 device and method 1500 include testing and determination for astigmatism, paragraphs [1480, 1486, 1689, 1710, 1750]) including displaying an image or a plurality of images comprising similar elements with different orientations to the user and determining orientations corresponding to the subjective sharpest perceived images (i.e. as by incrementally changing the axis of cylinder in determination of astigmatism, paragraphs [1480, 1486, 1689, 1710, 1750]).
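The claim 5-7 mappings all turn on movable or adjustable optics changing the effective vergence of light reaching the eye. In vergence terms this is the thin-lens relation V_out = P + V_in: for a display at distance s behind a lens of power P diopters, V_in = -1/s, so V_out = P - 1/s, and when V_out is negative the virtual display distance is 1/|V_out|. A small sketch under these textbook assumptions (the function name and the 20 D / 50 mm numbers are illustrative, not taken from the references):

```python
def exit_vergence(lens_power_D: float, display_dist_m: float) -> float:
    """Vergence (diopters) of light leaving a thin lens of power lens_power_D
    with the display display_dist_m behind it. 0 D means collimated light,
    i.e. a virtual image at infinity."""
    return lens_power_D - 1.0 / display_dist_m

# 20 D lens (f = 50 mm) with the display exactly at the focal plane:
print(exit_vergence(20.0, 0.050))  # 0.0 D -> virtual image at infinity

# Move the display 10 mm closer to the lens (40 mm away):
v = exit_vergence(20.0, 0.040)     # -5.0 D
print(1.0 / abs(v))                # virtual image 0.2 m in front of the lens
```

Because the displacement enters the formula continuously, sliding the lens or display sweeps the virtual display distance continuously, matching the "continuous variation ... achieved by displacement of the lenses" limitation discussed above.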
Regarding claim 11, the Samec-Border-Qi combination teaches the invention and the method as set forth above, and Samec teaches (see e.g. Figs. 2-15) that the perceived image subjective quality relates to contrast of the perceived image (i.e. as perceived image is focused on retina of wearer/user with given contrast, paragraphs [1591, 1687, 1695-97, 1702-03, 1710, 2101-04]). Regarding claim 17, the Samec-Border-Qi combination teaches the invention and the method as set forth above, and Samec teaches (see e.g. Figs. 2-15) further comprising providing different images at different distances from a reference position, and for each image determining whether the user can detect if the distance is shorter or greater than the reference position (i.e. by treating both eyes together or individually due to varying depth planes, e.g. paragraphs [1684, 1691, 1694, 1698-1703, 1809-1813], Figs. 5, 14, 18). Regarding claim 18, the Samec-Border-Qi combination teaches the invention and the method as set forth above, and Samec teaches (see e.g. Figs. 2-15) further comprising determining, using an eye tracker (i.e. as system provides eye tracking and visual fields testing, paragraphs [1457, 1472, 1490, 1705, 2106-10], also as Border's eye camera eye tracking system), whether the user is seeing in peripheral or central vision (i.e. as eye tracking system used to detect focusing point, convergence point, vergence and accommodation and due to visual field test, paragraphs [1457, 1472, 1490, 1705, 2106-10], Figs. 5, 10D). Regarding claim 21, the Samec-Border-Qi combination teaches the invention and the method as set forth above, and Samec teaches (see e.g. Figs. 2-15, the display device 1400, Figs.
5, 10B-E, 14, with images displayed/projected with adaptive optics (VFE) including refractive lens with moving elements, deforming moving parts, paragraphs [1496-97, 1685-94, 1701-02]) wherein the step of adjusting the separation distance between the left and right reference points includes keeping a convergence angle of the right eye identical to a convergence angle of the left eye, regardless of the position of the left and right movable lenses (i.e. as the left/right eye testing for depth planes e.g. near, intermediate, far depth planes depending on displayed/projected images, and given that for the far depth plane the displayed images for the left and right eye are far, at infinity, such that the provided light beams are parallel with the associated far distance plane vergence of the left/right eye, as using software processed projected images, and using focused most collimated images given pupil constriction state, and since the convergence angles for left and right eye lie parallel in the same plane, see e.g. paragraphs [1713-1716, 1694], Figs. 10-12, 14-15), wherein the convergence angle of the right eye is a horizontal angle between a forward direction of the user and a gaze direction of the right eye of the user and the convergence angle of the left eye is a horizontal angle between the forward direction of the user and a gaze direction of the left eye of the user (i.e. as angle of convergence, as determined based on the position of the left and right eye of the wearer, e.g. paragraphs [176, 1546, 1560, 1767-1770], see also Border, Figs. 186-192 and descriptions). Regarding claim 22, the Samec-Border-Qi combination teaches the invention and the method as set forth above, and Samec teaches (see e.g. Figs. 2-15) that the method further comprises providing an accommodation of the user (i.e.
as device 1400 and method 1500 include an accommodation relaxing step, since the accommodative state of the eye is monitored and determined, as displaying images at different depth planes including at infinity and far distances, and detection of relaxed accommodation, provided by images at far depth planes, effectively at infinity, as detailed in paragraphs [1687, 1692-1694, 1698, 1705, 1823-24, 1832], e.g. Fig. 15, as detected relaxed accommodation, paragraphs [1705-04, 1687, 1692, 1823-24, 1832]) by displaying images that provide a visual impression of diverging or displaying an image in perspective (i.e. as displaying images at far distance depth planes, for achieving and detecting relaxed accommodation, through displaying images at far depth planes, effectively at infinity, as detailed in paragraphs [1687, 1692-1694, 1698, 1705, 1823-24, 1832], e.g. Fig. 15, also Border, Figs. 186-192 and descriptions).

Response to Arguments

Applicant's arguments filed in the Remarks dated 07/14/2025 regarding claim 1 and its dependent claims have been fully considered but are not persuasive.
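The claim 21 mapping above defines the convergence angle of each eye as the horizontal angle between the user's forward direction and that eye's gaze direction. For a fixation point straight ahead on the median plane, that angle is arctan((IPD/2)/d) and, by symmetry, identical for both eyes; it goes to zero (parallel gaze) as the fixation distance goes to infinity. A small sketch under that symmetric-fixation assumption (names and numbers are illustrative, not from the references):

```python
import math

def convergence_angle_deg(ipd_m: float, fixation_m: float) -> float:
    """Horizontal angle between the forward direction and one eye's gaze,
    for a fixation point straight ahead at distance fixation_m.
    By symmetry, the left- and right-eye angles are identical."""
    return math.degrees(math.atan((ipd_m / 2.0) / fixation_m))

print(convergence_angle_deg(0.064, 0.4))       # near target: about 4.6 degrees
print(convergence_angle_deg(0.064, math.inf))  # far depth plane: 0.0 (parallel)
```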
Specifically, Applicant argues on pages 7-11 that the cited references of Samec and Border do not disclose the new features of claim 1, namely the new limitation for moving the left and right lenses so that "the variation in the virtual display distance being continuous and being achieved by displacement of the lenses so as to simulate different focal distances," and further (1) "modifying at least one parameter of the binocular display device" including all three prongs (1)-(3) and the new amendments to claim 1 of the calibration step, since Samec discloses other features, such as an adaptable optics element configured to apply a wavefront correction, without disclosing ray tracing, scaling of the left and right images so as to be seen with the same angular size by the user, or that the modification steps can be repeated, and because Border discloses other features of a head-worn display, but not "modifying at least one parameter of the binocular display device" including any of the three prongs (1)-(3) or the calibration step. The Examiner respectfully disagrees. With respect to the above issue, as noted in the rejection above, the cited prior art of Samec teaches most of the limitations of claim 1 and in combination with Border and Qi teaches and renders obvious all limitations of claim 1, as Samec teaches (see e.g. Figs. 2-15) a method for determining an eye parameter of a user of a binocular display device including a left display and a right display, the eye parameter relating to a dioptric parameter of an ophthalmic lens to be provided to the user (i.e. as wearable AR device and method e.g. 62, 100, 1400, 1500 of ophthalmic system functioning as a phoropter or refractor to determine a suitable refraction to correct/improve vision of a wearer/patient, and includes adaptive optics (VFE) including refractive lenses for left and right eye, paragraphs [03, 1416-18, 1443, 1473-1487, 1492, 1496-97, 1510-11, 1685-96, 1701-02, 711-1717], see e.g. Figs.
3-5, 10A-12, 14-15), the method (e.g. 1400, 1500) comprising: displaying a left image and a right image to the user when using the binocular display device (i.e. as image displayed, projected to the user eyes, paragraphs [1700-01, 1487, 1510-11, 1685-91], see e.g. Figs. 5, 10-12, 14-15), wherein the binocular display device comprises left and right moveable lenses through which the user sees the displayed left and right images, respectively (i.e. as images displayed/projected with adaptive optics (VFE) including refractive lenses for left and right eye with moving elements, deforming moving parts, paragraphs [1487, 1496-97, 1506, 1615, 1685-94, 1701-02]), and a content of the right image is different from a content of the left image (i.e. as VFE provides images with different content, e.g. paragraphs [1648, 1663-64, 1680]); modifying at least one parameter of the binocular display device (i.e. image displayed/projected with incremental modification(s) at different virtual distances and changes with adaptive optics and display, paragraphs [1487, 1702-03, 1648, 1496, 1506, 1615, 1685-94, 1710-11], Figs. 10-12, 14-15) by (1) moving the left movable lens and the right movable lens along an axis toward or away from the left image and the right image, respectively (i.e. as adaptable optics include moving lens elements, e.g. to vary/alter focus of projected images, paragraphs [1487, 1496, 1506, 1615], Figs. 10-12, 14-15), (2) adjusting a separation distance between a right reference point of the right image seen by the right eye of the user and a left reference point of the left image seen by the left eye of the user (i.e. due to left/right eye testing, i.e. as system is projecting images to left/right eye, at depth planes e.g. near, intermediate, far as system adjusts for inter-pupillary distance (IPD), and using focused most collimated images given pupil size, constriction state, i.e.
the vergence, dynamic accommodation, and optical alignment to user's eyes, as described in e.g. paragraphs [1713-1716, 1439, 1687]), and (3) scaling the right and left images so that both images are seen with a same angular size by the user (i.e. by treating both eyes together with binocular system 1400, by presenting varying sizes of images and at varying depth planes, e.g. paragraphs [1684, 1691, 1694, 1698-1703], Figs. 5, 10A, 14), wherein the modifying step is repeated until image subjective quality of a perceived image is perceived by the user as optimal (i.e. as until user/wearer can view the image comfortably e.g. through biofeedback, see paragraphs [376, 1703-04, 1710-16, 1682-92, 1526-28, 1571, 1604, 1633, 1707-1713], as clearly depicted by dashed arrows in Figs. 10A, 11, 12, 15); and determining the eye parameter based on the modified at least one parameter of the binocular display device (i.e. as determined eye prescription of the user/wearer, see paragraphs [1475, 1487, 1685-91, 1699, 1713-17], see e.g. Figs. 5, 10A, 14-15, and including treating both eyes together or individually, e.g. paragraphs [1699-1703, 1713-16, 1807-08], Figs. 5, 14), wherein the displaying step and the modifying step are implemented in binocular vision (i.e. by treating both eyes together using binocular system 1400, e.g. paragraphs [1699-1703, 1713-16, 1807-08], Figs. 5, 12, 14), and wherein the method further comprises a calibration step including determining a correlation between the at least one parameter of the binocular display device modified during the modifying step and a virtual display distance, the calibration step comprising using a device incorporating a camera (i.e. as system 1400 performing method 1500 includes calibration for determining proper diopter and/or account for the proper diopter correction, based at least in part on the configuration of the display platform of 1400, 62, and given that during calibration, the iris of the wearer is analyzed, as camera e.g.
inward facing camera of the device, is used, paragraphs [1721, 1708, 1705]), and the step of determining the eye parameter further comprises determining the eye parameter of the user using said correlation (i.e. as calibration is used for proper diopter correction and prescription (diopter) for a corrective lens for the wearer/user, see paragraphs [1721, 1687-92, 1699-1703, 1710-13]), and wherein the moving of the left and right lenses varies the virtual display distance of the perceived image by changing an effective vergence of rays entering the eyes (i.e. as adaptive-optic moving lens elements vary and alter focus of projected images, changing the vergence of the wearer's eyes when the focus/diopter is changed, see paragraphs [1487, 1496, 1506, 1588, 1615, 1707-1712], Figs. 10-12, 14-15), the variation in the virtual display distance being continuous and being achieved by displacement of the lenses so as to simulate different focal distances (i.e. as adaptive-optic moving lens elements, e.g. to vary/alter focus (dioptric power) of projected images as a continuous change of focus/diopter, paragraphs [1487, 1496, 1506, 1615, 1712], Figs. 10-12, 14-15). Samec does not explicitly disclose that the adjusting is performed so that the right reference point and the left reference point provide parallel beams to the user, where the adjustment step comprises performing ray tracing for a first horizontal ray from the right eye through the right lens to the right reference point on the right image, performing ray tracing for a second horizontal ray from the left eye through the left lens to the left reference point on the left image, and separating the left and right images on the left and right displays so that a distance between the left and right images corresponds to the separation distance between the left and right reference points (i.e. as it is noted that Samec includes left/right eye testing by projecting images to the left/right eye through each lens e.g.
1024, see at least paragraphs [1439, 1472, 1687, 1692, 1694, 1705-07, 1713-1716]). However, Border teaches in the same field of invention of head-worn computing (HWC) systems and related methods (see Figs. 1-2, 14, 112-117, 165-171, 186-192, Title, Abstract, paragraphs [226-236, 740-762]) and further teaches adjusting (i.e. as adjusting lens elements e.g. 14624 for the head-worn display for flexible interpupillary spacing, e.g. paragraphs [647, 667], and as displayed images to the left and right eye can be shifted laterally through digital shifting, thus changing the distance of a point, e.g. center point, on the left and right image presented to the left and right eye of the user and the user's convergence distance, see paragraphs [647, 667, 760-762], Figs. 186-192) so that the right reference point and left reference point provide parallel beams to the user (i.e. as L, R images are laterally shifted such that images from a point (e.g. center point) provide parallel lines of sight to each eye to provide for differences in interpupillary distance of different users, see paragraphs [760-762], Figs. 187-191), also that during the adjustment step, the distance between the right reference point and the left reference point is adjusted based on a virtual convergence distance (i.e. as the lateral shift of L, R images to the L and R eye of the user is based on convergence distance due to presented L, R images through the HWC system, as depicted in Figs.
187-192, thus providing desired convergence distance and adjustment for different interpupillary distances of users, and also allowing the focus distance to be the same as the convergence distance so that the focus cue associated with the focus distance is the same as the convergence cue and the user thereby is presented with a stereo image that has consistent stereo cues for a more comfortable viewing experience, see paragraphs [760-761]), and where the adjustment step comprises performing ray tracing for a first horizontal ray from the right eye through the right lens to the right reference point on the right image, performing ray tracing for a second horizontal ray from the left eye through the left lens to the left reference point on the left image (i.e. as lines of sight of the user are traced from each R, L eye of the user through lenses of the optical assembly to each point (e.g. center point) in each R, L image of the R, L display so that displayed images are shifted laterally through digital shifting of the image on the image source 110, which can change the convergence distance in viewing of stereo images and change the perceived distance to the displayed image, see Border, paragraphs [760-762]), and that separating the left and right images on the left and right displays so that a distance between the left and right images corresponds to the separation distance between the left and right reference points (i.e. as images are separated corresponding to e.g. center points for parallel lines of sight to each eye of the user, Border paragraphs [760-762], thus providing desired convergence distance and adjustment for different interpupillary distances of users, and also allowing the focus distance to be the same as the convergence distance so that the focus cue associated with the focus distance is the same as the convergence cue and the user thereby is presented with a stereo image that has consistent stereo cues for a more comfortable viewing experience, see paragraphs [760-761]).
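For orientation only, the lateral-shift geometry Border describes can be reduced to a small numeric sketch (not code from any cited reference; the function name, the collimator focal length, and the IPD and distance values are all assumed): with collimating optics of focal length f per eye, a lateral image offset x steers the exiting beam by atan(x/f), so matching the convergence half-angle atan((IPD/2)/D) for a virtual convergence distance D gives the required inward image shift, which goes to zero (parallel beams, reference points separated by exactly the IPD) as D goes to infinity.

```python
import math

def inward_shift_mm(ipd_mm: float, focal_mm: float, conv_dist_mm: float) -> float:
    """Per-eye inward image shift so lines of sight converge at conv_dist_mm.

    Assumed model: collimating optics of focal length focal_mm per eye;
    a lateral image offset x steers the exiting beam by atan(x / focal_mm).
    """
    if math.isinf(conv_dist_mm):
        return 0.0  # parallel beams: no shift, points sit on the optical axes
    half_angle = math.atan((ipd_mm / 2) / conv_dist_mm)  # convergence half-angle
    return focal_mm * math.tan(half_angle)

# Example with assumed values: 64 mm IPD, 20 mm collimator, 500 mm convergence
shift = inward_shift_mm(64.0, 20.0, 500.0)          # ~1.28 mm per eye
sep_at_infinity = inward_shift_mm(64.0, 20.0, math.inf)  # exactly 0.0
```

Under these assumed values the shift works out to about 1.28 mm per eye, and at infinite convergence distance it is exactly zero, matching the parallel-lines-of-sight case discussed above.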
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt the wearable AR device and methods of Samec to include adjustment of corrective lenses, and adjusting displayed images to the left and right eye that can be shifted laterally through digital shifting, thus changing the distance of a point, e.g. center point, on the left and right image presented to the left and right eye of the user and the user's convergence distance, including infinity, where L, R reference points provide parallel beams to the user's L, R eye based on convergence distance due to presented L, R images through the HWC system, using ray tracing, i.e. lines of sight of the user are traced from each R, L eye of the user through lenses of the optical assembly to each point (e.g. center point) in each R, L image of the R, L display, as images are separated corresponding to e.g. center points for parallel lines of sight to each eye of the user, in order to provide for desired convergence distance and adjustment for different interpupillary distances of users, and also to allow the focus distance to be the same as the convergence distance so that the focus cue associated with the focus distance is the same as the convergence cue and the user thereby is presented with a stereo image that has consistent stereo cues for a more comfortable viewing experience (see paragraphs [760-761]). Additionally, in the alternative, to the extent that Samec alone does not disclose scaling the right and left images so that both images are seen with a same angular size by the user, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt the wearable AR device and methods of Samec to include scaling the right and left images so that both images are seen with a same angular size by the user, i.e. that the images are presented having the same angular size (as disclosed in Border, e.g.
the stereo image presented to the user can be scaled over a portion or the entire image to adjust disparity and/or stereo depth over the entire image, see paragraphs [667, 741-744, 752]), according to the teaching of Border, in order to adjust disparity of the presented stereo image and present the user with adjusted stereo depth over the entire image viewed by the user, and provide comfortable viewing conditions for the user (see Border, paragraphs [741-744, 752]). Further, under a narrower reading, the term "performing ray tracing" may involve numerical calculations, which the Samec-Border combination does not explicitly mention (i.e. in the combination, Border teaches that lines of sight of the user are traced from each R, L eye of the user through lenses of the optical assembly to each point (e.g. center point) in each R, L image of the R, L display so that displayed images are shifted laterally through digital shifting of the image on the image source 110, which can change the convergence distance in viewing of stereo images and change the perceived distance to the displayed image, see Border, paragraphs [760-762]). However, Qi teaches in the same field of invention of an eyeglasses-wearing simulation method, device and system (see Figs. 1-10, Title, Abstract, paragraphs [02, 6-18, 32-42, 50-54], where images are presented separately through L, R lenses to L, R eyes of the patient, e.g. Figs. 3-6) and further explicitly teaches performing ray tracing (i.e. as the simulation program performs ray tracing of binocular vision for all the intersection points of the sample points with respect to the axes, which enables calculations of distortion and blur for each intersection point, paragraphs [68-72, 80]).
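The kind of numerical ray tracing at issue can be illustrated with a minimal paraxial sketch (an illustration under assumed values, not Qi's actual program; the function name and numbers are hypothetical): a ray entering parallel to the axis at height h acquires slope -h/f at a thin lens of focal length f, and its height after propagating a distance z is h - (h/f)z, which is how intersection points on an image plane are computed numerically.

```python
def trace_horizontal_ray(h_mm: float, f_mm: float, z_mm: float) -> tuple[float, float]:
    """Paraxial trace of a ray entering parallel to the axis at height h_mm.

    Thin-lens transfer: the slope after the lens is u' = -h/f, and after
    propagating a distance z_mm the ray height is h + u' * z.
    Generic illustration of numerical ray tracing; not from any cited reference.
    """
    u_after = -h_mm / f_mm               # slope change at the thin lens
    y_at_plane = h_mm + u_after * z_mm   # ray height at the chosen plane
    return y_at_plane, u_after

# A horizontal ray at 5 mm through a 50 mm lens crosses the axis at the
# focal plane, z = 50 mm (y comes back essentially zero).
y, u = trace_horizontal_ray(5.0, 50.0, 50.0)
```

Repeating such a trace for many sample points, and comparing where each ray lands against an ideal reference, is what enables the per-intersection distortion and blur calculations attributed to Qi.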
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to apply the numerical-calculation ray-tracing method of Qi to the adjusting step for displayed images and adjustable lenses of the Samec-Border combination, in order to additionally enable calculations of distortion and blur for each intersection point in the visual field (see paragraphs [68-72, 80]). As noted above, Samec expressly teaches that the moving of the left and right lenses varies the virtual display distance of the perceived image by changing an effective vergence of rays entering the eyes (i.e. as adaptive-optic moving lens elements vary and alter focus of projected images, changing the vergence of the wearer's eyes when the focus/diopter is changed, see paragraphs [1487, 1496, 1506, 1588, 1615, 1707-1712], Figs. 10-12, 14-15), the variation in the virtual display distance being continuous and being achieved by displacement of the lenses so as to simulate different focal distances (i.e. as adaptive-optic moving lens elements, e.g. to vary/alter focus (dioptric power) of projected images as a continuous change of focus/diopter, paragraphs [1487, 1496, 1506, 1615, 1712], Figs. 10-12, 14-15). Further, specifically, Samec is directed to methods and systems for eye diagnostics and treatments, including vision correction. Samec teaches measuring, using an eye tracker of the binocular display, e.g. 1400, as the device used by the user (i.e. using and measuring eye gaze of the user, e.g. via an eye-tracking system, e.g. paragraphs [206, 1457, 1472, 1490, 1705, 2106-10]). Samec expressly teaches displaying a left image and a right image to the user when using the binocular display device (i.e. as image displayed, projected to the user eyes, paragraphs [1700-01, 1487, 1510-11, 1685-91], see e.g. Figs.
5, 10-12, 14-15), wherein the binocular display device comprises left and right moveable lenses through which the user sees the displayed left and right images, respectively (i.e. as images displayed/projected with adaptive optics (VFE) including refractive lenses for left and right eye with moving elements, deforming moving parts, paragraphs [1487, 1496-97, 1506, 1615, 1685-94, 1701-02]), and the modifying step is repeated until the subjective quality of a perceived image is perceived by the user as optimal, i.e. as until the user/wearer can view the image comfortably, e.g. through biofeedback, see paragraphs [1703-04, 1710-16, 1682-92, 1526-28, 1571, 1604, 1633], as clearly depicted by loops in the diagrams and by arrows in Figs. 10A, 11, 12, 15. Further, Samec teaches that the method further comprises a calibration step including determining a correlation between the at least one parameter of the binocular display device modified during the modifying step and a virtual display distance, the calibration step comprising using a device incorporating a camera (i.e. as system 1400 performing method 1500 includes calibration for determining proper diopter and/or accounting for the proper diopter correction, based at least in part on the configuration of the display platform of 1400, 62, and given that during calibration, the iris of the wearer is analyzed, as a camera, e.g. an inward facing camera of the device, is used, paragraphs [1721, 1708, 1705]), and the step of determining the eye parameter further comprises determining the eye parameter of the user using said correlation (i.e. as calibration is used for proper diopter correction and prescription (diopter) for a corrective lens for the wearer/user, see paragraphs [1721, 1687-92, 1699-1703, 1710-13]). It is also noted that no additional details of the calibration step, including measurement or calibrations (see e.g. paragraphs 140-156 of the published application), are recited.
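The recited correlation between a display-device parameter and a virtual display distance can be illustrated with the thin-lens magnifier relation (a generic optics sketch with assumed values, not taken from Samec): for a display a distance a inside the focal length f of a positive eyepiece, the virtual image sits at a·f/(f - a) in front of the lens, so displacing the lens (changing a) continuously sweeps the virtual display distance, approaching infinity (collimated output) as a approaches f.

```python
def virtual_distance_mm(a_mm: float, f_mm: float) -> float:
    """Virtual image distance for a display a_mm inside the focal length f_mm
    of a positive eyepiece (thin-lens magnifier case, |v| = a*f / (f - a)).
    Returns the distance of the virtual image in front of the lens."""
    assert 0 < a_mm < f_mm, "display must sit inside the focal length"
    return a_mm * f_mm / (f_mm - a_mm)

# Sweeping the lens-to-display gap sweeps the virtual display distance:
# with an assumed f = 50 mm eyepiece, a = 25 mm -> 50 mm, a = 40 mm -> 200 mm,
# a = 45 mm -> 450 mm; a -> f drives the virtual image toward infinity.
table = {a: virtual_distance_mm(a, 50.0) for a in (25.0, 40.0, 45.0)}
```

A calibration step of the kind recited would amount to tabulating such a parameter-to-distance mapping for the actual hardware rather than from a formula.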
Given, as noted, that Samec does not explicitly disclose that the adjusting is performed so that the right reference point and the left reference point provide parallel beams to the user, where the adjustment step comprises performing ray tracing for a first horizontal ray from the right eye through the right lens to the right reference point on the right image, performing ray tracing for a second horizontal ray from the left eye through the left lens to the left reference point on the left image, and separating the left and right images on the left and right displays so that a distance between the left and right images corresponds to the separation distance between the left and right reference points (i.e. as it is noted that Samec includes left/right eye testing by projecting images to the left/right eye through each lens e.g. 1024, and system adjustment given and based on vergence and accommodation changes, see at least paragraphs [1439, 1472, 1687, 1692, 1694, 1705-07, 1713-1716]), the cited prior art of Border was therefore used, as Border teaches in the same field of invention of head-worn computing (HWC) systems and related methods (see Figs. 1-2, 14, 112-117, 165-171, 186-192, Title, Abstract, paragraphs [226-236, 740-762]) and further teaches adjusting (i.e. as adjusting lens elements e.g. 14624 for the head-worn display for flexible interpupillary spacing, e.g. paragraphs [647, 667], and as displayed images to the left and right eye can be shifted laterally through digital shifting, thus changing the distance of a point, e.g. center point, on the left and right image presented to the left and right eye of the user and the user's convergence distance, see paragraphs [647, 667, 760-762], Figs. 186-192) so that the right reference point and left reference point provide parallel beams to the user (i.e. as L, R images are laterally shifted such that images from a point (e.g.
center point) provide parallel lines of sight to each eye to provide for differences in interpupillary distance of different users, see paragraphs [760-762], Figs. 187-191), and where the adjustment step comprises performing ray tracing for a first horizontal ray from the right eye through the right lens to the right reference point on the right image, performing ray tracing for a second horizontal ray from the left eye through the left lens to the left reference point on the left image (i.e. as lines of sight of the user are traced from each R, L eye of the user through lenses of the optical assembly to each point (e.g. center point) in each R, L image of the R, L display, Border, paragraphs [760-762]), and that separating the left and right images on the left and right displays so that a distance between the left and right images corresponds to the separation distance between the left and right reference points (i.e. as images are separated corresponding to e.g. center points for parallel lines of sight to each eye of the user, Border paragraphs [760-762], thus providing desired convergence distance and adjustment for different interpupillary distances of users, and also allowing the focus distance to be the same as the convergence distance so that the focus cue associated with the focus distance is the same as the convergence cue and the user thereby is presented with a stereo image that has consistent stereo cues for a more comfortable viewing experience, see paragraphs [760-761]). Therefore, it was noted that it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt the wearable AR device and methods of Samec to include adjustment of corrective lenses, and adjusting displayed images to the left and right eye that can be shifted laterally through digital shifting, thus changing the distance of a point, e.g.
center point, on the left and right image presented to the left and right eye of the user and the user's convergence distance, including infinity, where L, R reference points provide parallel beams to the user's L, R eye based on convergence distance due to presented L, R images through the HWC system, using ray tracing, i.e. lines of sight of the user are traced from each R, L eye of the user through lenses of the optical assembly to each point (e.g. center point) in each R, L image of the R, L display, as images are separated corresponding to e.g. center points for parallel lines of sight to each eye of the user, in order to provide for desired convergence distance and adjustment for different interpupillary distances of users, and also to allow the focus distance to be the same as the convergence distance so that the focus cue associated with the focus distance is the same as the convergence cue and the user thereby is presented with a stereo image that has consistent stereo cues for a more comfortable viewing experience (see paragraphs [760-761]). Regarding the "scaling" limitations, Samec teaches scaling the right and left images so that both images are seen with a same angular size by the user (i.e. by treating both eyes together due to varying sizes of images at varying depth planes, e.g. paragraphs [1684, 1691, 1694, 1698-1703], Figs. 5, 14).
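The same-angular-size condition can likewise be checked numerically (generic geometry with assumed values, not drawn from the references): an image of height h at virtual distance d subtends 2·atan(h/(2d)), so holding angular size constant across depth planes requires scaling the image height in exact proportion to the distance (h2 = h1·d2/d1 keeps the half-angle tangent, and hence the angle, unchanged).

```python
import math

def angular_size_deg(height_mm: float, distance_mm: float) -> float:
    """Angular subtense of an image of the given height at the given distance."""
    return math.degrees(2 * math.atan((height_mm / 2) / distance_mm))

def scaled_height_mm(h1_mm: float, d1_mm: float, d2_mm: float) -> float:
    """Height at distance d2 preserving the angular size held at distance d1."""
    return h1_mm * d2_mm / d1_mm  # proportional scaling keeps h/(2d) constant

# Moving a 100 mm-tall image from an assumed 1 m to 2 m virtual distance:
# doubling the height keeps the angular size identical for the viewer.
h2 = scaled_height_mm(100.0, 1000.0, 2000.0)
```

This is the geometric sense in which images presented at varying depth planes can nonetheless be seen with the same angular size.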
However, as previously explained, in the alternative, since no details are recited or disclosed about the scaling step, to the extent that Samec alone does not disclose scaling the right and left images so that both images are seen with a same angular size by the user, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt the wearable AR device and methods of Samec to include scaling the right and left images so that both images are seen with a same angular size by the user, as disclosed in Border; specifically, the stereo image presented to the user can be scaled over a portion or the entire image to adjust disparity and/or stereo depth over the entire image (see paragraphs [667, 741-744, 752]), according to the teaching of Border, in order to adjust disparity of the presented stereo image and present the user with adjusted stereo depth over the entire image viewed by the user, and provide comfortable viewing conditions for the user (see Border, paragraphs [741-744, 752]).

Prosecution Timeline

Apr 26, 2019: Application Filed
Apr 26, 2019: Response after Non-Final Action
Apr 28, 2021: Non-Final Rejection (§103)
Jul 23, 2021: Response Filed
Jul 28, 2021: Final Rejection (§103)
Dec 02, 2021: Request for Continued Examination
Dec 03, 2021: Response after Non-Final Action
Dec 08, 2021: Examiner Interview Summary
Dec 08, 2021: Applicant Interview (Telephonic)
Feb 14, 2022: Non-Final Rejection (§103)
May 18, 2022: Response Filed
May 23, 2022: Final Rejection (§103)
Nov 28, 2022: Request for Continued Examination
Nov 30, 2022: Response after Non-Final Action
Dec 13, 2022: Applicant Interview (Telephonic)
Dec 13, 2022: Examiner Interview Summary
Mar 07, 2023: Non-Final Rejection (§103)
Jun 13, 2023: Response Filed
Jun 22, 2023: Final Rejection (§103)
Sep 28, 2023: Response after Non-Final Action
Oct 20, 2023: Request for Continued Examination
Oct 25, 2023: Response after Non-Final Action
Jan 02, 2024: Non-Final Rejection (§103)
Apr 05, 2024: Response Filed
Apr 21, 2024: Final Rejection (§103)
Jul 25, 2024: Request for Continued Examination
Aug 04, 2024: Response after Non-Final Action
Jan 27, 2025: Non-Final Rejection (§103)
Apr 30, 2025: Response Filed
May 09, 2025: Final Rejection (§103)
Jul 14, 2025: Request for Continued Examination
Jul 15, 2025: Response after Non-Final Action
Sep 22, 2025: Non-Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591106: CAMERA MODULE (granted Mar 31, 2026; 2y 5m to grant)
Patent 12578545: CAMERA MODULE (granted Mar 17, 2026; 2y 5m to grant)
Patent 12578544: OPTICAL ELEMENT DRIVING MECHANISM (granted Mar 17, 2026; 2y 5m to grant)
Patent 12572035: MOISTURE-RESISTANT EYE WEAR (granted Mar 10, 2026; 2y 5m to grant)
Patent 12554099: IMAGING OPTICAL LENS SYSTEM, IMAGE CAPTURING UNIT AND ELECTRONIC DEVICE (granted Feb 17, 2026; 2y 5m to grant)
Based on this examiner's 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 11-12
Grant Probability: 63%
With Interview: 72% (+8.7%)
Median Time to Grant: 3y 0m
PTA Risk: High
Based on 650 resolved cases by this examiner. Grant probability derived from career allow rate.
