Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
The instant application, having Application No. 18319436, filed on 06/08/2023, is presented for examination by the examiner.
Election/Restrictions
Applicant’s election of Invention group I, claims 1-10 in the reply filed on 09/15/2025 is acknowledged. Because applicant did not distinctly and specifically point out the supposed errors in the restriction requirement, the election has been treated as an election without traverse (MPEP § 818.01(a)). Claims 11-20 are withdrawn from further consideration pursuant to 37 CFR 1.142(b) as being drawn to a nonelected Invention group II, there being no allowable generic or linking claim.
Examiner Notes
The examiner cites particular columns and line numbers in the references as applied to the claims below for the applicant's convenience. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claims, other passages and figures may apply as well. It is respectfully requested that, in preparing responses, the applicant fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passages as taught by the prior art or disclosed by the examiner.
Drawings
The drawings submitted by the applicant are acceptable for examination purposes.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 7 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claim 7 recites the limitations "when detecting the trigger, the at least one processor (108, 308) is configured to:
determine a gaze direction of a dominant eye of the user and a gaze direction of a non-dominant eye of the user;
detect when a given criteria is satisfied, wherein the given criteria is satisfied when at least one of the following is true:
(i) the gaze direction of the dominant eye and the gaze direction of the non-dominant eye has not converged for at least a predefined time period,
(ii) the gaze direction of the dominant eye and the gaze direction of the non-dominant eye has not converged within a predefined error margin from each other". However, these limitations are confusing because it is unclear how they are to be understood and treated. Specifically, it is unclear how the processor can be configured to perform the recited sequential operations when it is unclear which eye is the dominant eye and which is the non-dominant eye of the user, how the eye dominance is determined, and whether there is any noticeable eye dominance for a given user. Further, the phrases "the gaze direction of the dominant eye and the gaze direction of the non-dominant eye has not converged" are also confusing, since it is unclear how the gaze direction of the dominant eye can converge or not converge, or how the gaze direction of the non-dominant eye can converge or not converge. Does this convergence or non-convergence apply to the gaze of each eye separately, or is the convergence (or vergence) defined by the gaze directions of both eyes together? Given that the dominant and non-dominant eyes are not defined or specified, the following claim limitations are all indefinite; for example, it cannot be detected whether the given criteria are satisfied or not satisfied. Lastly, it is unclear what happens in the alternative, if the criteria are not satisfied: what configuration procedure, if any, is the processor supposed to follow? For the purposes of examination, the claim limitations will be treated broadly, such that either eye of the user can be dominant or non-dominant, or there is no clear eye dominance, and such that, regardless of whether the criteria are satisfied, the processor can change, e.g., the optical power of the active optical element for a dominant-eye optical element and a non-dominant-eye optical element corresponding to either eye of the user, which can be either the dominant or the non-dominant eye, or either eye when there is no clear eye dominance.
It is suggested to amend the claim and provide explanations in order to remove the indefiniteness issues noted above.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-5 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Inoue et al. (hereafter Inoue) US 20140347623 A1.
In regard to independent claim 1, Inoue teaches (see Figs. 1-11) an optical apparatus (i.e. multifocal lenses, and methods, see abstract, paragraphs [02, 06-10, 22-31, 40-49, 51-61,65-76,88,93]) comprising:
eye-tracking means (sensor 110,170 sensing eye movement, gaze point sensing, eye blink movement, paragraphs [41-47, 65-76], Figs. 1,9,7);
an active optical element per eye (multifocal lens 160 with adjustable changeable focal length, paragraphs [40-47, 71-76], Figs. 1,9,7 );
controlling means (controller 150, paragraphs [41-48, 65-76], Figs. 1, 7, 9) for controlling one or more optical parameters of the active optical element (e.g. focal length of multifocal lens 160, paragraphs [40-47]); and
at least one processor (gazing point distance measuring section 120, focal length selection section 130, time duration determiner 140, controller 150, paragraphs [41-48,51-60, 65-76], Figs. 1,7,9) configured to:
detect a trigger for changing at least one of the one or more optical parameters of the active optical element (e.g. conditioned on gazing point distance, direction, focal length, and/or time duration, viewability index, 150 controls the focal length of multifocal lens 160, paragraphs [42-48, 65-72], Figs. 1, 7, 9);
process eye-tracking data, collected by the eye-tracking means, to detect a beginning of an eye blink of a user (as 150 refers to eye movement detector 110,170, and determines whether a blink of eye movement of the user is detected or not, paragraphs [46-48,70-72], Figs. 1,7,9); and
drive the controlling means to change the at least one of the one or more optical parameters of the active optical element during the eye blink (i.e. as 150 with inputs from 120,130,140 drives 160 e.g. changing focal length during the eye blink, paragraphs [41-47, 65-76], Figs. 1,9,7).
Regarding claim 2, Inoue teaches (see Figs. 1-11) that the one or more optical parameters of the active optical element (parameter of 160, focal length) comprise at least one of: at least one optical power to be produced (i.e. as controlled, set focal length of 160 i.e. inverse of optical power of 160, paragraphs [41-47, 65-76]), a position of at least one region of the active optical element in which the at least one optical power is to be produced (i.e. as controlled, set focal length in region of lens 160 of eyeglasses, paragraphs [40-47, 65-76, 93]), a shape of the at least one region (i.e. as controlled, set focal length in region with shape of lens 160 of eyeglasses, paragraphs [40-47, 65-76, 93]), an optical center of the at least one region, a shape of the active optical element (i.e. due to controlled focal length of lens 160 of e.g. eyeglasses, paragraphs [40-47, 65-76, 93]), a profile of the active optical element (i.e. due to controlled focal length of lens 160 of e.g. eyeglasses, paragraphs [40-47, 65-76, 93]), a transparency of at least a portion of the active optical element, a blur of at least a portion of the active optical element, a color of the active optical element, whether an astigmatism correction mode of the active optical element is switched on or off, whether a prismatic correction mode of the active optical element is switched on or off.
Regarding claim 3, Inoue teaches (see Figs. 1-11) that the at least one processor (120-150) is configured to: determine an optical depth at which the user is fixating (i.e. 120, 150 from data of 110, determines gazing point distance, paragraphs [42-50, 65-67]);
select a predefined optical power (130 selects focal length), from amongst a plurality of predefined optical powers, based on the optical depth at which the user is fixating (as 130 focal length selection, selects focal length from predetermined focal lengths, based on gazing point distance, e.g. paragraphs [66-68, 47-48], Figs. 1,7,9); and
detect a difference between the predefined optical power and a current optical power being produced by the active optical element (as 150 determines whether the selected focal length is different from a current focal length, as focal lengths and gazing point distances are compared by assigned “viewability” index, e.g. see paragraphs [66-69, 47-48, 59-62], Figs. 1,6-7,9),
wherein the at least one processor (120-150) is further configured to, when said difference is smaller than a predefined difference, drive the controlling means (150) to change the at least one of the one or more optical parameters of the active optical element during the eye blink, by performing a transition from the current optical power to the predefined optical power (i.e. as 150 drives 160 to change the focal length, Figs. 1, 7, 9, paragraphs [69-71]; where it is noted that the above-mentioned viewability index is calculated for the new gazing distance for both the present optical power and a calculated optimal optical power; a difference D between these two values is then evaluated and used to determine the "time duration" as a time limit threshold, paragraphs [41-53]; this time duration is the estimated time during which the user can be expected to tolerate the sub-optimal optical power: the higher D, the shorter the time duration; this time duration corresponds to the duration for which the system waits for the user to blink or make a saccade movement to perform the change of optical power; and if the duration has elapsed without such a blink being detected, the change is performed during normal vision; hence, for small differences between the two optical powers (as inverses of the focal lengths), the system will wait for a blink (long time duration), whereas for large differences the system switches without delay, paragraphs [53-56, 59-63, 65-71]).
Regarding claim 4, Inoue teaches (see Figs. 1-11) that the at least one processor (120-150) is configured to drive the controlling means (150) to perform the transition from the current optical power to the predefined optical power without a delay, when said difference is not smaller than the predefined difference (i.e. as noted for claim 3 above, 150 drives the change in the focal length of 160 for large differences in optical power, as the system switches without delay, e.g. step S16, paragraphs [53-56, 59-63, 65-71], Figs. 1, 7, 9).
Regarding claim 5, Inoue teaches (see Figs. 1-11) that the predefined difference lies in a range of 0.25 diopters to 3 diopters (i.e. focal length difference from 400 cm to 33.3 cm) (i.e. as differences in focal lengths, including example movements in gazing point differences with corresponding focal length differences e.g. 2.8m, 40 cm, 2.4 m, for movement A1-A2, B1-B2, C1-C2, see paragraphs [49-50,59-63], Figs. 2, 6).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Inoue et al. (hereafter Inoue) US 20140347623 A1 in view of Peloux et al. (hereafter Peloux) US 20210132414 A1.
Regarding claim 6, Inoue teaches (see Figs. 1-11) that when detecting the trigger, the at least one processor (e.g. as conditioned on gazing point distance, direction, focal length, and/or time duration, viewability index, 150 controls the focal length of multifocal lens 160, paragraphs [42-48, 65-72], Figs. 1, 7, 9) is configured to:
determine a gaze direction of a given eye of the user (i.e. as 120 measures and determines gazing point distance, and thus the direction of gazing point, paragraphs [42-48]), whilst determining an optical depth at which the user is fixating (as determines gazing point distance, paragraphs [42-48]);
select a predefined optical power (e.g. focal length), from amongst a plurality of predefined optical powers (predefined focal lengths, e.g. see Fig. 2), based on the optical depth at which the user is fixating (i.e. based on gazing point distance, as 110 provides gaze/eye data, 120 measures the distance of the gaze point, and 130 selects the focal length for 160, paragraphs [43-45, 65-66], Figs. 1, 7, 9). But Inoue is silent that the processor (120-150) is configured to
select a region of the active optical element (all or part of lenses of 160) in which the predefined optical power is to be produced, based on the gaze direction; and
detect an overlap of less than a predefined percent between the selected region and a current region of the active optical element in which a current optical power is being produced; wherein the at least one processor is further configured to, when the overlap between the selected region and the current region is less than the predefined percent, drive the controlling means to change the at least one of the one or more optical parameters of the active optical element without any delay, by producing the predefined optical power in the selected region.
However, Peloux teaches, in the same field of invention, an optical device worn by a user and including an active programmable lens (see Figs. 1-3, abstract, paragraphs [01, 15-44, 49-63, 93-114, 117-126], where the optical function of the active programmable lenses 20 of device 10 is also adjusted during an eye blink), and further teaches that the processor (as controller 30 with processor 32 and memory 34, paragraphs [49-60]) is configured to
select a region of the active optical element in which the predefined optical power is to be produced, based on the gaze direction (as selected region or part of lenses of 20, e.g. for part of wearer’s field of view, for central vision, for peripheral vision of the wearer given vision data (vision distance, direction of wearer) sensed/measured by vision sensor 22, e.g. paragraphs [49-60]); and
detect an overlap of less than a predefined percent between the selected region and a current region of the active optical element in which a current optical power is being produced (i.e. as 10, with 30, controls and activates regions of active programmable lens 20, since 30 stores vision data, executable instructions, and predetermined optical states, and as 30 adjusts the optical power and/or the position of the optical function of the active programmable lens 20 with the vision direction of the eye of the wearer, see paragraphs [33-35, 50-57, 99-113, 117-127]); wherein the at least one processor (30, 32, 34) is further configured to, when the overlap between the selected region and the current region is less than the predefined percent, drive the controlling means to change the at least one of the one or more optical parameters of the active optical element without any delay, by producing the predefined optical power in the selected region (i.e. given the functionality of 30 directing and controlling 20, e.g. between different positions/regions, where the adjustable optical power is re-evaluated at this time, depending on the average of the vision distances between the two periods of time, i.e. comparing two directions and their overlap, where passing from one optical power to another may be instantaneous, or at least at the speed of the optical device, see e.g. paragraphs [107-111, 117-127, 33-35], and providing the device with an active lens with the ability to move the optical function and/or the optical center of the active programmable lens with the vision direction of said at least one eye of the wearer).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt and configure the controller/processors of the multifocal lenses of Inoue to include a configuration that can select a region of the active optical element in which the predefined optical power is to be produced, based on the gaze direction, and control the optical power in the selected region given its overlap with the current region, i.e. corresponding to different viewing positions, according to the teachings of Peloux, in order to provide the active multifocal lens device with the ability to move the optical function and/or the optical center of the active programmable lens with the vision direction of said at least one eye of the wearer (see Peloux, paragraphs [107-111, 33-35]).
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Inoue et al. (hereafter Inoue) US 20140347623 A1 in view of Barraza-Bernal et al. (hereafter Barraza-Bernal) US 20190209005 A1.
Regarding claim 7, Inoue teaches (see Figs. 1-11) that when detecting the trigger, the at least one processor (e.g. as conditioned on gazing point distance, direction, focal length, and/or time duration, viewability index, 150 controls the focal length of multifocal lens 160, paragraphs [42-48, 65-72], Figs. 1, 7, 9) is configured to:
determine a gaze direction of a dominant eye of the user and a gaze direction of a non-dominant eye of the user (i.e. as 120 measures and determines the gazing point distance, and thus the direction of the gazing point of the eyes of the user, where one eye can be dominant and the other non-dominant, or both eyes when there is no clear eye dominance, paragraphs [42-48]; see also the treatment due to the 112(b) issues noted above);
wherein the at least one processor (120-150) is further configured to, when the given criteria is satisfied, drive the controlling means (150, controlling 160) to change the at least one of the one or more optical parameters of the active optical element (104A-B, 200, 304A-B) during the eye blink (i.e. as 150 with inputs from 120,130,140 drives 160 e.g. changing focal length during the eye blink, paragraphs [41-47, 65-76], Figs. 1,9,7).
But Inoue is silent that the processor (120-150) is configured to detect when a given criteria is satisfied, wherein the given criteria is satisfied when at least one of the following is true: (i) the gaze direction of the dominant eye and the gaze direction of the non-dominant eye has not converged for at least a predefined time period,
(ii) the gaze direction of the dominant eye and the gaze direction of the non-dominant eye has not converged within a predefined error margin from each other; and when the given criteria is satisfied, drive the controlling means, by producing, at a dominant-eye optical element, an optical power that is different from a predefined dominant-eye optical power corresponding to the dominant eye, whilst producing, at a non-dominant-eye optical element, a predefined non-dominant-eye optical power corresponding to the non-dominant eye.
However, Barraza-Bernal teaches, in the same field of invention, an apparatus for assisting in establishing a correction for correcting heterotropia or heterophoria (see Figs. 1-4, title, abstract, e.g. paragraphs [58-66], also with measuring device 1, analyzing unit 21, simulation device 41 with displays and refraction correcting devices 49, 51, paragraphs [68-73]), and further teaches that the processor (21) is configured to detect when a given criteria is satisfied, wherein the given criteria is satisfied when at least one of the following is true: (i) the gaze direction of the dominant eye and the gaze direction of the non-dominant eye has not converged for at least a predefined time period,
(ii) the gaze direction of the dominant eye and the gaze direction of the non-dominant eye has not converged within a predefined error margin from each other (i.e. as such criteria/error margin(s) are given/present in deviations data including diagnostic data d, classification data c, magnitude data m, and prescription data p, determining whether or not heterotropia or heterophoria is present, e.g. paragraphs [60-66], Figs. 1-3); and when the given criteria is satisfied, drive the controlling means (21 drives 41, e.g. paragraphs [58-66, 67-70, 73]), by producing, at a dominant-eye optical element (e.g. setting, changing the refractive power of one of refractive devices 51A, B, e.g. paragraphs [66-70]), an optical power that is different from a predefined dominant-eye optical power corresponding to the dominant eye (i.e. refraction or defocus), whilst producing, at a non-dominant-eye optical element (e.g. setting, changing the refractive power of the other of refractive devices 51A, B, e.g. paragraphs [66-70]), a predefined non-dominant-eye optical power corresponding to the non-dominant eye (i.e. as viewing device 41 changes and sets the refractive corrective power for each of refraction correction devices 51A, 51B given data d, c, m, and p outputted to interface 43 of simulation device 41, with lenses 51A, B including e.g. a wave front manipulator such as a liquid lens, an Alvarez element, or the like, thereby correcting the line of sight of the affected eye/eyes and providing the necessary refraction correction, see e.g. paragraphs [26-27, 66-70]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt and configure the controller/processors of the multifocal lenses of Inoue to include criteria that can recognize convergence disorders, including heterophoria and heterotropia, and provide corrective optical powers to the affected eye or eyes according to the teachings of Barraza-Bernal, in order to correct the line of sight of the affected eye or eyes and provide the necessary refraction correction (see Barraza-Bernal, e.g. paragraphs [26-27, 66-70]).
Claims 8-10 are rejected under 35 U.S.C. 103 as being unpatentable over Inoue et al. (hereafter Inoue) US 20140347623 A1 in view of Lewis WO 2023096713 A1 (of record, see Information disclosure statement dated 12/02/2024).
Regarding claim 8, Inoue teaches (see Figs. 1-11) that the at least one processor (e.g. as conditioned on gazing point distance, direction, focal length, and/or time duration, viewability index, 150 controls the focal length of multifocal lens 160, paragraphs [42-48, 65-72], Figs. 1, 7, 9) is further configured to drive the controlling means (150, controlling 160) to change the at least one of the one or more optical parameters of the active optical element (104A-B, 200, 304A-B) during the eye blink (i.e. as 150 with inputs from 120, 130, 140 drives 160, e.g. changing the focal length during the eye blink, paragraphs [41-47, 65-76], Figs. 1, 9, 7).
But Inoue is silent about the apparatus comprising a plurality of light sensors, and that the processor (120-150) is further configured to detect when a light intensity of light signals sensed by the plurality of light sensors exceeds a predefined threshold intensity,
and when the light intensity of the light signals sensed by the plurality of light sensors exceeds the predefined threshold intensity, drive the controlling means (150) to change by adjusting at least one of: a transparency, a color, of at least a portion (160) of the active optical element.
However, Lewis teaches, in the same field of invention, personalized optics (e.g. active glasses, eyewear 100, with correction and enhancement regions, see Figs. 1-4, 9-12, paragraphs [12-29, 63-70, 78-83, 88-96, 183-197, 209-223]) and further teaches a plurality of light sensors (sensors 123 including ambient sensors 123b, paragraphs [183-197], e.g. Figs. 1A-B) and that the processor (e.g. computing device 121, e.g. paragraphs [183-197]) is further configured to detect when a light intensity of light signals sensed by the plurality of light sensors exceeds a predefined threshold intensity (i.e. as 121 receives signals from 123, 123b, for controlling correction or enhancement of lenses 112, including for changes in ambient lighting, illuminance, e.g. paragraphs [183-197, 200, 209-223]),
and when the light intensity of the light signals sensed by the plurality of light sensors exceeds the predefined threshold intensity, drive the controlling means (121) to change by adjusting at least one of: a transparency, a color, of at least a portion of the active optical element (i.e. as 121 can adjust shading to portion(s) of lenses 112, in response to signals received and determined shading correction/enhancement, see paragraphs [209-223], thus providing distinct corrections or enhancements to vision in a region where the wearer is looking).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt and configure the controller/processors and multifocal lenses of Inoue to include ambient sensors sensing the light intensity and a processor configured to receive signals sensed by the plurality of light sensors and, when the predefined threshold intensity is exceeded, drive the controlling means (150) to change the transparency/color, i.e. shading, of at least a portion of the active lenses according to the teachings of Lewis, in order to provide distinct corrections or enhancements to vision in a region where the wearer is looking (see e.g. Lewis, paragraphs [193-196, 200]).
Regarding claim 9, the Inoue-Lewis combination teaches the invention as set forth above, and Inoue teaches (see Figs. 1-11) that the at least one processor (120-150 as modified with Lewis) is further configured to:
determine a gaze direction of a given eye of the user (sensors 110, 170 sensing eye movement, gaze point sensing, eye blink movement, paragraphs [41-47, 65-76], Figs. 1, 9, 7, and with the modifications of Lewis for vision enhancement/correction); and select the portion of the active optical element (160, i.e. portion of 112) in which the at least one of: the transparency, the color, is to be adjusted, based on the gaze direction (i.e. 160, due to the combination, has regions, e.g. 131 as in lenses 112, that are controllable and adjustable, see paragraphs [200-203]).
Regarding claim 10, Inoue teaches (see Figs. 1-11) that the at least one processor (e.g. as 120-150 controls the focal length of multifocal lens 160, paragraphs [42-48, 65-72], Figs. 1, 7, 9) is configured to:
drive the controlling means (106, 306A-B) to change the at least one of the one or more optical parameters of the active optical element (104A-B, 200, 304A-B) when the predicted time duration is longer than a predefined time threshold (i.e. as the elapsed time duration is longer than the (pre)determined time duration, which is the predetermined time limit set by 140, see e.g. paragraphs [47-48, 53-57, 67-68]), i.e. drive the controlling means (106, 150) to change the at least one of the one or more optical parameters of the active optical element without a delay (i.e. as 150 drives 160 to change the focal length, Figs. 1, 7, 9, paragraphs [69-71], where it is noted that the above-mentioned viewability index is calculated for the new gazing distance for the present optical power and a calculated optimal optical power; a difference between these two values is then evaluated and used to determine the "time duration", paragraphs [41-53]; as the time duration is the estimated time during which the user can be expected to tolerate the sub-optimal optical power, the higher D, the shorter the time duration; the time duration corresponds to the duration for which the system waits for the user to blink or make a saccade movement to perform the change of optical power, and when the duration has elapsed without such a blink being detected, the change is performed during normal vision; hence, for small differences between the two optical powers (as inverses of the focal lengths), the system waits for a blink, whereas for large differences the system switches without delay, paragraphs [53-56, 59-63, 70-71]).
But Inoue is silent that the processor (120-150) is configured to determine an average frequency at which previous eye blinks of the user have occurred; and predict a time duration after which a next eye blink is expected to occur, based on the average frequency and time elapsed since a last eye blink of the user.
However, Lewis teaches, in the same field of invention, personalized optics (e.g. active glasses, eyewear 100, with correction and enhancement regions, see Figs. 1-4, 9-12, paragraphs [12-29, 63-70, 78-83, 88-96, 183-197, 209-223]) and further teaches that the processor (computing device 121, e.g. paragraphs [183-197]) is configured to determine an average frequency at which previous eye blinks of the user have occurred (i.e. as digital eyewear 100, 121 is responsive to and uses the parameter of blink rates of the wearer, given his/her medical condition and activity, and eyewear 100 with 121 may determine the actual values of such or other measures, and also compare them with a baseline "normal" rate for the wearer or for ordinary patients (e.g. with respect to blink rate and related measures); 100 can also determine first and other derivatives of those values, first-order and other statistical measures of those values, correlations of pairs of those values, and/or determine medical information with respect to such values, see specifically paragraphs [12, 131, 208, 241]);
and predict a time duration after which a next eye blink is expected to occur, based on the average frequency and the time elapsed since a last eye blink of the user (i.e. given that a statistical measure of the blink rate provides an expected time between blink events, and given that eyewear 100 can, with respect to one or more medical conditions, attempt to predict those medical conditions (or activities) and also monitor those medical conditions (as they begin, proceed, finish, end, or recur), see paragraphs [241, 243]; hence eyewear 100 provides for prediction of the next eye blink based on average blink rates, while accounting for elapsed time, see paragraphs [12, 131, 208, 241]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt and configure the controller/processors and multifocal lenses of Inoue to use blink rates as a statistical measure to predict the time interval for the next predicted eye blink according to the teachings of Lewis, in order to provide distinct corrections or enhancements to vision and obtain medical information with respect to those values, and generally to enhance the capability of the eyewear to dynamically adjust its effect on viewing to match the combination of the wearer, the object or scene being viewed, and possibly other conditions (see e.g. Lewis, paragraphs [200, 241, 243, 11]).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Lewis US 12210230 B1 also teaches features of the instant invention (see e.g. Figs. 1-4 and their descriptions).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MARIN PICHLER whose telephone number is (571)272-4015. The examiner can normally be reached Monday-Friday 8:30am -5:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Thomas K Pham can be reached at (571)272-3689. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MARIN PICHLER/Primary Examiner, Art Unit 2872