Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
This Office action is in response to the communication filed 6/28/2023.
Amendment to the abstract, filed 6/28/2023, is acknowledged and accepted.
Amendments to claims 1-22, filed 6/28/2023, are acknowledged and accepted.
Information Disclosure Statement
The information disclosure statements submitted on 12/4/2024 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Specification
35 U.S.C. 112(a) or pre-AIA 35 U.S.C. 112 requires the specification to be written in “full, clear, concise, and exact terms.” The disclosure is objected to because the specification is replete with informalities and terms which are not clear, concise, and exact. The specification should be revised carefully in order to comply with 35 U.S.C. 112(a) or pre-AIA 35 U.S.C. 112. Examples of informalities and unclear, inexact, or verbose terms used in the specification are listed as follows:
On pg. 2, line 12, “to which reference is made to” should read “to which reference is made” or “, which reference is made to” to remove an improper/redundant “to”.
On pg. 4, line 12, "after detection of trigger" should read "after detection of the trigger" to include a missing article.
On pg. 8, lines 23 and 26, “the user’s eyes shifts” should read “the user’s eyes shift” for proper subject-verb agreement. The same applies to “the user’s eyes, shifts” on pg. 9, lines 9-10, which also contains an extra/misplaced comma.
On pg. 9, line 9, the specification reads “for example, such as”. Having both “for example,” and “such as” is redundant/improper. Applicant should keep only one of these phrases.
On pg. 11, line 16, “focussing" should read “focusing”. While “focussing" is proper in other English dialects, Applicant uses the American “focusing” elsewhere in the specification (e.g. pg. 1, line 16), and should be consistent in their spelling.
On pg. 31, line 30, “misses to converge” is improper and should read “fails to converge” or similar.
On pg. 43, line 35, and pg. 44, line 8, “controlling means” should read “the controlling means”.
Examiner notes that this list is not exhaustive, and reiterates that the specification should be revised carefully in order to comply with 35 U.S.C. 112(a). Applicant’s specification should be provided in clear and proper idiomatic English and contain no new matter.
Claim Objections
Claims 7-9, 14-16, and 18-20 are objected to because of the following informalities:
In claim 7, line 11, and claim 18, line 10, “has not converged” should read “have not converged”.
In claim 8, line 9, and claim 19, line 8, “at least one of: a transparency, a colour, of at least a portion…” is ungrammatical and should read “at least one of a transparency or a colour of at least a portion…”.
In claim 14, line 8, a comma is missing before the word “when”, which leaves the closing comma of the parenthetical statement (“when said difference…”) unpaired.
Claims not specifically addressed in the objections above inherit the objections of the claim from which they depend. Appropriate correction is required.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-5, 11-16, and 22 are rejected under 35 U.S.C. 102(a)(1) and 102(a)(2) as being anticipated by Inoue et al. (US 20140347623 A1, hereinafter “Inoue”).
Regarding claim 1, Inoue discloses (see FIGs. 1 and 7, ¶s 41-49 and 64-71) an optical apparatus (viewer 10) comprising:
eye-tracking means (sensor 110 with eye movement detector 170);
an active optical element (multifocal lens 160) per eye;
controlling means (controller 150) for controlling one or more optical parameters (i.e. focal length) of the active optical element (multifocal lens 160);
Further regarding claim 1, and now also regarding claim 12, Inoue discloses:
at least one processor (gazing point distance measuring section 120, focal length selection section 130, controller 150) configured to:
detect a trigger (“change in gazing point distance”) for changing at least one of the one or more optical parameters (i.e. focal length) of the active optical element (multifocal lens 160) (see ¶s 65-66: “gazing point distance measuring section 120 detects a change in gazing point distance of the user based on the data acquired by the sensor 110 (S11). If the change in gazing point distance is detected (YES at S11)... focal length selection section 130 selects a new focal length based on a new gazing point distance measured… at step S11”);
process eye-tracking data, collected by the eye-tracking means (sensor 110 with eye movement detector 170), to detect a beginning of a saccade of a user's eye (see ¶ 70: “At step S17, the controller 150 refers to the eye movement detector 170, and determines whether a blink or saccadic eye movement of the user is detected or not”); and
drive the controlling means (controller 150) to change the at least one of the one or more optical parameters (i.e. focal length) of the active optical element (multifocal lens 160) during the saccade (see ¶s 70-71: “If a blink or saccadic eye movement is detected (YES at S17), the process proceeds to step S18”, “At step S18, the controller 150 outputs a control signal for switching the focal length of the multifocal lens 160”).
Regarding claims 2 and 13, Inoue discloses the optical apparatus of claim 1 and the method of claim 12.
Inoue further discloses (see FIGs. 1 and 7, ¶s 41-49 and 64-71) wherein the one or more optical parameters (i.e. focal length) of the active optical element (multifocal lens 160) comprise at least one of:
at least one optical power to be produced (Examiner notes that power and focal length are inversely related),
a position of at least one region of the active optical element (multifocal lens 160) in which the at least one optical power is to be produced,
a shape (to which focal length corresponds) of the at least one region,
an optical centre of the at least one region,
a shape (corresponding to focal length; see ¶ 42 regarding focus variation by applying pressure to a liquid lens) of the active optical element (multifocal lens 160),
a profile (i.e. focal length) of the active optical element (multifocal lens 160),
a transparency of at least a portion of the active optical element,
a blur of at least a portion of the active optical element,
a colour of the active optical element,
whether an astigmatism correction mode of the active optical element is switched on or off,
whether a prismatic correction mode of the active optical element is switched on or off.
Regarding claims 3 and 14, Inoue discloses the optical apparatus of claim 1 and the method of claim 12.
Inoue further discloses (see FIGs. 1 and 7, ¶s 41-49 and 64-71) wherein when detecting the trigger (i.e. “change in gazing point distance”), the at least one processor (gazing point distance measuring section 120, focal length selection section 130, controller 150) is configured to:
determine an optical depth (“gazing point distance”) at which the user is fixating;
select a predefined optical power (or focal length), from amongst a plurality of predefined optical powers (or focal lengths), based on the optical depth (“gazing point distance”) at which the user is fixating (note the following excerpts – ¶ 45: “focal length selection section 130 selects a focal length to be set among a plurality of predetermined focal lengths that can be set”; ¶ 66: “focal length selection section 130 selects a new focal length based on a new gazing point distance measured by the gazing point distance measuring section 120 at step S11, in accordance with a predetermined relationship between the gazing point distance and the focal length”); and
detect a difference between the predefined optical power (or focal length) and a current optical power (or focal length) being produced by the active optical element (multifocal lens 160),
wherein the at least one processor (gazing point distance measuring section 120, focal length selection section 130, controller 150) is further configured to, when said difference is smaller than a predefined difference, drive the controlling means (controller 150) to change the at least one of the one or more optical parameters (i.e. focal length) of the active optical element (multifocal lens 160) during the saccade, by performing a transition from the current optical power (or focal length) to the predefined optical power (¶s 70-71: “If a blink or saccadic eye movement is detected (YES at S17), the process proceeds to step S18”, “At step S18, the controller 150 outputs a control signal for switching the focal length of the multifocal lens 160”).
(Note, regarding items C and D above, that for different gaze point distances, Inoue assigns a “viewability” index to different focal lengths. And when changes to gaze point distance occur, Inoue makes a determination of when focal length changes should also be made (i.e. “without a wait time”, “at the time of… a saccadic eye movement”, etc.) by evaluating the resulting viewability changes (i.e. as a functional equivalent to the difference) against some preset differentiators/criteria (analogous to the predefined difference). See also FIG. 6 with ¶s 59-63.)
Regarding claims 4 and 15, Inoue discloses the optical apparatus of claim 3 and the method of claim 14.
Inoue further discloses wherein the at least one processor (gazing point distance measuring section 120, focal length selection section 130, controller 150) is configured to drive the controlling means (controller 150) to perform the transition from the current optical power (or focal length) to the predefined optical power (or focal length) without a delay, when said difference is not smaller than the predefined difference. (See ¶ 60 describing a scenario where the change in focal length (or equivalently, viewability index) is sufficiently large, such that it is preferable to perform focal length switching “without a wait time”.)
Regarding claims 5 and 16, Inoue discloses the optical apparatus of claim 3 and the method of claim 14.
Inoue further discloses wherein the predefined difference lies in a range of 0.25 dioptre (i.e. a focal length of 4 meters = 400 cm) to 3 dioptre (0.333… meters = 33.3… cm). (See FIG. 6, ¶s 59-63; Inoue describes three scenarios where gazing point distance moves from:
A1 (18 cm) → A2 (2.8 m), corresponding to a focal length difference of (300-20) cm = 280 cm (per FIG. 2) that occurs “without a wait time”
B1 (28 cm) → B2 (40 cm), corresponding to a focal length difference of (60-20) cm = 40 cm (per FIG. 2) that occurs “at the time of… a saccadic eye movement”
C1 (70 cm) → C2 (1.5 m), corresponding to a focal length difference of (300-60) cm = 240 cm (per FIG. 2) that occurs “at the time of… a saccadic eye movement”
Clearly, therefore, a predefined difference lies somewhere between 240 cm and 280 cm, which further lies within the claimed range of 33.3… cm to 400 cm.)
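For clarity, the dioptre-to-focal-length arithmetic relied upon above can be checked with a short illustrative script (not part of the record; the function name is merely illustrative, and the numeric values restate the claimed range endpoints and the FIG. 2 focal-length differences cited above):

```python
# Illustrative check of the dioptre <-> focal-length arithmetic above.
# Optical power D (in dioptres) and focal length f (in metres) satisfy D = 1/f.

def dioptres_to_cm(d):
    """Focal length in centimetres corresponding to an optical power in dioptres."""
    return 100.0 / d

# Claimed range endpoints: 0.25 dioptre and 3 dioptre.
assert dioptres_to_cm(0.25) == 400.0           # 0.25 D -> 4 m = 400 cm
assert abs(dioptres_to_cm(3) - 33.33) < 0.01   # 3 D -> 0.333... m = 33.3... cm

# Focal-length differences for Inoue's three scenarios (values per FIG. 2), in cm.
scenarios = {"A": 300 - 20, "B": 60 - 20, "C": 300 - 60}
assert scenarios == {"A": 280, "B": 40, "C": 240}

# Any threshold separating scenario C (240 cm, switched at a saccade) from
# scenario A (280 cm, switched without a wait time) lies inside the claimed
# 33.3...-400 cm range.
for threshold_cm in (250, 260, 270):
    assert 100.0 / 3 < threshold_cm < 400.0
```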
Regarding claim 11, Inoue discloses (see FIGs. 1 and 7, ¶s 41-49 and 64-71) an optical apparatus comprising:
eye-tracking means (sensor 110 with eye movement detector 170);
an active optical element (multifocal lens 160) per eye;
controlling means (controller 150) for controlling one or more optical parameters (i.e. focal length) of the active optical element (multifocal lens 160);
Further regarding claim 11, and now also regarding claim 22, Inoue discloses:
at least one processor (gazing point distance measuring section 120, focal length selection section 130, controller 150) configured to:
process eye-tracking data, collected by the eye-tracking means (sensor 110 with eye movement detector 170), to detect a beginning of a given saccade of a user's eye (see ¶ 70: “At step S17, the controller 150 refers to the eye movement detector 170, and determines whether a blink or saccadic eye movement of the user is detected or not”);
detect, during the given saccade, a trigger for changing at least one of the one or more optical parameters (i.e. focal length) of the active optical element (multifocal lens 160); and
drive the controlling means (controller 150) to change the at least one of the one or more optical parameters (i.e. focal length) of the active optical element (multifocal lens 160) during the given saccade.
(Regarding items Aii and Aiii above, see ¶s 70-71: “If a blink or saccadic eye movement is detected (YES at S17), the process proceeds to step S18”, “At step S18, the controller 150 outputs a control signal for switching the focal length of the multifocal lens 160”. Clearly, therefore, when the saccade occurs, controller 150 detects a trigger which further directs it to switch the focal length.)
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 6 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Inoue, as applied respectively to claims 1 and 12 above, and further in view of Peloux et al. (WO 2018193057 A1, hereinafter “Peloux”).
Regarding claims 6 and 17, Inoue discloses the optical apparatus of claim 1 and the method of claim 12.
Inoue further discloses (FIGs. 1, 7; ¶s 41-49, 64-71) wherein when detecting the trigger (i.e. “change in gazing point distance”), the at least one processor (gazing point distance measuring section 120, focal length selection section 130, controller 150) is configured to:
determine a gaze direction of a given eye of the user (note ¶ 44: “gazing point distance measuring section 120 measures the gazing point distance”, which implicitly requires determination of gaze direction), whilst determining an optical depth (“gazing point distance”) at which the user is fixating;
select a predefined optical power (or focal length), from amongst a plurality of predefined optical powers (or focal lengths), based on the optical depth (“gazing point distance”) at which the user is fixating (note the following excerpts – ¶ 45: “focal length selection section 130 selects a focal length to be set among a plurality of predetermined focal lengths that can be set”; ¶ 66: “focal length selection section 130 selects a new focal length based on a new gazing point distance measured by the gazing point distance measuring section 120 at step S11, in accordance with a predetermined relationship between the gazing point distance and the focal length”).
Inoue does not explicitly disclose that the processor is configured to:
select a region of the active optical element in which the predefined optical power is to be produced, based on the gaze direction; and
detect an overlap of less than a predefined percent between the selected region and a current region of the active optical element in which a current optical power is being produced;
wherein the at least one processor is further configured to, when the overlap between the selected region and the current region is less than the predefined percent, drive the controlling means to change the at least one of the one or more optical parameters of the active optical element without any delay, by producing the predefined optical power in the selected region.
Inoue and Peloux are related as being directed towards dynamic focus-adjustable lenses and systems/methods based on saccadic cues.
Peloux discloses that the processor (processor 34 of optical power controller 30) is configured to:
select a region of the active optical element (active programmable lens 20) in which the predefined optical power is to be produced, based on the gaze direction (“vision direction”); and
detect an overlap of less than a predefined percent between the selected region and a current region of the active optical element (active programmable lens 20) in which a current optical power is being produced;
wherein the at least one processor (processor 34 of optical power controller 30) is further configured to, when the overlap between the selected region and the current region is less than the predefined percent, drive the controlling means (optical power controller 30) to change the at least one of the one or more optical parameters (i.e. optical power/position) of the active optical element (active programmable lens 20) without any delay (“instantaneously, or at least at the speed of the optical device”), by producing the predefined optical power in the selected region.
(See pg. 6, line 15, to pg. 8, line 2; pg. 14, lines 11-24. Note the following excerpts –
pg. 7: “optical power controller 30 comprises a memory 32 and a processor 34”, “memory 32 is adapted to store vision data… [and] at least two predetermined optical power states…”, “processor 34 is configured to execute the stored computer executable instructions so as to control the optical power of the active programmable lens 20.”
pg. 14: “The computer executable instructions may comprise instructions for adjusting the optical power and/or the position of the optical function of the active programmable lens 20 with the vision direction of the eye of the wearer”, “The adjustable optical power is re-evaluated at this time [i.e. after a new region is selected based on a new gaze direction, which differs from a previous vision/gaze direction corresponding to the current region], depending, for example, on the average of the vision distances [i.e. a measure of the overlap between the two vision directions and their corresponding regions] between the two periods of time”, “Passing from one optical power to another may be done, for example, instantaneously, or at least at the speed of the optical device.”)
It would have therefore been obvious for one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Inoue with Peloux, in order to quickly/precisely measure and accommodate gaze direction, and to provide (locally) adjustable optical function/power comfortably and in a manner which reduces complexity, cost, and weight (Peloux pg. 2, lines 21-27; pg. 7, lines 20-29).
Claims 7 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Inoue, as applied respectively to claims 1 and 12 above, and further in view of Barraza-Bernal and Wahl (US 20190209005 A1, hereinafter “Barraza-Bernal”).
Regarding claims 7 and 18, Inoue discloses the optical apparatus of claim 1 and the method of claim 12.
Inoue further discloses (FIGs. 1, 7; ¶s 41-49, 64-71) wherein when detecting the trigger (i.e. “change in gazing point distance”), the at least one processor (gazing point distance measuring section 120, focal length selection section 130, controller 150) is configured to:
determine a gaze direction of a dominant eye of the user and a gaze direction of a non-dominant eye of the user. (See ¶ 44: “gazing point distance measuring section 120 measures the gazing point distance”, which implicitly requires determination of gaze direction.)
wherein the at least one processor (gazing point distance measuring section 120, focal length selection section 130, controller 150) is further configured to drive the controlling means (controller 150) to change the at least one of the one or more optical parameters (i.e. focal length) of the active optical element (multifocal lens 160) during the saccade (see ¶s 70-71: “If a blink or saccadic eye movement is detected (YES at S17), the process proceeds to step S18”, “At step S18, the controller 150 outputs a control signal for switching the focal length of the multifocal lens 160”).
Inoue does not disclose the at least one processor configured to:
detect when a given criteria is satisfied, wherein the given criteria is satisfied when at least one of the following is true:
the gaze direction of the dominant eye and the gaze direction of the non-dominant eye has not converged for at least a predefined time period,
the gaze direction of the dominant eye and the gaze direction of the non-dominant eye has not converged within a predefined error margin from each other; and
wherein the at least one processor is further configured to, when the given criteria is satisfied, drive the controlling means to change the at least one of the one or more optical parameters of the active optical element, by producing, at a dominant-eye optical element, an optical power that is different from a predefined dominant-eye optical power corresponding to the dominant eye, whilst producing, at a non-dominant-eye optical element, a predefined non-dominant-eye optical power corresponding to the non-dominant eye.
Inoue and Barraza-Bernal are related as being directed towards focus-adjustable liquid lenses and visual accommodations.
Barraza-Bernal discloses the at least one processor (analyzing unit 21) configured to:
detect when a given criteria is satisfied, wherein the given criteria is satisfied when at least one of the following is true:
the gaze direction of the dominant eye and the gaze direction of the non-dominant eye has not converged for at least a predefined time period,
the gaze direction of the dominant eye and the gaze direction of the non-dominant eye has not converged within a predefined error margin from each other (note: from ¶s 60-65, that measuring device 1 tracks gaze directions and transmits the data to analyzing unit 21, which then determines deviation data Δ encompassing diagnostic/classification/magnitude/prescription data (d/c/m/p), of either heterotropia or heterophoria. Clearly, therefore, various criteria and associated error margins must be involved in order to make such determinations); and
wherein the at least one processor (analyzing unit 21) is further configured to, when the given criteria is satisfied, drive the controlling means (simulation device 41) to change the at least one of the one or more optical parameters (i.e. refractive power) of the active optical element (lenses 51A/51B, or “wave front manipulators”), by producing, at a dominant-eye optical element (lens 51A or “wave front manipulator”), an optical power (“refraction” or “defocus”) that is different from a predefined dominant-eye optical power (“refraction” or “defocus”) corresponding to the dominant eye, whilst producing, at a non-dominant-eye optical element (lens 51B or “wave front manipulator”), a predefined non-dominant-eye optical power (“refraction” or “defocus”) corresponding to the non-dominant eye. (See ¶s 66-70: the aforementioned data d/c/m/p is transmitted from the analyzing unit 21 to a simulation device 41, which has a viewing device 47 with displays 49A/49B for each of the patient’s eyes. The viewing device also typically includes a refraction correcting device with lenses 51A/51B, or alternatively “wave front manipulators [such as] liquid lenses, Alvarez elements, or the like”, for providing refractive corrections to each eye.)
It would have therefore been obvious for one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Inoue with teachings of Barraza-Bernal, in order to correct refractive errors that accompany vergence disorders such as heterophoria and heterotropia (Barraza-Bernal ¶ 70), and also to treat the vergence disorders themselves (i.e. by optical penalization therapy).
Claims 8-10 and 19-21 are rejected under 35 U.S.C. 103 as being unpatentable over Inoue, as applied respectively to claims 1 and 12 above, and further in view of Lewis (WO 2023096713 A1).
Regarding claims 8 and 19, Inoue discloses the optical apparatus of claim 1 and the method of claim 12.
Inoue further discloses (see FIGs. 1 and 7, ¶s 41-49 and 64-71) the optical apparatus:
wherein when detecting the trigger (i.e. “change in gazing point distance”), the at least one processor (gazing point distance measuring section 120, focal length selection section 130, controller 150) is configured to detect,
wherein the at least one processor (gazing point distance measuring section 120, focal length selection section 130, controller 150) is further configured to drive the controlling means (controller 150) to change the at least one of the one or more optical parameters (i.e. focal length) of the active optical element (multifocal lens 160) during the saccade (see ¶s 70-71: “If a blink or saccadic eye movement is detected (YES at S17), the process proceeds to step S18”, “At step S18, the controller 150 outputs a control signal for switching the focal length of the multifocal lens 160”).
Inoue does not disclose the optical apparatus:
further comprising a plurality of light sensors, wherein when detecting the trigger, the at least one processor is configured to detect when a light intensity of light signals sensed by the plurality of light sensors exceeds a predefined threshold intensity,
wherein the at least one processor is further configured to, when the light intensity of the light signals sensed by the plurality of light sensors exceeds the predefined threshold intensity, drive the controlling means to change the at least one of the one or more optical parameters of the active optical element, by adjusting at least one of: a transparency, a colour, of at least a portion of the active optical element.
Inoue and Lewis are related as being directed towards dynamic focus-adjustable lenses and systems/methods based on saccadic cues.
Lewis discloses (see FIG. 1, ¶s 183-197, 209-212) the optical apparatus (eyewear 100):
further comprising a plurality of light sensors (sensors 123 including ambient sensors 123b), wherein when detecting the trigger, the at least one processor (of computing device 121) is configured to detect when a light intensity of light signals sensed by the plurality of light sensors (sensors 123 including ambient sensors 123b) exceeds a predefined threshold intensity,
wherein the at least one processor (of computing device 121) is further configured to, when the light intensity of the light signals sensed by the plurality of light sensors (sensors 123 including ambient sensors 123b) exceeds the predefined threshold intensity, drive the controlling means (computing device 121) to change the at least one of the one or more optical parameters (“shading”) of the active optical element (lenses 112(a,b)), by adjusting at least one of: a transparency, a colour, of at least a portion of the active optical element (note: shading impacts both transparency and color).
(Note the following excerpts –
¶s 184-185: "an example eyewear 100 can include... a computing device 121, such as possibly including a processor, ... one or more sensors 123, such as possibly including...ambient sensors 123b disposed to receive information about an environment near the wearer..."
¶s 209-210: "eyewear 100 can be responsive to environment features, such as: features of wearer's field of view (FOV), features of objects or scenes within the wearer's FOV, other features of the ambient environment...", "features of the wearer's field of view can include one or more of: ambient light, such as total luminance, luminance in a particular region thereof (such as in a region of peripheral vision), ..."
See also ¶s 213-223 describing various situations in which eyewear 100 dynamically adjusts shading in response to various triggers – e.g.:
"heavily shade infalling light" – in response to an “IFF signal” (¶ 213)
“remove any shading against sunlight”, "shading against sunlight", etc. – in response to “a signal [that] can indicate that the wearer is about to enter or to exit a dark tunnel” (¶s 215-217)
"adjust shading, or other effects, with respect to an object or with respect to a portion of the user's field of view (FOV)" – in response to a “signal [that] indicates that the wearer is about to view a display” (¶s 220-222)
Clearly, therefore, eyewear 100 is configured to detect when a light intensity of light signals exceeds a threshold intensity, i.e. for desirable lighting/viewing conditions.)
It would have therefore been obvious for one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Inoue with teachings of Lewis, in order to expand the functionality of the optical device (e.g. by providing aforementioned lighting corrections), and to achieve a more standard device that can meet a variety of needs and provide proper assistance in all circumstances (Lewis ¶s 1-11).
Regarding claims 9 and 20, modified Inoue discloses the optical apparatus of claim 8 and the method of claim 19.
Lewis further discloses (see FIG. 1, ¶s 183-197, 209-212) wherein the at least one processor (of computing device 121) is further configured to:
determine a gaze direction of a given eye of the user; and
select the portion of the active optical element (lenses 112(a,b)) in which the at least one of: the transparency, the colour, is to be adjusted (note: shading impacts both transparency and color), based on the gaze direction.
(Note the following excerpts –
¶ 200: "Similar to the lens regions 131..., each lens pixel 141 can be individually controlled, such as by the computing device 121... This can have the effect that the wearer’s vision can be corrected or enhanced for each direction where the wearer might look."
¶ 203: "the lenses 112 can include... alternative regions that can have their shading, or other effects, separately adjusted.")
Regarding claims 10 and 21, Inoue discloses the optical apparatus of claim 1 and the method of claim 12.
Inoue further discloses (see FIGs. 1 and 7, ¶s 41-49 and 64-71) wherein the at least one processor (gazing point distance measuring section 120, focal length selection section 130, controller 150) is configured to:
when the predicted time duration is longer than a predefined time threshold, drive the controlling means to change the at least one of the one or more optical parameters of the active optical element without a delay.
(Consider the following points:
Inoue discloses a time duration, corresponding to the predefined time threshold, that is (pre)determined based on change in gazing point distance (i.e. in ¶s 47, 67-68; ¶s 53-57 as well).
Inoue further states that it may be preferable to set this time duration such that saccadic eye movement can occur during this time (i.e. in ¶s 61-62; in other words, one may account for the predicted time to a next saccadic eye movement when determining the time duration) – at which point it is desirable to switch the focal length.
However, if no saccadic eye movement occurs in this time duration – i.e. if the predicted time to a next saccadic eye movement is longer than the predefined time threshold – then the focal length switches without further delay (¶s 48, 70-71).)
Inoue does not disclose wherein the at least one processor is configured to:
determine an average frequency at which previous saccades of the user's eye have occurred; and
predict a time duration after which a next saccade is expected to occur, based on the average frequency and time elapsed since a last saccade of the user's eyes.
Inoue and Lewis are related as being directed towards dynamic focus-adjustable lenses and systems/methods based on saccadic cues.
Lewis discloses (see FIG. 1, ¶s 183-197, 241-243) wherein the at least one processor (computing device 121) is configured to:
determine an average frequency at which previous saccades of the user's eye have occurred (¶ 241: “the eyewear 100 can determine whether the wearer is subject to... concerns, in response to the wearer's eye activity, such as... saccade rates”; “The eyewear 100 can determine the actual values of these or other measures… first and other derivatives of those values, first order and other statistical measures of those values, correlations of pairs of those values, medical information with respect to those values”);
predict a time duration after which a next saccade is expected to occur, based on the average frequency and time elapsed since a last saccade of the user's eyes (Examiner notes that – while any average frequency will directly encode the expected time between events and enable the prediction of a next event from elapsed time in a trivially subtractive manner – Lewis, regarding medical conditions associated with average/statistical saccadic rates cited above, further states in ¶ 243: “the eyewear 100 can, with respect to one or more medical conditions, attempt to predict those medical conditions, ... monitor those medical conditions (as they begin, proceed, finish, end, or recur)…”. Thus, Lewis’s disclosure enables the prediction of upcoming saccadic events – based on average saccadic rates, and in an active manner that accounts for elapsed time.)
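(For illustration only – not drawn from either reference – the Examiner's "trivially subtractive" observation above can be sketched as follows: the average saccade frequency encodes an expected inter-saccade interval (its reciprocal), from which the elapsed time since the last saccade is subtracted. The function name and parameter names below are hypothetical.

```python
def predict_next_saccade_delay(avg_saccade_freq_hz: float,
                               elapsed_since_last_s: float) -> float:
    """Predict the time remaining until the next expected saccade.

    The average frequency directly encodes the expected interval
    between saccades (1 / frequency); subtracting the time already
    elapsed since the last saccade yields the predicted remaining
    time, clamped at zero once the expected interval has passed.
    """
    expected_interval_s = 1.0 / avg_saccade_freq_hz
    return max(0.0, expected_interval_s - elapsed_since_last_s)

# e.g., an average rate of 2 saccades/second implies an expected
# interval of 0.5 s; with 0.3 s elapsed since the last saccade,
# the next saccade is predicted in roughly 0.2 s.
print(predict_next_saccade_delay(2.0, 0.3))
```

This merely illustrates the arithmetic relationship the Examiner relies upon; it is not asserted to be the method of either Inoue or Lewis.)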
It would therefore have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Inoue with the teachings of Lewis, in order to expand the functionality of the optical device, and to achieve a more standard device that can meet a variety of needs and provide proper assistance in all circumstances (Lewis ¶s 1-11).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to WAI-GA D. HO whose telephone number is (571)270-1624. The examiner can normally be reached Monday through Friday, 10AM - 6PM ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Stephone Allen, can be reached at (571) 272-2434. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/W.D.H./Examiner, Art Unit 2872
/STEPHONE B ALLEN/Supervisory Patent Examiner, Art Unit 2872