DETAILED ACTION
This Office action is responsive to the amendment filed 10/15/2025. Claims 1-9 remain pending and under examination.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Election/Restrictions
Applicant’s election without traverse of Group I in the reply filed on 3/21/2025 is acknowledged.
Claims 10-20 were withdrawn from further consideration pursuant to 37 CFR 1.142(b) as being drawn to a nonelected invention, there being no allowable generic or linking claim, and were subsequently cancelled in the amendment filed 10/15/2025.
Claims 1-9 remain pending and under examination.
Information Disclosure Statement
No information disclosure statement (IDS) has been filed in this application. Applicant is reminded of the duty under 37 CFR 1.56 to disclose information known to be material to patentability.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
No claim limitations in this application are interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claim(s) 1-5 and 8-9 are rejected under 35 U.S.C. 103 as being unpatentable over Nagashima et al (WO 2005046464) in view of Torch (US Pub No. 20010028309).
In regard to Claim 1, Nagashima et al disclose a device for measuring eye movement in a human subject, the device comprising:
a housing 11 – mask portion, best seen in Figure 1-2 (0013);
at least one stimulator 13 mounted to the housing, the at least one stimulator configured to provide stimulus to one or both eyes of the subject – “the light stimulating means 13 are arranged so as to face the eyeball of one eye” (0013 and subsequent unnumbered paragraphs);
a sensor – CCD camera 15 – mounted to the housing, the camera configured to collect information related to movement of one or both eyes of the subject – forms part of pupil imaging means 12, best seen in Figure 1-2;
a processing unit – control analysis means 14; and
a user interface configured to control the at least one stimulator and display information collected by the camera – “The control analysis means 14 which controls the pupil imaging means 12 and the light stimulus means 13 and performs arithmetic analysis on the pupil miosis and mydriasis based on the image taken by the pupil imaging means 12,” “The imaging and analysis results are displayed on a display device as needed,” best seen in Figure 3.
It is submitted that the control analysis means necessarily includes a user interface, as such interfaces are well-known, routine, and conventional in the art.
However, Nagashima et al do not expressly disclose the processing unit configured to define a vertical axis and track an upper eyelid tracking point of at least one of the one or both eyes of the subject along the vertical axis based on the information related to movement of one or both eyes of the subject received from the camera.
Torch teaches that it is well-known in the art to provide an analogous device for measuring eye movement in a human subject comprising a processing unit 540 configured to define a vertical axis (due to the vertical linear arrangement of sensors 533 – “a linear array 530 of emitters 532 and sensors 533 may be provided, preferably in a vertical arrangement” 0099), best seen in Figure 10D, and
track an upper eyelid tracking point 302 – which is necessarily tracked on the moving upper eyelid, best seen in Figure 10D – of at least one of the one or both eyes of the subject along the vertical axis based on sensed information related to movement of one or both of the eyes of the subject received from sensors 533 – “the CPU 540 may cycle through the sensors 533 in the array 530 and sequentially process the signal from each of the sensors 533, similar to the processors previously described… each sensor… may detect movement of the eyelid 302 past a particular portion of the eye 300, e.g., to measure PERCLOS, as shown in FIG. 12A” (0100); “the output signals from the sensors 553 may be processed to measure the percentage of pupil coverage of the eyelid 302, for example, due to partial eye closure” (0101).
Torch also teaches that camera 830 can monitor a position of a portion of the eye, such as the pupil, along a vertical axis, best seen in Figure 16 – “FIG. 16 shows an exemplary video output from a camera included in a system having twenty emitters disposed in a vertical arrangement. The camera may detect twenty discrete regions of light arranged as a vertical band. The camera may also detect a "glint" point, G, and/or a moving bright pupil, P. Thus, the movement of the pupil may be monitored in relation to the glint point, G, and/or in relation to the vertical band 1-20… Thus, the processor may determine the location of the pupil in terms of orthogonal coordinates (e.g., x-y or angle-radius) relative to the reference frame 850” (0121-0122).
Thus, Torch teaches tracking an upper eyelid tracking point of at least one eye along a vertical axis based on sensed information related to movement of one or both of the eyes, the information taken from sensors 533. Torch also teaches that an analogous camera can detect movement of one or both of the eyes, such as for measuring drowsiness (abst).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Nagashima et al such that the processing unit is configured to define a vertical axis and track an upper eyelid tracking point of at least one of the one or both eyes of the subject along the vertical axis based on the information related to movement of one or both eyes of the subject, as taught by Torch, in order to effectively determine eyelid movement of the subject as desired, such as for monitoring drowsiness. In the combination, it would have been obvious to have the information related to movement of one or both eyes of the subject received from the camera of Nagashima et al, as taught by Torch, as an effective mechanism to monitor the movement of one or both of the eyes in the drowsiness determination.
2. Nagashima et al disclose the device of claim 1, further comprising a seal – periphery of the goggle – that defines a cavity within which the camera 15 and the at least one stimulator 13 are disposed, best seen in Figure 1-2, the seal configured to conform to the shape and contours of the subject's face to minimize ambient light within the cavity – “the peripheral part is in close contact with the skin of the face, blocking light from entering the hollow interior from the outside… it is possible to effectively track the pupil-to-light reaction accompanying the light emission of the light stimulating means 13. This eliminates the need for a darkroom for dark adaptation.”
3. Nagashima et al disclose the device of claim 1, further comprising a first stimulator mounted on a left side of the housing and a second stimulator mounted on a right side of the housing, wherein each of the first and second stimulators are configured to provide stimulus to a respective eye of the subject – fixation light emitting diode 20 is mounted on a left side and light stimulating means 13 is mounted on a right side, best seen in Figure 1-2.
4. Nagashima et al disclose the device of claim 3, further comprising a divider – defined as the structure separating the left goggle from the right goggle – configured to preclude stimulation provided by one of the first and second stimulators from affecting or stimulating the eye closest to the other stimulator, best seen in Figure 1-2.
5. Nagashima et al disclose the device of claim 4, further comprising a seal that defines a cavity (hollow interior) within which the camera 15 and the first and second stimulators 13 are disposed, best seen in Figure 2 – “the peripheral part is in close contact with the skin of the face, blocking light from entering the hollow interior from the outside,”
wherein both the seal and the divider are configured to conform to the shape and contours of the subject's face, best seen in Figure 1-2.
8. Nagashima et al disclose the device of claim 1, wherein the camera 15 collects information related to a blink reflex of one or both eyes of the subject (abst).
9. Nagashima et al disclose the device of claim 1, wherein the camera 15 collects information related to pupillary response of the subject (abst).
Claims 1 and 6-9 are rejected under 35 U.S.C. 103 as being unpatentable over Kopke (US Pub No. 20050018139) in view of Torch (US Pub No. 20010028309).
In regard to Claim 1, Kopke discloses a device for measuring eye movement in a human subject, the device comprising:
a housing 1 – shown in Figure 1, making up a headworn mask (0017);
at least one stimulator – TFT screen – mounted to the housing, the at least one stimulator configured to provide stimulus to one or both eyes of the subject – “The computer may have in its memory a selection of standard tests, including pictures and geometrical figures that can be transmitted to the displays and shown to the test subject” (0017);
a video camera mounted to the housing, the camera configured to collect information related to movement of one or both eyes of the subject – “The mask further comprises two miniature video cameras, one located at each of the test subject's eyes. The purpose of these cameras is to record the eye response to the picture displayed on the TFT display” (0018);
a processing unit – computer, best seen in Figure 1 (0017-0020); and
a user interface configured to control the at least one stimulator and display information collected by the camera, best seen in Figure 5, which shows a conventional computer with keyboard and other related user interface.
However, Kopke does not expressly disclose the processing unit configured to define a vertical axis and track an upper eyelid tracking point of at least one of the one or both eyes of the subject along the vertical axis based on the information related to movement of one or both eyes of the subject received from the camera.
Torch teaches that it is well-known in the art to provide an analogous device for measuring eye movement in a human subject comprising a processing unit 540 configured to define a vertical axis (due to the vertical linear arrangement of sensors 533 – “a linear array 530 of emitters 532 and sensors 533 may be provided, preferably in a vertical arrangement” 0099), best seen in Figure 10D, and
track an upper eyelid tracking point 302 – which is necessarily tracked on the moving upper eyelid, best seen in Figure 10D – of at least one of the one or both eyes of the subject along the vertical axis based on sensed information related to movement of one or both of the eyes of the subject received from sensors 533 – “the CPU 540 may cycle through the sensors 533 in the array 530 and sequentially process the signal from each of the sensors 533, similar to the processors previously described… each sensor… may detect movement of the eyelid 302 past a particular portion of the eye 300, e.g., to measure PERCLOS, as shown in FIG. 12A” (0100); “the output signals from the sensors 553 may be processed to measure the percentage of pupil coverage of the eyelid 302, for example, due to partial eye closure” (0101).
Torch also teaches that camera 830 can monitor a position of a portion of the eye, such as the pupil, along a vertical axis, best seen in Figure 16 – “FIG. 16 shows an exemplary video output from a camera included in a system having twenty emitters disposed in a vertical arrangement. The camera may detect twenty discrete regions of light arranged as a vertical band. The camera may also detect a "glint" point, G, and/or a moving bright pupil, P. Thus, the movement of the pupil may be monitored in relation to the glint point, G, and/or in relation to the vertical band 1-20… Thus, the processor may determine the location of the pupil in terms of orthogonal coordinates (e.g., x-y or angle-radius) relative to the reference frame 850” (0121-0122).
Thus, Torch teaches tracking an upper eyelid tracking point of at least one eye along a vertical axis based on sensed information related to movement of one or both of the eyes, the information taken from sensors 533. Torch also teaches that an analogous camera can detect movement of one or both of the eyes, such as for measuring drowsiness (abst).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Kopke such that the processing unit is configured to define a vertical axis and track an upper eyelid tracking point of at least one of the one or both eyes of the subject along the vertical axis based on the information related to movement of one or both eyes of the subject, as taught by Torch, in order to effectively determine eyelid movement of the subject as desired, such as for monitoring drowsiness. In the combination, it would have been obvious to have the information related to movement of one or both eyes of the subject received from the camera of Kopke, as taught by Torch, as an effective mechanism to monitor the movement of one or both of the eyes in the drowsiness determination.
6. Kopke discloses the device of claim 1, further comprising a screen mounted on the housing, the screen configured to display instructions to the subject – “instructions are given during the vision screening to the test subject based on the eye response of the test subject” (0011-0012, 0019).
7. Kopke discloses the device of claim 1, further comprising a screen mounted on the housing, the screen configured to provide stimulus to the subject – TFT screen (0017).
8. Kopke discloses the device of claim 1, wherein the camera collects information related to a blink reflex of one or both eyes of the subject (0018).
9. Kopke discloses the device of claim 1, wherein the camera collects information related to pupillary response of the subject (0018).
Response to Arguments
Applicant’s arguments with respect to claim(s) above have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Torch is set forth to teach the new limitations.
The previous rejections under 35 U.S.C. 112 are moot in light of applicant’s amendments.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Huong NGUYEN whose telephone number is (571)272-8340. The examiner can normally be reached 10 am - 6 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jennifer Robertson can be reached at (571)272-5001. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/H.Q.N/Examiner, Art Unit 3791
/JENNIFER ROBERTSON/ Supervisory Patent Examiner, Art Unit 3791