DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claim(s) 1, 2, and 4-6 are rejected under 35 U.S.C. 102(a)(1) and 102(a)(2) as being anticipated by Yamada (US 20180125406 A1).
Regarding claim 1, Yamada discloses a state estimation device comprising: a time-series data acquisition circuitry configured to acquire time-series data (hereinafter referred to as subject time-series data) related to a blinking of a subject during a predetermined task; and a state estimation circuitry configured to generate a state estimation result indicating a psychological/cognitive state of the subject from the subject time-series data and time-series data (hereinafter referred to as reference time-series data) related to a blinking serving as a reference (Furthermore, a condition where the eye tracking data is acquired may not be limited to the natural viewing conditions. In further embodiments, the eye tracker 112 may acquire the eye tracking data from the person P while the person P performs a task, such as driving [0023]. The mental fatigue estimation model 200C shown in FIG. 2C may receive a series of feature frames, each of which includes the base features BF(i) and extended features EF(i) calculated from each corresponding part of the eye tracking stream data within a predetermined time window [0064]. The eye tracking data acquired by the eye tracker 112 may include information of pupil, information of gaze and/or information of blink. The feature extractor 130 shown in FIG. 1 may be configured to extract eye movement features from the information of the pupil, the information of the gaze and/or the information of the blink as the base features. The feature extractor 130 may be further configured to extract other eye movement features from the information of the pupil as the one or more extended features [0038].
A computer-implemented method for estimating a mental state of a target individual includes obtaining first time series data representing pupil dynamics of one eye and second time series data representing pupil dynamics of the other eye from the target individual, analyzing the first and second time series data to extract a feature of the eye movement, in which the feature represents a relationship of the pupil dynamics between the one eye and the other eye, and estimating the mental state of the target individual using the feature of the eye movement [abstract]).
Regarding claim 2, Yamada discloses the state estimation device according to claim 1, wherein the state estimation circuitry comprises an index value calculation circuitry that calculates an index value (hereinafter referred to as a synchronization index value) indicating a degree of synchronization between the subject time-series data and the reference time-series data from the subject time-series data and the reference time-series data, and a state estimation result generation circuitry that generates the state estimation result from the synchronization index value (In an embodiment, the coordination relationship may be calculated as a phase synchronization index between the time series data of the left eye and the time series data of the right eye [0043]. The feature extractor 130 may analyze the time series data of the pupil diameter of the left eye and the time series data of the pupil diameter of the right eye to extract the phase synchronization index or correlation value as the one or more extended features. The obtained phase synchronization index or correlation value may be used as a part of or whole of explanatory variables of the mental fatigue estimation model 200. The phase synchronization index, which may be a measure of the coordination relationship in a rather short time scale, can be used as the one or more extended features in comparison with the correlation value [0047]).
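For illustration only (this sketch is the examiner's, not part of the Yamada reference): a phase synchronization index of the kind Yamada describes for two pupil-diameter time series is commonly computed as a phase-locking value, assuming Hilbert-transform phase extraction. The function name and the synthetic test signals below are hypothetical.

```python
import numpy as np
from scipy.signal import hilbert

def phase_synchronization_index(x, y):
    """Phase-locking value between two equal-length time series.

    Instantaneous phases are taken from the analytic signal
    (Hilbert transform); the index is the magnitude of the mean
    phase-difference vector, ranging from 0 (no phase locking)
    to 1 (perfect phase locking).
    """
    phase_x = np.angle(hilbert(x - np.mean(x)))
    phase_y = np.angle(hilbert(y - np.mean(y)))
    return float(np.abs(np.mean(np.exp(1j * (phase_x - phase_y)))))

# Two noisy sinusoids sharing a common frequency are strongly
# phase-locked; an independent noise signal is not.
t = np.linspace(0, 10, 1000)
rng = np.random.default_rng(0)
left = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(t.size)
right = np.sin(2 * np.pi * t + 0.3) + 0.1 * rng.standard_normal(t.size)
noise = rng.standard_normal(t.size)

print(phase_synchronization_index(left, right))  # near 1 (locked)
print(phase_synchronization_index(left, noise))  # much lower
```

A high value of this index for the left- and right-eye series corresponds to the "degree of synchronization" role the claimed synchronization index value plays.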
Regarding claim 4, Yamada discloses the state estimation device according to claim 1, wherein the predetermined task is a task for going around a certain specific course or a task for performing a specific operation at predetermined timing (The eye tracking data was acquired from each participant while the participant was watching a video clip of 5 minutes before and after doing a mental calculation task of approximately 35 minutes by hearing questions, which required no visual processing. Each 5-min phase for video watching consisted of nine short video clips of 30 seconds. The eye tracking data of each 30 seconds obtained between breaks was used as one sample. The states of the mental fatigue of the participants were confirmed by observing statistically significant increment in both of subjective measure (0-10 rating scales) and objective measure (pupil diameter). The eye tracking data collected before the mental calculation task was labelled as "non-fatigue" and the eye tracking data collected after the task was labelled as "fatigue" [0067]).
Regarding claim 5, Yamada discloses a state estimation method comprising: a time-series data acquisition step of acquiring time-series data (hereinafter referred to as subject time-series data) related to a blinking of a subject during a predetermined task by a state estimation device; and a state estimation step of generating a state estimation result indicating a psychological/cognitive state of the subject from the subject time-series data and time-series data (hereinafter referred to as reference time-series data) related to a blinking serving as a reference by the state estimation device (Fig. 1; In further embodiments, the eye tracker 112 may acquire the eye tracking data from the person P while the person P performs a task, such as driving [0023]. The eye tracking data acquired by the eye tracker 112 may include information of pupil, information of gaze and/or information of blink. The feature extractor 130 shown in FIG. 1 may be configured to extract eye movement features from the information of the pupil, the information of the gaze and/or the information of the blink as the base features [0038]. FIG. 1 illustrates a block/flow diagram of a mental fatigue estimation system 100. As shown in FIG. 1, the mental fatigue estimation system 100 may include an eye tracking system 110, a raw training data store 120, a feature extractor 130, a training system 140, a model store 150, and an estimation engine 160 [0020]. FIG. 3 illustrates schematic examples of time series data representing pupil dynamics obtained from both eyes of a person, which can be used to extract one or more extended features according to an embodiment of the present invention [0012]).
Regarding claim 6, Yamada discloses a non-transitory computer-readable storage medium which stores a program for causing a computer to function as the state estimation device according to claim 1 (A computer program product for estimating a mental state of a target individual, the computer program product comprising a non-transitory computer readable storage medium having program instructions embodied therewith, the program instructions executable by a computer to cause the computer to perform the method of claim 1 [claim 20]).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Yamada in view of Daikokuya (JP 2011210103 A).
Regarding claim 3, Yamada discloses the state estimation device according to claim 2, but fails to explicitly disclose wherein the state estimation result indicates a degree of concentration of the subject, and indicates that the degree of concentration of the subject is higher as the synchronization index value is higher.
However, Daikokuya teaches "The driver driving concentration degree determination unit 19 compares the occurrence frequency of the cluster blink received from the cluster blink occurrence frequency calculation unit 17 and the threshold value stored in the storage unit 25 to determine the degree of concentration of the driver for driving. Specifically, the driver driving concentration degree determination unit 19 determines that the concentration of the driver with respect to driving has decreased when the occurrence frequency of the cluster blink of the driver is higher than the threshold stored in the storage unit 25. On the other hand, the driver driving concentration degree determination unit 19 determines that the concentration degree of the driver with respect to driving is not lowered when the occurrence frequency of the cluster blink of the driver is lower than the threshold stored in the storage unit 25. When the driver driving concentration degree determining unit 19 determines that the driver's concentration on driving is decreasing, the alarm device 11 is activated (see attached translation, page 4, paragraph 1)".
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to configure the mental state estimation method of Yamada with the concentration degree determination of the driver state measurement device of Daikokuya. Doing so would specify concentration as one of the factors estimated from the blinks of a subject during mental state evaluation.
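For illustration only (this sketch is the examiner's, not part of the Daikokuya reference): the threshold comparison Daikokuya describes reduces to a simple rule, where a cluster-blink occurrence frequency above the stored threshold indicates decreased concentration and triggers the alarm. The function names and return values below are hypothetical placeholders for the cited units 19, 25, and 11.

```python
def concentration_decreased(cluster_blink_rate: float, threshold: float) -> bool:
    """Determination unit 19 (paraphrased): a cluster-blink occurrence
    frequency above the stored threshold (storage unit 25) indicates
    that the driver's concentration on driving has decreased."""
    return cluster_blink_rate > threshold

def check_driver(cluster_blink_rate: float, threshold: float) -> str:
    # When concentration is judged to have decreased, the alarm
    # device 11 is activated; the string "alarm" stands in for
    # that hardware call in this sketch.
    if concentration_decreased(cluster_blink_rate, threshold):
        return "alarm"
    return "ok"
```

Note the mapping to claim 3 is inverse in sign: Daikokuya associates a *higher* blink frequency with *lower* concentration, whereas the claimed synchronization index value is *higher* when concentration is *higher*.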
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MARIA CATHERINE ANTHONY whose telephone number is (703)756-4514. The examiner can normally be reached 7:30 am - 4:30 pm, EST, M-F.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, CARL LAYNO, can be reached at (571) 272-4949. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MARIA CATHERINE ANTHONY/Examiner, Art Unit 3796
/CARL H LAYNO/Supervisory Patent Examiner, Art Unit 3796