Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-8, 10-13, 16-22, and 25 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Alcaide (US 2020/0337653).
Regarding claim 1, Alcaide discloses:
A computer-implemented method comprising: monitoring a gaze and a brain activity of a user interacting with a user interface; detecting that the gaze of the user is directed at an element of the user interface; detecting an expectancy wave in the brain activity of the user in temporal conjunction with the gaze at the element (see Fig. 1-4; [0055-0075]; monitor gaze via 302 and brain activity via BCI 110 to detect gaze of user and associated brain activity at an element on the user interface 271/371);
determining, based on the expectancy wave in temporal conjunction with the gaze, that the user intends to interact with the element; and performing processing related to, such as triggering, an interaction with the element in response to determining that the user intends to interact with the element (see Fig. 1-4; [0055-0075]; based on the temporal conjunction of brain activity and gaze, determine the user's intended object for interaction and perform such action as determined by the user's intent).
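For illustration only, the claimed flow (a gaze fixation on an element combined with a temporally conjoint expectancy wave, triggering the interaction) can be sketched as follows. All class names, field names, and the conjunction window are hypothetical assumptions, not drawn from Alcaide's disclosure.

```python
from dataclasses import dataclass

# Hypothetical event records; field names are illustrative assumptions.
@dataclass
class GazeFixation:
    element_id: str
    t_start: float  # seconds
    t_end: float    # seconds

@dataclass
class ExpectancyWave:
    t_peak: float   # seconds

def user_intends_interaction(fixation, wave, max_offset=0.5):
    """True when an expectancy wave occurs in temporal conjunction
    with the gaze fixation on a UI element (window is assumed)."""
    return (fixation.t_start - max_offset
            <= wave.t_peak
            <= fixation.t_end + max_offset)

def trigger_if_intended(fixation, waves, trigger):
    """Trigger the element's interaction only when some detected
    expectancy wave coincides with the fixation."""
    if any(user_intends_interaction(fixation, w) for w in waves):
        trigger(fixation.element_id)
        return True
    return False
```

The sketch separates detection (temporal conjunction) from action (triggering), mirroring the claim's two-step structure of determining intent and then performing the interaction.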
Regarding claim 2, the rejection of claim 1 is incorporated herein. Alcaide further discloses:
monitoring the gaze and the brain activity of the user comprises: recording the gaze and the brain activity of the user during a training period; and training, based on data gathered during the training period, a classifier that detects expectancy waves in the brain activity (see [0074-0075, 0097]; recording data during training period and classifying acquired signals accordingly).
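The training step recited in claim 2 (record during a training period, then train a classifier that detects expectancy waves) can be sketched as a toy threshold learner. The single mean-amplitude feature and the midpoint threshold are illustrative assumptions, not Alcaide's method.

```python
# Toy stand-in for the claimed training step: learn a threshold that
# separates "expectancy wave present" windows from baseline windows
# recorded during the training period.

def mean_amplitude(window):
    return sum(window) / len(window)

def train_expectancy_classifier(labeled_windows):
    """labeled_windows: list of (eeg_window, has_wave) pairs recorded
    during the training period. Returns predict(window) -> bool."""
    pos = [mean_amplitude(w) for w, y in labeled_windows if y]
    neg = [mean_amplitude(w) for w, y in labeled_windows if not y]
    # Midpoint between the class means as the decision threshold.
    threshold = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
    return lambda window: mean_amplitude(window) > threshold
```

In practice an EEG classifier would use richer features and a learned model; the point here is only the claimed train-then-detect structure.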
Regarding claim 3, the rejection of claim 2 is incorporated herein. Alcaide further discloses:
detecting the expectancy wave in the brain activity of the user comprises detecting the expectancy wave via the classifier (see [0097]).
Regarding claim 4, the rejection of claim 3 is incorporated herein. Alcaide further discloses:
recording the gaze and the brain activity of the user during the training period comprises: recording examples of voluntary attention by the user (see [0064]; (e.g., flashing targets));
and recording examples of involuntary attention by the user (see [0049, 0054]; (e.g., imagination tracking)).
Regarding claim 5, the rejection of claim 4 is incorporated herein. Alcaide further discloses:
the user interface is within a mixed reality environment (see [0053-0055]).
Regarding claim 6, the rejection of claim 5 is incorporated herein. Alcaide further discloses:
monitoring the gaze and the brain activity of the user interacting with the user interface within the mixed reality environment comprises monitoring via a mixed reality headset that displays the mixed reality environment (see [0053-0055]).
Regarding claim 7, the rejection of claim 5 is incorporated herein. Alcaide further discloses:
monitoring the gaze and the brain activity of the user interacting with the user interface within the mixed reality environment comprises monitoring via a brain-computer interface that is separate from a mixed reality headset that displays the mixed reality environment (see [0054]; BCI 110 sensors 108 separate from eye tracker 104).
Regarding claim 8, the rejection of claim 5 is incorporated herein. Alcaide further discloses:
triggering the interaction with the element comprises triggering a change within the mixed reality environment (see [0054]; selectable options within UI).
Regarding claim 10, the rejection of claim 6 is incorporated herein. Alcaide further discloses:
monitoring the brain activity of the user comprises monitoring the brain activity via one or more sensors that comprise an electro-encephalogram (see [0047]).
Regarding claim 11, the rejection of claim 10 is incorporated herein. Alcaide further discloses:
the one or more sensors comprise one or more electrodes placed on the scalp in proximity to at least one occipitoparietal region or occipital region of the user's brain (see Fig. 3d; [0047, 0069]).
Regarding claim 12, the rejection of claim 10 is incorporated herein. Alcaide further discloses:
the one or more sensors comprise one or more electrodes placed on the scalp in proximity to a primary motor cortex of the user's brain (see Fig. 3d; [0047, 0069]).
Regarding claim 13, the rejection of claim 10 is incorporated herein. Alcaide further discloses:
the one or more sensors comprise one or more electrodes placed on the scalp in proximity to each of: an occipital region of the user's brain; an occipitoparietal region of the user's brain; a parietal region of the user's brain; a temporal region of the user's brain; and a primary motor cortex of the user's brain (see Fig. 3d; [0047, 0069]).
Regarding claim 16, the rejection of claim 4 is incorporated herein. Alcaide further discloses:
detecting the expectancy wave in the brain activity of the user in temporal conjunction with the gaze comprises detecting the expectancy wave in temporal sequence with the gaze at the element (see [0064]).
Regarding claim 17, the rejection of claim 4 is incorporated herein. Alcaide further discloses:
detecting that the gaze of the user is directed at an additional element of the user interface; detecting a lack of the expectancy wave in the brain activity of the user in temporal conjunction with the gaze at the additional element; determining, based on the lack of the expectancy wave in temporal conjunction with the gaze, that the user does not intend to interact with the additional element; and preventing triggering an interaction with the additional element in response to determining that the user does not intend to interact with the additional element (see Fig. 8a; [0069, 0089-0090]; gaze detection provides signals, additional element is one of P tags, where 'lack of expectancy' in conjunction with 'additional element' becomes a non-target based on low scoring to then prevent interaction with the non-target (e.g., non-intention)).
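The negative path in claim 17 (a low combined gaze/EEG score marks an element as a non-target and suppresses the interaction) can be sketched as below. The product score and the 0.5 threshold are hypothetical, chosen only to show the suppression logic.

```python
# Hedged sketch of the non-target path: an element whose combined
# gaze/EEG score falls below a threshold is treated as a non-target
# and its interaction is prevented.

def score_element(gaze_conf, expectancy_conf):
    # Product score: both signals must be present for intent.
    return gaze_conf * expectancy_conf

def interact_if_target(element_id, gaze_conf, expectancy_conf,
                       trigger, threshold=0.5):
    """Trigger only above threshold; otherwise prevent the interaction."""
    if score_element(gaze_conf, expectancy_conf) >= threshold:
        trigger(element_id)
        return True
    return False  # lack of expectancy wave -> no interaction
```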
Regarding claim 18, the rejection of claim 4 is incorporated herein. Alcaide further discloses:
detecting that the gaze of the user is directed at an element of the user interface is determined by receipt and processing of gaze information from an eye-tracker component (see [0079-0083]).
Regarding claim 19, the rejection of claim 18 is incorporated herein. Alcaide further discloses:
as a function of the monitoring the gaze of the user while interacting with the user interface, the eye-tracker component processes the gaze of the user over a dwell time of 70-500ms to determine a gaze direction (see [0079, 0082-0083]; tracking time of oculomotor signals determined within the 70-500ms range).
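The dwell-time limitation of claim 19 amounts to a window check: a fixation counts as a gaze at an element only when its duration falls in the 70-500 ms range. A minimal sketch, with the sample format assumed for illustration:

```python
# Illustrative dwell-time filter for the claimed 70-500 ms range.
DWELL_MIN_MS = 70
DWELL_MAX_MS = 500

def gaze_directed_at(samples, min_ms=DWELL_MIN_MS, max_ms=DWELL_MAX_MS):
    """samples: list of (timestamp_ms, element_id) gaze samples on one
    element. Returns element_id when dwell time is in range, else None."""
    if not samples:
        return None
    element = samples[0][1]
    dwell = samples[-1][0] - samples[0][0]
    return element if min_ms <= dwell <= max_ms else None
```

The lower bound rejects saccade glances that are too brief to signal intent; the upper bound caps how long the component accumulates samples before deciding a direction.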
Regarding claims 20-22 and 25, these claims are rejected under the same rationale as claims 1, 5, 10, and 13, respectively.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KENNETH BUKOWSKI whose telephone number is (571)270-7913. The examiner can normally be reached Monday - Friday // 0730-1530.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Amr Awad, can be reached at 571-272-7764. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/kenneth bukowski/ Primary Examiner, Art Unit 2621