Detailed Action
Notice of Pre-AIA or AIA status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Objections
The Office objects to claims 1, 7–11, 13, and 14 for having the following informalities. Appropriate correction is required.
Claims 1, 13, and 14
On the fifth line of claim 1, the phrase “detecting gaze input” appears to be missing an article (e.g., “a” or “the”) before “gaze input.” Claims 13 and 14 contain a similar informality.
Claims 7–10
Claims 7–10 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Claim 11
Claim 11 lacks antecedent basis for “the third application” and “the fourth application.” It is believed that claim 11 was likely meant to depend from a different dependent claim than claim 1—possibly claim 8.
The Examiner also wishes to bring the following potential problem in the logic of claim 11 to the Applicant’s attention, but acknowledges there is no basis for an objection, because the language is both clear and fully disclosed in paragraph 409 of the Written Description.
As understood by the Examiner, the literal words of claim 11 and paragraph 409 require sending an indication of the air gesture to the third application’s processes without informing the fourth application, regardless of whether the most recent gaze was directed to the third application or the fourth application.
It is unclear why the method considers whether the user last gazed at the third application and whether the user last gazed at the fourth application, if the goal is to always route the gesture to the third application in both instances. However, as mentioned above, this is also what paragraph 409 discloses, so there is no basis for an objection or a rejection related to this issue, unless the Applicant believes there was a typographical error in the specification that propagated to the claims. The Examiner has no extrinsic evidence of such an error, so no objection will be made.
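For purposes of illustration only, the apparent redundancy described above may be sketched as follows. This sketch is not drawn from the record; the function, parameter, and application names are invented, and it merely restates the literal branching of claim 11 elements (2) and (3) in pseudocode form:

```python
# Hypothetical sketch (not from the record) of the literal routing logic
# recited in claim 11 and paragraph 409. All names are invented.
def route_air_gesture(gaze_target, eyes_closed_briefly):
    """Return which application's processes receive the gesture indication."""
    if eyes_closed_briefly and gaze_target == "third_application":
        return "third_application"   # element (2): send to third, not fourth
    if eyes_closed_briefly and gaze_target == "fourth_application":
        return "third_application"   # element (3): still third, not fourth
    return None  # neither contingent condition met; claim recites no other action

# Both gaze targets yield the same recipient, which is the apparent redundancy:
route_air_gesture("third_application", True)   # "third_application"
route_air_gesture("fourth_application", True)  # "third_application"
```

As the sketch shows, the gaze-direction determination has no effect on the outcome under the literal claim language.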
Allowable Subject Matter
Claims 7–10 are allowable, but for their dependence from a rejected claim. The following is a statement of reasons for the indication of allowable subject matter:
Each of these claims recites a variation of the invention that involves multitasking with multiple application user interfaces, essentially using the open or closed state of the one or more eyes to determine which application should ultimately receive an air gesture.
The prior art discloses user interfaces that modify the meaning of a gesture depending on whether a user’s eyes are open or closed as part of a gesture. See, e.g., U.S. Patent Application Publication Nos. 2017/0103574 A1 (discussed in the rejection of other claims below) and 2015/0338651 A1 (FIG. 5). Additionally, the technique of directing hand or air gestures to different user interfaces depending on the direction of an open-eyed gaze was also known prior to the claimed invention. See, e.g., U.S. Patent Application Publication Nos. 2012/0272179 A1 and 2017/0262168 A1. In another variation, U.S. Patent No. 9,483,113 B1 discloses a user interface in which other controllable user interfaces are made available when the user closes one eye, but the user must always close one eye in order to access the additional user interface.
There does not appear to be a single reference that discloses using the closed or open state of the eyes to direct gestures to one user interface over another. Nor is there any evidence of a sufficient reason one of ordinary skill in the art would have hybridized the two categories of prior art discussed above to create a system or method that uses the open/closed state of the eyes, instead of the direction of the user’s gaze, to decide to which user interface to direct input.
Claim Rejections – 35 U.S.C. § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. § 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1–6 and 11–14 are rejected under 35 U.S.C. § 102(a)(1) as being anticipated by U.S. Patent Application Publication No. 2017/0103574 A1 (“Faaborg”).
Claim 1
Faaborg discloses:
A method, comprising:
Reference is made to FIG. 5, which shows “[a] flowchart of [a] process” along with its results that are shown in FIGS. 4A–4D. Faaborg ¶ 38.
at a computer system that is in communication with a display generation component and one or more input devices:
“First, at block 510, a virtual immersive experience may be initiated by, for example, a first electronic device such as the HMD 100 shown in FIGS. 1 and 2A–2B.” Faaborg ¶ 38.
while presenting, via the display generation component, a user interface object, detecting gaze input directed to the user interface object;
“When a physical boundary of the real world space, or room, is detected by the system, at block 520, the system may generate an alert, at block 530,” such as “a visual indicator” possibly with “a grid overlaid on and/or extending from the virtual scene.” Faaborg ¶ 40. As presently claimed, either the alert or the virtual scene itself falls within the scope of the claimed “user interface object.”
At this time, an “optical tracking device 165” detects “the state of the user's eyes with respect to the virtual world,” Faaborg ¶ 39, in order to resolve the determination in block 540 (discussed below). Faaborg ¶ 41.
after detecting the gaze input directed to the user interface object, detecting, via the one or more input devices, an air gesture,
With or without performing the first command (i.e., of closing the user’s eyes), the system may detect when the user moves or turns his body (but responds differently depending on whether the user’s eyes were closed, as will be discussed below). See Faaborg ¶ 30 (“the user may move in the real world, and that real world movement may be translated into corresponding movement in the virtual world”) and ¶ 42 (“upon receiving the first command (for example, detection of the closing of the user's eyes), the user may initiate a turn, to physically re-orient in the real world space and allow for continued movement in the real world space and corresponding movement in the virtual world.”).
This real world movement falls within the scope of the claimed “air gesture,” based on the description and definition provided in paragraph 191 of the Applicant’s disclosure.
wherein the air gesture is detected before detecting that the gaze input is directed to a location that does not correspond to the user interface object; and
The movement in the real world may occur before the user “stop[s] walking, and closes his eyes.” Faaborg ¶ 33.
in response to detecting the air gesture:
in accordance with a determination that one or more eyes of the user were opened in conjunction with the air gesture being detected, performing an operation corresponding to the user interface object; and
Whenever the user’s eyes are open, the real world movements are translated into corresponding movement in the virtual world. See Faaborg ¶ 34; see also Faaborg FIG. 5 (illustrating that the pausing of virtual experience 550 does not occur until the first command (of closing the user’s eyes) is received).
in accordance with a determination that one or more eyes of the user were closed for more than a threshold amount of time in conjunction with the air gesture being detected, forgoing performing the operation corresponding to the user interface object.
In contrast, when “the system receives a first command, at block 540, the system may essentially pause activity in the virtual world experience, at block 550. This essential pause in activity in the immersive virtual experience may allow the user to shift, or re-orient, in the real world space, to allow for continued physical movement in the real world space, and corresponding movement in the virtual world.” Faaborg ¶ 41. The “first command” may be a “detection of the closing of the user's eyes.” Faaborg ¶ 42.
Note that Faaborg also discloses the “threshold amount of time,” by explaining that “the optical tracking device 165 may detect the closing of the user's eyes as a deliberate closing intended to trigger a pause in activity in the virtual world . . . and may distinguish the deliberate closing and opening of eyes from an involuntary blink.” Faaborg ¶ 39. In other words, the threshold amount of time is the duration beyond which an eye closure is no longer an involuntary blink.
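For purposes of illustration only, the eye-state determination Faaborg describes may be sketched as follows. The names are invented and the threshold value is arbitrary, not taken from the reference:

```python
# Hypothetical sketch of Faaborg's eye-state determination (invented names;
# the threshold value is illustrative, not disclosed in the reference).
BLINK_THRESHOLD_S = 0.4  # closures shorter than this are treated as blinks

def classify_eye_closure(closed_duration_s):
    """Distinguish a deliberate closing (which pauses the virtual
    experience per blocks 540/550) from an involuntary blink (during
    which real-world movement continues to be translated)."""
    if closed_duration_s > BLINK_THRESHOLD_S:
        return "pause_virtual_experience"    # first command received
    return "continue_translating_movement"   # blink: movement still mapped

classify_eye_closure(0.1)  # "continue_translating_movement"
classify_eye_closure(1.5)  # "pause_virtual_experience"
```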
Claim 2
Faaborg discloses the method of claim 1, further comprising:
in response to detecting the air gesture: in accordance with a determination that one or more eyes of the user were not closed for more than the threshold amount of time in conjunction with the air gesture being detected, performing the operation corresponding to the user interface object.
Whenever the user’s eyes are open, the real world movements are translated into corresponding movement in the virtual world. See Faaborg ¶ 34; see also Faaborg FIG. 5 (illustrating that the pausing of virtual experience 550 does not occur until the first command (of closing the user’s eyes) is received).
Claim 3
Faaborg discloses the method of claim 1, further comprising:
in response to detecting the air gesture: in accordance with a determination that one or more eyes of the user were closed for more than the threshold amount of time in conjunction with the air gesture being detected, performing a second operation that is different from the operation corresponding to the user interface object.
“As the turn may be executed with the user's eyes closed, to facilitate an essentially seamless and continuous immersive virtual experience, in some embodiments, the system, for example, the HMD 100 and/or the handheld electronic device 102, may generate an alert indicating completion of the turn. As the turn is executed with eyes closed, this alert may include, for example, an audible indicator such as a tone and the like, and/or a physical indicator such as vibration and the like.” Faaborg ¶ 42.
Claim 4
Faaborg discloses the method of claim 3,
wherein the second operation does not correspond to the user interface object.
The second operation corresponds to generating an alert, rather than translating movement in the virtual world. Faaborg ¶ 42.
Claim 5
Faaborg discloses the method of claim 3, further comprising:
in response to detecting the air gesture: in accordance with a determination that one or more eyes of the user were not closed for more than the threshold amount of time in conjunction with the air gesture being detected, forgoing performing the second operation.
Whenever the user’s eyes are open, the real world movements are translated into corresponding movement in the virtual world. See Faaborg ¶ 34; see also Faaborg FIG. 5 (illustrating that the pausing of virtual experience 550 does not occur until the first command (of closing the user’s eyes) is received).
Notable for claim 5, the optical tracking device 165 “may distinguish the deliberate closing and opening of eyes from an involuntary blink,” Faaborg ¶ 39, thus disclosing the more specific language of the eyes being “not closed for more than the threshold amount of time,” rather than merely open.
Claim 6
Faaborg discloses the method of claim 3, further comprising:
in response to detecting the air gesture: in accordance with a determination that one or more eyes of the user were opened in conjunction with the air gesture being detected, forgoing performing the second operation.
Whenever the user’s eyes are open, the real world movements are translated into corresponding movement in the virtual world. See Faaborg ¶ 34; see also Faaborg FIG. 5 (illustrating that the pausing of virtual experience 550 does not occur until the first command (of closing the user’s eyes) is received).
Claim 11
Claim 11 recites three elements: (1) the method of claim 1, (2) “in response to detecting the air gesture . . . in accordance with a determination that one or more eyes of the user were closed for less than the threshold amount of time in conjunction with the air gesture being detected and a respective gaze input was directed to the third application before the one or more eyes of the user were closed, sending the indication that the air gesture was detected to one or more processes corresponding to the third application without sending the indication that the air gesture was detected to one or more processes corresponding to the fourth application,” and (3) “in accordance with a determination that one or more eyes of the user were closed for less than the threshold amount of time in conjunction with the air gesture being detected and the respective gaze input was directed to the fourth application before the one or more eyes of the user were closed, sending the indication that the air gesture was detected to one or more processes corresponding to the third application without sending the indication that the air gesture was detected to one or more processes corresponding to the fourth application.”
Faaborg discloses element (1) for the reasons given in the rejection of claim 1. Faaborg does not need to disclose elements (2) or (3) in order to anticipate claim 11, because elements (2) and (3) are each contingent limitations in a method claim that does not recite a required condition precedent for either element. See MPEP § 2111.04 (subsection II.).
In other words, element (2) says what must happen if there is a determination “that one or more eyes of the user were closed for less than the threshold amount of time in conjunction with the air gesture being detected and a respective gaze input was directed to the third application before the one or more eyes of the user were closed,” but claims 1 and 11 do not require such a condition in every instance of the method, so “sending the indication that the air gesture was detected to one or more processes corresponding to the third application without sending the indication that the air gesture was detected to one or more processes corresponding to the fourth application” is also not a required element of the claim. Likewise for element (3), but with respect to “the respective gaze input [being] directed to the fourth application before the one or more eyes of the user were closed.”
Accordingly, since Faaborg discloses each and every required element of claim 11, Faaborg anticipates the claim.
Claim 12
Faaborg discloses the method of claim 1, further comprising:
after detecting the gaze input directed to the user interface object, detecting, via the one or more input devices, a touch input directed to the user interface object; and
“In response to the alert, the user may use a pointing device, such as, for example a beam or ray emitted by the handheld electronic device, to point in a new direction (away from the wall).” Faaborg ¶ 29.
in response to detecting the touch input directed to the user interface object: in accordance with a determination that one or more eyes of the user were opened while the touch input was detected, performing the operation corresponding to the user interface object; and in accordance with a determination that one or more eyes of the user were closed for more than a threshold amount of time while the touch input was detected, performing the operation corresponding to the user interface object.
As an initial matter, it is noted that the language of claim 12 is italicized above to highlight that the method always performs the operation corresponding to the user interface object, regardless of whether the one or more eyes of the user were opened or closed for the threshold amount of time.
Faaborg likewise discloses that using the pointing device to point elsewhere within the virtual scene “caus[es] the current virtual environment to gradually fade out and the newly selected virtual environment to fade in.” Faaborg ¶ 29. All the while, “the optical tracking device 165 may detect and track optical gestures such as, for example eyelid movement associated with opening and/or closing of the user's eyes (e.g., closing for a threshold period of time and then opening, opening for a threshold period of time and then closing, closing and/or opening in particular pattern).” Faaborg ¶ 20. Whichever determination optical tracking device 165 makes with respect to the eyes, the outcome of the pointing device input is the same, which is what claim 12 requires.
Claims 13–14
Claims 13 and 14 recite an electronic device and a computer-readable medium, respectively, each of which is programmed to perform exactly the same method that claim 1 recites as being performed at a computer system that is in communication with a display generation component and one or more input devices. Therefore, claims 13 and 14 are rejected over the same findings and rationale as provided above for claim 1 (which include findings for the computer hardware in the same paragraphs).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Justin R. Blaufeld whose telephone number is (571)272-4372. The examiner can normally be reached M-F 9:00am - 4:00pm ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, James K Trujillo can be reached at (571) 272-3677. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
Justin R. Blaufeld
Primary Examiner
Art Unit 2151