Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 1, 8, and 15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Clarkson (US 2013/0211843), hereinafter Clarkson.
Regarding claim 1, Clarkson teaches obtaining hand tracking data for a first hand based on sensor data (fig. 1, 110),
Methods, apparatuses, systems, and computer-readable media for performing engagement-dependent gesture recognition are presented. According to one or more aspects, a computing device may detect an engagement of a plurality of engagements, and each engagement of the plurality of engagements may define a gesture interpretation context of a plurality of gesture interpretation contexts. Subsequently, the computing device may detect a gesture. Then, the computing device may execute at least one command based on the detected gesture and the gesture interpretation context defined by the detected engagement. In some arrangements, the engagement may be an engagement pose, such as a hand pose, while in other arrangements, the detected engagement may be an audio engagement, such as a particular word or phrase spoken by a user. (Abstract)
wherein the first hand is associated with a current state from a group consisting of an active state and an inactive state [0030-0032] (fig. 2, limited detection and full detection) (fig. 3, 315, Y and N), and
[0030] FIG. 2 illustrates an example timeline showing how a computing device may switch from a limited detection mode into a full detection mode in response to detecting an engagement input in accordance with one or more illustrative aspects of the disclosure. As seen in FIG. 2, at a start time 205, a computing device, such as device 100, may be in a limited detection mode. In the limited detection mode, the device processes sensor data to detect an engagement input. However, in this mode, the device may not execute commands associated with user inputs available for controlling the device in the full detection mode. In other words, only engagement inputs are valid in the limited detection mode in some embodiments.
[Image: media_image1.png (greyscale)]
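For illustration only, and not as part of the Clarkson reference itself, the limited/full detection behavior Clarkson describes in [0030] can be sketched as a simple two-mode state machine; all names, inputs, and return strings below are the sketch's own assumptions:

```python
# Illustrative sketch (not part of the cited reference) of Clarkson [0030]:
# in the limited detection mode only an engagement input is processed, and
# detecting it switches the device into the full detection mode.

LIMITED, FULL = "limited", "full"

class DetectionModeDevice:
    """Hypothetical device model; names are illustrative, not Clarkson's."""

    def __init__(self):
        self.mode = LIMITED

    def process(self, sensor_input: str) -> str:
        if self.mode == LIMITED:
            # Only engagement inputs are valid in the limited detection mode.
            if sensor_input == "engagement":
                self.mode = FULL
                return "entered full detection mode"
            return "ignored"
        # In the full detection mode, other user inputs may be acted on.
        return f"processing input: {sensor_input}"

device = DetectionModeDevice()
assert device.process("swipe") == "ignored"          # not yet engaged
assert device.process("engagement") == "entered full detection mode"
assert device.process("swipe") == "processing input: swipe"
```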
wherein the active state is associated with first gesture detection parameters [0036, 0095] (fig. 3, Y to 340), and wherein the inactive state is associated with second gesture detection parameters (fig. 3, N back to 310) [0036-0040],
[0036] Continuing to refer to FIG. 2, once the computing device has entered the full detection mode, the computing device may detect one or more gestures. In response to detecting a particular gesture, the device may interpret the gesture based on the gesture interpretation context corresponding to the most recent engagement input. The recognizable gestures in the active gesture interpretation context may each be associated with a command. In this way, when any one of the gestures is detected as input, the device determines the command with which the gesture is associated, and executes the determined command. In some embodiments, the most recent engagement input may not only determine which commands are associated with which gestures, but the engagement input may be used to determine one or more parameters used to detect one or more of those gestures.
[Image: media_image2.png (greyscale)]
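Again for illustration only, and not as part of the Clarkson reference, the gesture interpretation contexts described in [0036] amount to selecting a gesture-to-command mapping based on the most recent engagement input; the contexts, gestures, and commands below are hypothetical:

```python
# Illustrative sketch (not part of the cited reference) of Clarkson [0036]:
# the most recently detected engagement selects which gesture-to-command
# mapping (gesture interpretation context) is active.

CONTEXTS = {
    # Hypothetical contexts and commands, for illustration only.
    "engagement_pose_A": {"swipe_left": "previous_track", "swipe_right": "next_track"},
    "engagement_pose_B": {"swipe_left": "scroll_up", "swipe_right": "scroll_down"},
}

def execute(engagement, gesture):
    """Return the command associated with a gesture under the context defined
    by the most recent engagement, or None if the gesture is not recognized."""
    context = CONTEXTS.get(engagement, {})
    return context.get(gesture)

# The same gesture yields different commands under different engagements.
assert execute("engagement_pose_A", "swipe_left") == "previous_track"
assert execute("engagement_pose_B", "swipe_left") == "scroll_up"
assert execute("engagement_pose_A", "pinch") is None
```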
detecting a first input gesture by the first hand based on the hand tracking data and in accordance with the current state (fig. 3, 315, 320, and 325) [0040-0045];
[Image: media_image3.png (greyscale)]
initiating an input action associated with the first input gesture in response to detecting the first input gesture based on the first gesture detection parameters (fig. 3, 340) or (Examiner notes the “or” operator requires only one element) the second gesture detection parameters in accordance with the current state of the first hand (fig. 3, 310) (fig. 9, 802-808) (Table A and Table B, [0043-0044]).
Regarding claim 8, see the rationale of claim 1. Clarkson teaches a non-transitory computer readable medium comprising computer readable code executable by one or more processors to (fig. 1, 110):
obtain hand tracking data for a first hand based on sensor data, wherein the first hand is associated with a current state from a group consisting of an active state and an inactive state [0030] (fig. 2, limited detection and full detection) (fig. 3, 315, Y and N), and
wherein the active state is associated with first gesture detection parameters [0036, 0095] (fig. 3, Y to 340), and wherein the inactive state is associated with second gesture detection parameters (fig. 3, N back to 310);
detect a first input gesture by the first hand based on the hand tracking data and in accordance with the current state (fig. 3, 315, 320, and 325); and
initiate an input action associated with the first input gesture in response to detecting the first input gesture based on the first gesture detection parameters (fig. 3, 340) or the second gesture detection parameters in accordance with the current state of the first hand (fig. 3, 310) [0040-0046].
Regarding claim 15, Clarkson teaches a system comprising: one or more processors (Abstract); and one or more computer readable media comprising computer readable code executable by the one or more processors to (fig. 5, 510 and 535) (fig. 1, 110):
obtain hand tracking data for a first hand based on sensor data, wherein the first hand is associated with a current state from a group consisting of an active state and an inactive state [0030-0036] (fig. 2, limited detection and full detection) (fig. 3, 315, Y and N), and
wherein the active state is associated with first gesture detection parameters [0036-0042, 0095] (fig. 3, Y to 340), and wherein the inactive state is associated with second gesture detection parameters (fig. 3, N back to 310);
detect a first input gesture by the first hand based on the hand tracking data and in accordance with the current state (fig. 3, 315, 320, and 325);
in accordance with a determination that the first hand is in the active state, initiate an input action associated with the first input gesture based on a first set of parameters (fig. 3, 340) (fig. 3, 310); and
in accordance with a determination that the first hand is in the inactive state, initiate the input action associated with the first input gesture in response to detecting the first input gesture based on a second set of parameters different from the first set of parameters (fig. 3, 340) (fig. 3, 310) (fig. 4, 405, 410, 420) [0040-0046].
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 2-6, 9-12, 14, 16-18, and 20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Clarkson in view of Zhou et al. (US 11,112,875), hereinafter Zhou.
Regarding claim 2, Clarkson fails to teach wherein a probability that the hand tracking data causes the input action to be initiated in the active state is greater than a probability that the hand tracking data causes the input action to be initiated in the inactive state.
However, Zhou teaches wherein a probability that the hand tracking data causes the input action to be initiated in the active state is greater than a probability that the hand tracking data causes the input action to be initiated in the inactive state (fig. 19a, 1904 Yes to 1910 and 1912, versus the lower-probability path 1904 No to 1906, ignore gesture) Zhou.
It would have been obvious to one of ordinary skill in the art to modify the teachings of Clarkson to further include wherein a probability that the hand tracking data causes the input action to be initiated in the active state is greater than a probability that the hand tracking data causes the input action to be initiated in the inactive state, as taught by Zhou, in order to provide improved methods and systems for gesture control of a device in a multi-user environment that make efficient use of computing resources, minimize confusion, and resolve interactions between hand gestures performed by more than one user (col. 1, lines 35-55), and in order to determine the user's intent.
Regarding claim 3, Clarkson fails to teach the method of claim 1, further comprising, while the first hand is in the active state: determining that an inactive criterion is satisfied; and in accordance with a determination that the inactive criterion is satisfied, determining that the first hand is in an inactive state.
However, Zhou teaches further comprising, while the first hand is in the active state: determining that an inactive criterion is satisfied; and in accordance with a determination that the inactive criterion is satisfied, determining that the first hand is in an inactive state (fig. 19a, 1902).
[Image: media_image4.png (greyscale)]
(196) At step 1906, a time-out timer is consulted to determine whether the hand gesture being performed outside of the activation zone has timed out by remaining outside of the activation zone for more than a predetermined period of time. If the gesture is determined at step 1906 to have timed out, then the hand gesture is ignored by the gesture recognition subsystem 322 until a reset condition is detected. In some examples, the reset condition may be that the gesture recognition subsystem 322 detects that the hand gesture is no longer being performed by the user's hand. Once the reset condition has been triggered, the hand gesture may once again be recognized by the gesture recognition subsystem 322 at step 1902. (col. 35, lines 35-50)
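For illustration only, and not as part of the Zhou reference, the time-out and reset behavior quoted above can be sketched as follows; the threshold value, class name, and return strings are assumptions of the sketch, not taken from Zhou:

```python
# Illustrative sketch (not part of the cited reference) of Zhou, col. 35,
# lines 35-50: a gesture held outside the activation zone longer than a
# threshold times out and is ignored until a reset condition (the gesture
# is no longer being performed) is detected.

TIMEOUT_SECONDS = 2.0  # hypothetical threshold; Zhou does not give a value

class GestureTimeout:
    def __init__(self):
        self.outside_since = None   # time the gesture left the activation zone
        self.ignoring = False       # True once the gesture has timed out

    def update(self, gesture_present: bool, in_zone: bool, now: float) -> str:
        if not gesture_present:
            # Reset condition: the gesture is no longer being performed.
            self.outside_since = None
            self.ignoring = False
            return "idle"
        if self.ignoring:
            return "ignored"
        if in_zone:
            self.outside_since = None
            return "recognized"
        if self.outside_since is None:
            self.outside_since = now
        if now - self.outside_since > TIMEOUT_SECONDS:
            self.ignoring = True
            return "ignored"
        return "pending"

g = GestureTimeout()
assert g.update(True, True, 0.0) == "recognized"
assert g.update(True, False, 1.0) == "pending"     # outside zone, not yet timed out
assert g.update(True, False, 3.5) == "ignored"     # timed out
assert g.update(True, True, 4.0) == "ignored"      # stays ignored until reset
assert g.update(False, True, 5.0) == "idle"        # reset condition triggered
assert g.update(True, True, 6.0) == "recognized"   # recognized again
```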
It would have been obvious to one of ordinary skill in the art to modify the teachings of Clarkson to further include, while the first hand is in the active state: determining that an inactive criterion is satisfied; and in accordance with a determination that the inactive criterion is satisfied, determining that the first hand is in an inactive state, as taught by Zhou, in order to determine the user's intent.
Regarding claim 4, Clarkson in view of Zhou teaches the method of claim 3, wherein the inactive criterion comprises a timeout (col. 35, lines 35-50, Zhou) (fig. 3, 330, Clarkson).
Regarding claim 5, Clarkson in view of Zhou teaches the method of claim 3, further comprising: transitioning the first hand from the inactive state to the active state in response to initiating the input action associated with the first input gesture based on the second gesture detection parameters (fig. 3, 305-340, Clarkson) (fig. 19a, 1900, 1910, Zhou).
Regarding claim 6, see the rationale of claim 3; Clarkson in view of Zhou teaches the method of claim 1, further comprising, while the first hand is in the inactive state: determining that an active criterion is satisfied; and in accordance with a determination that the active criterion is satisfied, determining that the first hand is in the active state (fig. 3, 340 to 320, Clarkson) (fig. 19a, 1900, 1910, Zhou).
Regarding claim 9, see the rationale of claim 2; Clarkson in view of Zhou teaches the non-transitory computer readable medium of claim 8, wherein a probability that the hand tracking data causes the input action to be initiated in the active state is greater than a probability that the hand tracking data causes the input action to be initiated in the inactive state (fig. 19a, 1904 Yes to 1910 and 1912, versus the lower-probability path 1904 No to 1906, ignore gesture) Zhou.
Regarding claim 10, see the rationale of claim 2; Clarkson in view of Zhou teaches the non-transitory computer readable medium of claim 8, further comprising computer readable code to, while the first hand is in the active state: determine that an inactive criterion is satisfied; and in accordance with a determination that the inactive criterion is satisfied, determine that the first hand is in an inactive state (fig. 19a, 1902 to 1904 to 1906, ignore) Zhou.
Regarding claim 11, see the rationale of claim 2; Clarkson in view of Zhou teaches the non-transitory computer readable medium of claim 10, further comprising computer readable code to: transition the first hand from the inactive state to the active state in response to initiating an input action based on the second set of parameters (fig. 19a, 1904 to 1906, No, to 1912) Zhou.
Regarding claim 12, see the rationale of claim 2; Clarkson in view of Zhou teaches the non-transitory computer readable medium of claim 8, further comprising computer readable code to, while the first hand is in the inactive state: determine that an active criterion is satisfied; and in accordance with a determination that the active criterion is satisfied, determine that the first hand is in the active state (fig. 19a, 1904, No, to 1906 to 1912) Zhou.
Regarding claim 14, see the rationale of claim 2; Clarkson in view of Zhou teaches the non-transitory computer readable medium of claim 8, wherein the first input gesture is performed by the first hand and a second hand, and wherein initiating the input action comprises transitioning the second hand to the active state (fig. 3, 340 to 320, Clarkson) (fig. 19a, 1900, 1912, Zhou).
Regarding claim 16, see the rationale of claim 2; Clarkson in view of Zhou teaches the system of claim 15, wherein a probability that the hand tracking data causes the input action to be initiated in the active state is greater than a probability that the hand tracking data causes the input action to be initiated in the inactive state (fig. 19a, 1904 Yes to 1910 and 1912, versus the lower-probability path 1904 No to 1906, ignore gesture) Zhou.
Regarding claim 17, see the rationale of claim 2; Clarkson in view of Zhou teaches the system of claim 16, wherein the inactive criterion comprises a timeout that causes the input action to be initiated in the inactive state (fig. 3, 330 to 310, Clarkson) (col. 35, lines 35-50, Zhou).
Regarding claim 18, see the rationale of claim 2; Clarkson in view of Zhou teaches the system of claim 16, further comprising computer readable code to: transition the first hand from the inactive state to the active state in response to initiating an input action based on the second set of parameters (fig. 19a, 1904 to 1906, No, to 1912) Zhou.
Regarding claim 20, see the rationale of claim 2; Clarkson in view of Zhou teaches the system of claim 15, wherein the first input gesture is performed by the first hand and a second hand, and wherein initiating the input action comprises transitioning the second hand to the active state ([0004, 0029, 0033], Clarkson) (fig. 19a, 1900, 1912, Zhou).
Claim(s) 7, 13, and 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Clarkson in view of Poupyrev (US 2020/0150776), hereinafter Poupyrev.
Regarding claim 7, Clarkson fails to teach the method of claim 1, further comprising: obtaining hand tracking data for a second hand based on the sensor data; detecting a second input gesture by the second hand based on the hand tracking data; determining whether the second hand is in an active state; and in accordance with a determination that the first hand is not in the active state, disregarding the second input gesture while performing the input action.
However, Poupyrev teaches further comprising: obtaining hand tracking data for a second hand based on the sensor data; detecting a second input gesture by the second hand based on the hand tracking data; determining whether the second hand is in an active state; and in accordance with a determination that the first hand is not in the active state, disregarding the second input gesture while performing the input action [0043-0046] (fig. 4, 408 and 410) (fig. 5, 502 and 504) Poupyrev.
It would have been obvious to one of ordinary skill in the art to modify the teachings of Clarkson to further include further comprising: obtaining hand tracking data for a second hand based on the sensor data; detecting a second input gesture by the second hand based on the hand tracking data; determining whether the second hand is in an active state; and in accordance with a determination that the first hand is not in the active state, disregarding the second input gesture while performing the input action, as taught by Poupyrev, because some gestures may be difficult to detect, to distinguish from actions that are not intended as gestures, and to determine a context for [0003].
Regarding claim 13, see the rationale of claim 7; Clarkson in view of Poupyrev teaches the non-transitory computer readable medium of claim 8, further comprising computer readable code to: obtain hand tracking data for a second hand based on the sensor data; detect a second input gesture by the second hand based on the hand tracking data; determine whether the second hand is in an active state; and in accordance with a determination that the first hand is not in the active state, disregard the second input gesture while performing the input action in accordance with the first input gesture [0043-0046] (fig. 4, 408 and 410) (fig. 5, 502 and 504) Poupyrev.
Regarding claim 19, see the rationale of claim 7; Clarkson in view of Poupyrev teaches the system of claim 15, further comprising computer readable code to: obtain hand tracking data for a second hand based on the sensor data; detect a second input gesture by the second hand based on the hand tracking data; determine whether the second hand is in an active state; and in accordance with a determination that the first hand is not in the active state, disregard the second input gesture while performing the input action [0043-0046] (fig. 4, 408 and 410) (fig. 5, 502 and 504) Poupyrev.
Response to Arguments
Applicant’s arguments with respect to claim(s) 1-20 have been considered but are moot because the new ground of rejection does not rely on any combination of references applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Examiner is open to an interview if Applicant thinks it would help advance prosecution.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to GRANT SITTA whose telephone number is (571)270-1542. The examiner can normally be reached M-F 7:30-4:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Patrick Edouard can be reached at 571-272-6084. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/GRANT SITTA/Primary Examiner, Art Unit 2622