DETAILED ACTION
This action is responsive to the filing of 4/24/2024. Claims 1-20 are pending and have been considered below.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Allowable Subject Matter
Claims 3 and 17 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The following is an examiner's statement of reasons for allowance. The prior art of record fails to disclose an air gesture performed at three different predefined distances from three different surfaces, in combination with the other limitations recited within the claimed context. The claims present a combination of limitations that differs from the cited art, and there is no reasonable combination of references that would teach this combination.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-2, 4-16, and 18-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Davis (US 2019/0121522).
Claims 1, 19, and 20: Davis discloses a method, comprising:
at a first computer system (Fig. 55, computer system) that is in communication (Fig. 55: 20, Bus) with one or more input devices (Fig. 55: 30, Input Device(s)):
detecting, via the one or more input devices, a first air gesture (Fig. 10A: 420, finger pointing gesture); and
in response to detecting the first air gesture:
in accordance with a determination that the first air gesture (Fig. 10A: 420, finger pointing gesture) is performed relative to a first surface (Fig. 10A: front of the forearm), changing a first setting of a control (Fig. 10B, the control, ‘SEARCH’ is changed to being selected);
in accordance with a determination that the first air gesture (Fig. 10A: 420, finger pointing selection gesture) is performed relative to a second surface different from the first surface (Fig. 13A: back of the forearm), changing the first setting of the control (Fig. 10B-13A, the control, ‘SEARCH’ is changed to being selected); and
in accordance with a determination that the first air gesture (Fig. 10A: 420, finger pointing gesture) is performed relative to a third surface (Fig. 10A, palm of the hand surface) different from the first surface and the second surface (Fig. 10A, not forearm back / front surfaces), forgoing changing the first setting of the control (Fig. 10B, instead, the selection is of Local Restaurants, Italian, French, etc. controls, and not SEARCH.)
Claim 2: Davis discloses the method of claim 1, further comprising: after detecting the first air gesture, detecting, via the one or more input devices, a second air gesture separate from the first air gesture; and in response to detecting the second air gesture: in accordance with a determination that the second air gesture is performed relative to a fourth surface and that a first subject performed the second air gesture, performing a first operation; and in accordance with a determination that the second air gesture is performed relative to the fourth surface and that a second subject different from the first subject performed the second air gesture, performing a second operation different from the first operation (par. 612, AGUI system comprises an AGUI device 201 located on a table 1252 between two users 1002, 1020 that allows the two users to collaborate with the same user interface 534. AGUI system can use a single or multiple AGUI devices to present displays optimized for either or both of the two users. Both users can simultaneously use the system or it may be configured that one user may only view content while the other may both view and control content.)
Claim 4: Davis discloses the method of claim 1, wherein the first surface is separate from the first computer system, wherein the second surface is separate from the first computer system, and wherein the third surface is separate from the first computer system (Fig. 10B, forearm, back and front, and the palm, are separate from the computer system 101.)
Claim 5: Davis discloses the method of claim 1, wherein the first computer system includes the first surface, the second surface, or the third surface (Fig. 10B, the computer system 101 utilizes (includes) the forearm, back and front, and the palm surfaces.)
Claim 6: Davis discloses the method of claim 1, wherein the first setting of the control corresponds to a second computer system different from the first computer system, and wherein the second computer system includes the first surface, the second surface, or the third surface (Fig. 10B, the computer system 101, utilizes (includes) the forearm, back and front, and the palm surfaces.)
Claim 7: Davis discloses the method of claim 1, wherein the control corresponds to the first computer system (par. 931-932, FIG. 55 is a diagrammatic representation of an embodiment of a machine in the form of a computer system 1 of an AGUI device.)
Claim 8: Davis discloses the method of claim 1, wherein the control corresponds to a third computer system different from the first computer system (controlling other computer systems: par. 816-817, The AGUI system may also be programmed to only allow a specific user to interface with a device, appliance, object, vehicle or other system when an authorized user is present.)
Claim 9: Davis discloses the method of claim 1, further comprising: after detecting the first air gesture, detecting, via the one or more input devices, a third air gesture different from the first air gesture; and in response to detecting the third air gesture and in accordance with a determination that the third air gesture is performed relative to the first surface, changing a second setting of the control, wherein the second setting is different from the first setting (par. 772, a finger from the other hand might drag content from one finger to another, or from a finger to the palm area.)
Claim 10: Davis discloses the method of claim 9, wherein the third air gesture is a different type of air gesture than the first air gesture (par. 772, a finger from the other hand might drag content from one finger to another, or from a finger to the palm area.)
Claim 11: Davis discloses the method of claim 1, further comprising: after detecting the first air gesture, detecting, via the one or more input devices, a fourth air gesture different from the first air gesture; and in response to detecting the fourth air gesture: in accordance with a determination that the fourth air gesture is in a first direction, changing a third setting (par. 770, The user may slide a finger of the other hand over the control area of a virtual volume or brightness control to change the control's level); and in accordance with a determination that the fourth air gesture is a second direction different from the first direction, changing a fourth setting different from the third setting (par. 770, The user may slide a finger of the other hand over the control area of a virtual volume or brightness control to change the control's level.)
Claim 12: Davis discloses the method of claim 1, further comprising: after detecting the first air gesture, detecting, via the one or more input devices, a fifth air gesture different from the first air gesture; and in response to detecting the fifth air gesture: in accordance with a determination that the fifth air gesture is in a third direction, changing a fifth setting of a first device different from the first computer system (par. 770, The user may slide a finger of the other hand over the control area of a virtual volume or brightness control to change the control's level); and in accordance with a determination that the fifth air gesture is a fourth direction different from the third direction, changing a sixth setting of a second device different from the first device (par. 770, The user may slide a finger of the other hand over the control area of a virtual volume or brightness control to change the control's level.)
Claim 13: Davis discloses the method of claim 1, further comprising: in response to detecting the first air gesture and in accordance with the determination that the first air gesture is performed relative to the third surface, changing a seventh setting different from the first setting (Fig. 10B, instead, the selection is of Local Restaurants, Italian, French, etc. controls, and not SEARCH.)
Claim 14: Davis discloses the method of claim 1, wherein the control is a first control, the method further comprising: in response to detecting the first air gesture and in accordance with the determination that the first air gesture is performed relative to the third surface, changing a second control different from the first control (Fig. 10B, instead, the selection is of Local Restaurants, Italian, French, etc. controls, and not SEARCH.)
Claim 15: Davis discloses the method of claim 1, further comprising: after detecting the first air gesture, detecting, via the one or more input devices, a sixth air gesture different from the first air gesture; and in response to detecting the sixth air gesture: in accordance with a determination that the sixth air gesture is a third type of air gesture, changing an eighth setting of a third control; and in accordance with a determination that the sixth air gesture is a fourth type of air gesture different from the third type of air gesture, changing a ninth setting of a fourth control, wherein the fourth control is different from the third control (par. 568, The user may also use finger, hand, body, eye facial gesture and movements, voice command or other user commands and inputs; par. 770, The user may manipulate the interface using a finger of the other hand. The user may touch a virtual control to make a selection. The user may use a finger gesture, such as a flick of a fingertip, to make a selection. The user may slide a finger of the other hand over the control area of a virtual volume or brightness control to change the control's level. The user may tap the displayed virtual key of a keypad to enter that value.)
Claim 16: Davis discloses the method of claim 1, further comprising: after changing the first setting of the control in accordance with the determination that the first air gesture is performed relative to the first surface, detecting, via the one or more input devices, a first request to change a configuration of one or more surfaces (par. 597, Gestures may be employed by the user to indicate the exact position and size of the area of the table to be used as a display); after detecting the first request to change the configuration of the one or more surfaces, detecting, via the one or more input devices, a seventh air gesture different from the first air gesture (Fig. 10B, e.g. a gesture to select ‘DIRECTIONS’ or ‘STREET VIEW’); and in response to detecting the seventh air gesture and in accordance with a determination that the seventh air gesture is performed relative to the first surface (Fig. 10A: front of the forearm), forgoing changing the first setting of the control (Fig. 10B, instead, the selection is of ‘DIRECTIONS’ or ‘STREET VIEW’ and not SEARCH.)
Claim 18: Davis discloses the method of claim 1, wherein the one or more input devices include one or more cameras, and wherein detecting the first air gesture is performed using the one or more cameras (par. 101, 102, 115, cameras for determining interface inputs.)
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: Horowitz (US 2025/0390178), which discloses tracking hands and gestures.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANDREY BELOUSOV, whose telephone number is (571) 270-1695 and whose email address is Andrew.belousov@uspto.gov. The examiner can normally be reached Monday-Friday, EST.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Adam Queler, can be reached at telephone number 571-272-4140. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from Patent Center and the Private Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from Patent Center or Private PAIR. Status information for unpublished applications is available through Patent Center and Private PAIR for authorized users only. Should you have questions about access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) Form at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form.
/Andrey Belousov/
Primary Examiner
Art Unit 2172
1/14/2026