DETAILED ACTION
This final Office action is in response to the amendment filed 2/20/2026. Claims 1-20 are currently pending.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Forlines et al., US 2019/0294258 (hereinafter Forlines).
Regarding claim 1, Forlines discloses a method comprising: obtaining image data of a first hand (Forlines’s par. 232: hand postures measured by camera or ToF-3D sensors) in accordance with an indication of peripheral input (Forlines’s par. 179-182, 199: upon hands/fingers in a typing or resting-on-keys state of the touch sensitive keyboard, further gestures are detected) from a user interaction (Forlines’s par. 179: hover) with a peripheral event (Forlines’s par. 179: typing) at a peripheral device (Forlines’s par. 179: keyboard); determining, based on the image data (Forlines’s par. 232: hand postures measured by camera or ToF-3D sensors), a hand pose of the first hand (Forlines’s par. 232: hand posture measured by camera, e.g. par. 186: palm on trackpad); determining a hand pose corresponding to an input gesture (Forlines’s par. 228: palm down, par. 168: curling, such as for closing per par. 183) associated with a gesture-based input action (Forlines’s par. 228: orchestra control, par. 183: selection based on hand closed or open); and in accordance with a determination that characteristics of the first hand (Forlines’s par. 186: fingers resting on keyboard) in the image data (Forlines’s par. 232: hand postures measured by camera) satisfy a peripheral use criterion (Forlines’s par. 179, 186: typing state): processing the peripheral input from the peripheral device (Forlines’s par. 179, 186: typing), and rejecting track input (Forlines’s par. 186: ignoring track input).
Forlines fails to explicitly disclose determining that the hand pose corresponds to an input gesture associated with a gesture-based input action, or explicitly rejecting the input gesture.
However, Forlines does disclose ignoring input by the palm resting below the keyboard when the user’s fingers are resting on the keyboard for typing (Forlines’s par. 186), and input gestures that correspond to the palm (Forlines’s par. 228: palm down for orchestra control, par. 168: curling as detected by separation between fingertip and palm such as hand closing/opening for selection per par. 183).
Therefore, it would have been obvious to one of ordinary skill in the art that Forlines’s palm input gestures (Forlines’s par. 168, 183, 228) are ignored when they are detected to correspond to the palm below the keyboard (Forlines’s par. 186), in order to obtain the predictable result of mitigating accidental input (Forlines’s par. 186).
By way of this combination, Forlines discloses a method comprising:
obtaining image data of a first hand (Forlines’s par. 232: hand postures measured by camera or ToF-3D sensors) in accordance with an indication of peripheral input (Forlines’s par. 179-182, 199: upon hands/fingers in a typing or resting-on-keys state of the touch sensitive keyboard, then further gestures are detected) from a user interaction (Forlines’s par. 179: hover) with a peripheral event (Forlines’s par. 179: typing) at a peripheral device (Forlines’s par. 179: keyboard);
determining, based on the image data (Forlines’s par. 232: hand postures measured by camera or ToF-3D sensors), a hand pose of the first hand (Forlines’s par. 232: hand posture measured by camera, e.g. par. 186: palm on trackpad);
determining that the hand pose (Forlines’s par. 186: palm resting below keyboard) corresponds to an input gesture (Forlines’s par. 186: palm resting below keyboard, which upon combination includes palm input gestures such as the palm down of par. 228, or the hand curling of par. 168, such as for closing per par. 183) associated with a gesture-based input action (Forlines’s par. 186, 228: orchestra control, par. 183: selection based on hand closed or open); and
in accordance with a determination that characteristics of the first hand (Forlines’s par. 186: fingers resting on keyboard) in the image data (Forlines’s par. 232: hand postures measured by camera) satisfy a peripheral use criterion (Forlines’s par. 179, 186: typing state):
processing the peripheral input from the peripheral device (Forlines’s par. 179, 186: typing), and
rejecting the input gesture (Forlines’s par. 186: ignoring track input when palm is detected below keyboard, which upon combination includes the input gesture of orchestra control associated with a palm down of par. 228, or the selection based on closing a hand of par. 183 associated with curling of par. 168).
Regarding claim 8, Forlines discloses a non-transitory computer readable medium (Forlines’s par. 61: [memory] in MCU, FPGA, CPU, GPU) comprising computer readable code executable by one or more processors (Forlines’s par. 61, 262) to:
perform the steps as explained for claim 1.
Regarding claim 15, Forlines discloses a system (Forlines’s par. 61, 262) comprising:
one or more processors (Forlines’s par. 61, 262); and
one or more computer readable media (Forlines’s par. 61: [memory] in MCU, FPGA, CPU, GPU) comprising computer readable code executable by the one or more processors (Forlines’s par. 61, 262) to:
perform the steps as explained for claim 1.
Regarding claims 2, 9 and 16, Forlines discloses wherein the input gesture (Forlines’s par. 186: accidental trackpad input) is detected based on hand tracking data from a current frame and one or more prior frames (Forlines’s par. 103-106: object tracking [finger] frame-to-frame).
However, Forlines fails to explicitly disclose that the hand tracking data is from a current frame of the image data and one or more prior frames. Because Forlines does disclose using cameras to measure hand postures to compute a hand skeleton model of the user’s hand for gesture recognition (Forlines’s par. 232), it would have been obvious to one of ordinary skill in the art that the hand tracking data (Forlines’s par. 103-106) is also from a current frame of the image data and one or more prior frames (Forlines’s par. 232: based on camera for gesture recognition, and thus using frames of image data to perform the hand tracking of par. 103-106), in order to obtain the predictable result of reliable gesture recognition, which would enable the gestures described with the keyboard (Forlines’s par. 232).
Regarding claims 3, 10 and 17, Forlines fails to explicitly disclose determining that a relationship between the input gesture and the peripheral event satisfies a cancellation criterion, wherein the peripheral event is processed further in response to a determination that the cancellation criterion fails to be satisfied.
However, Forlines does disclose a typing state cancellation criterion (Forlines’s par. 186: the condition “hand leaving to the right of the keyboard” which changes the state from typing to mouse, thus cancelling typing state) that is based on a horizontal movement of the right hand leaving the keyboard to the right (Forlines’s par. 186), and discretized states that are triggered based on hand conditions (Forlines’s par. 179).
Therefore, it would also have been obvious to one of ordinary skill in the art that the relationship between the input gesture of an accidental input to the trackpad (Forlines’s par. 186) and the peripheral event of typing when the fingers are resting on the keyboard (Forlines’s par. 186) is the right hand leaving the keyboard to transition to a mouse state (Forlines’s par. 186), and that the typing would continue to be processed in response to the transition to a mouse state not being triggered (Forlines’s par. 179, 186), in order to obtain the predictable result of transitioning between states only when triggered by the hand movement (Forlines’s par. 179, 186).
By way of this combination, Forlines further discloses:
determining that a relationship (Forlines’s par. 186: hand movement) between the input gesture (Forlines’s par. 186: accidental trackpad input caused by palm on trackpad) and the peripheral event (Forlines’s par. 179: typing state of the touch sensitive keyboard by fingers on keyboard) satisfies a cancellation criterion (Forlines’s par. 186: the condition “hand leaving to the right of the keyboard,” which changes the state from typing to mouse, thus cancelling the typing state), wherein
the peripheral event (Forlines’s par. 179: typing state of the touch sensitive keyboard) is processed further (Forlines’s par. 179, 186: typing) in response to a determination that the cancellation criterion (Forlines’s par. 186: hand leaving keyboard to indicate a mouse state) fails to be satisfied (Forlines’s par. 186: when the hand does not leave the keyboard, and thus the transition between states is not triggered per par. 179).
Regarding claims 4, 11 and 18, Forlines discloses wherein rejecting the input gesture (Forlines’s par. 186: ignoring track input) comprises disregarding a user input action (Forlines’s par. 186, 228: orchestra control, par. 183: selection based on hand closed or open) associated with the input gesture (Forlines’s par. 186: palm resting below keyboard, which upon combination includes palm input gestures such as the palm down of par. 228, or the hand curling of par. 168, such as for closing per par. 183).
Forlines fails to explicitly disclose wherein the input gesture is further processed to provide a graphical indication that the input gesture is detected.
However, Forlines does disclose graphical indications when input gestures are detected (Forlines’s par. 153: holographic visual feedback).
Therefore, it would have been obvious to one of ordinary skill in the art that Forlines’s input gesture (Forlines’s par. 186, 228: orchestra control, par. 183: selection based on hand closed or open) is further processed to provide a graphical indication that the input gesture is detected (Forlines’s par. 153: holographic visual feedback), in order to obtain the predictable result of visual feedback (Forlines’s par. 153).
Regarding claims 5, 12 and 19, Forlines discloses further comprising:
detecting a hand movement of the first hand (Forlines’s par. 186: horizontal movement of the hand to the right);
determining whether the hand movement satisfies a cancellation threshold (Forlines’s par. 186: the condition “hand leaving to the right of the keyboard” which changes the state from typing to mouse, thus cancelling typing state); and
in accordance with a determination that the hand movement satisfies the cancellation threshold (Forlines’s par. 186: the condition “hand leaving to the right of the keyboard” which changes the state from typing to mouse, thus cancelling typing state), determining that the first hand is in a gesture input mode (Forlines’s par. 179: hover or mouse state).
Regarding claims 6, 13 and 20, Forlines discloses further comprising:
determining a second hand pose for the first hand (Forlines’s Figs. 19 and par. 195-196: finger resting in the center of the K key);
determining, based on the second hand pose (Forlines’s Figs. 19 and par. 195-196: finger resting in the center of the K key), that the first hand is in a peripheral use mode (Forlines’s Figs. 18-19 and par. 194, 196: finger is resting on the keyboard, thus in a typing state);
detecting a second input gesture from the first hand determined to be in the peripheral use mode (Forlines’s Figs. 19 and par. 195-196: rocking of finger over the K key); and
in accordance with a determination that the second input gesture is allowable during the peripheral use mode (Forlines’s Figs. 19 and par. 194-196: rocking of finger over the K key is allowed in typing state which is entered when fingers rest on keyboard), processing a user input action associated with the second input gesture (Forlines’s par. 196: control direction of cursor based on rocking of finger over the K key).
Regarding claims 7 and 14, Forlines discloses wherein a second hand is determined to be in a gesture input mode (Forlines’s Figs. 16-17 and par. 192-193: e.g., the left hand is in a typing state because two fingers rest on the CTRL and ALT keys, and the right hand generates touch input events, e.g., a swiping right input action).
Response to Arguments
Applicant's arguments filed 2/20/2026 have been fully considered but they are not persuasive.
On the Remarks pg. 8, Applicant argues that Forlines’s touch pad sensitivity is disabled when a user’s fingers are determined to be resting on keys, and therefore, Forlines fails to disclose rejecting the input gesture when the first hand satisfies a peripheral use criterion.
The Office respectfully disagrees. The rejection above has been revised to address the amended limitations. Forlines discloses ignoring trackpad input, such as input entered by a palm on a trackpad below the keyboard, when the fingers are resting on the keyboard (par. 186); thus the input is detected but ignored. Furthermore, Forlines discloses palm input gestures in par. 168, 183, and 228. Therefore, it would have been obvious to one of ordinary skill in the art that the input being ignored at the trackpad (par. 186) includes input gestures associated with a palm (par. 168, 183, 228). See the rejection above for further details.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Liliana Cerullo whose telephone number is (571)270-5882. The examiner can normally be reached 8AM to 3PM MT.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Amr Awad can be reached at 571-272-7764. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/LILIANA CERULLO/Primary Examiner, Art Unit 2621