Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on February 19, 2026 has been entered.
Response to Arguments
Applicant's arguments filed February 19, 2026 have been fully considered but they are not persuasive. Applicant's arguments are directed to the amendments of independent claim 1 and, similarly, of independent claims 16 and 19. Examiner respectfully submits that these new limitations are properly addressed by the cited sections of Wagner and Faulkner below. Therefore, Examiner respectfully submits the claims stand rejected.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-2, 4-6, 8-9, 16-17 and 19-29 are rejected under 35 U.S.C. 103 as being unpatentable over Wagner et al., US 2016/0313801 A1 (hereinafter “Wagner”) in view of Faulkner et al., US 2021/0096726 A1 (hereinafter “Faulkner”).
Regarding claim 1, Wagner discloses a non-transitory, computer-readable storage medium including instructions (see [0065] computer system that manipulates or transforms quantities within… “non-transitory storage medium… that may store instructions”) that, when executed by an electronic device (see [0065] “electronic computing device”), cause performance of operations for:
causing presentation of application content associated with an application (see FIG. 2 and computerized device 29, [0081-0083, 0088-0090] and [0143]) to a user ([0143] “user”; FIGS. 12A-12B hand 150 of user) of a wrist-wearable device (see FIGS. 1A-1B, 10; FIGS. 12A-12B with smart watch 160 and smart wrist 165), the wrist-wearable device including a biopotential-signal-sensing component (FIGS. 1A and 1B with bio-potential sensors 12);
in response to receiving a first indication (surface nerve conduction signals from electrodes 12 and 16 at [0150]-[0154]), based on data from the biopotential-signal-sensing component (FIGS. 12A-B and [0150]-[0154]), that a thumb of the user is moving in a first lateral direction along a portion of a hand of the user (FIGS. 11B, 11D, 11F thumb gesture movements in one of multiple directions [0142]-[0149]), causing presentation of different application content (e.g., an icon) associated with the application (see FIGS. 11A-F and [0143]-[0144] describing controlling the computerized device therein); and
in response to receiving a second indication (FIG. 11B arrows as indicated), based on data from the biopotential-signal-sensing component (FIGS. 12A-B and [0150]-[0154]), that the thumb of the user is moving in a second direction, distinct from the first direction (FIG. 11B illustrating arrows with multiple directions of movement, any one of which may be the first direction, with any different one being the second direction distinct from the first), along the portion of the phalange of the hand of the user (FIG. 11B, same arrows and directions of movement), causing presentation of other application content associated with another application, the other application distinct from the application (see at least FIGS. 14A-14D illustrating 210 as a first movement of the thumb for operation of a cursor, FIG. 14A and [0159]; a second thumb movement shown in FIGS. 14C and 14D and [0161]-[0163] for volume changing of another application);
However, Wagner does not explicitly disclose causing presentation at a head-wearable device; and
while the application content associated with the application is being presented at the head-wearable device:
thumb moving in a lateral direction of a phalange, wherein thumb movements in the first direction are associated with intra-application interactions that are directed to the application content being presented at the head-wearable device,
wherein thumb movements in a second direction are associated with inter-application interactions that are not directed to a specific application that is currently being presented at the head-wearable device, causing presentation, at the head-wearable device, of different application content associated with the application, and of other application content associated with another application instead of the different application content associated with the application;
and in response to receiving a third indication, based on data from the biopotential-signal-sensing component, that the thumb of the user is providing a press contact against the portion of the phalange of the user in a third direction that is different than the first direction and the second direction, causing selection of a portion of the other application content associated with the other application.
In the same field of endeavor, Faulkner discloses a gesture input device using the thumb movements (FIGS. 7A-C and [0122]) for causing presentation at a head-wearable device (HMD at [0042], [0052], [0055] display generation component 120 is an HMD; FIGS. 7A-7N and [0119]-[0127]); and
while the application content associated with the application is being presented at the head-wearable device (Faulkner at FIGS. 7A-C and menu 7170 at [0119]-[0122] presented at the HMD): thumb moving in a lateral direction of a phalange (Faulkner at FIGS. 7A-C, lateral movements at 7130 at [0109]-[0112], generally movement and contact of the thumb with a phalanx of the digits of the hand), wherein thumb movements in the first direction are associated with intra-application interactions that are directed to the application content being presented at the head-wearable device (FIGS. 7A-7G with thumb movements, describing intra-application (e.g., within an application) interactions at [0121]-[0122], thumb movements being associated directly with operations within a specific application (e.g., play/pause, fast-forward, etc.), and additionally at [0167] and [0207]-[0208] describing swipe gestures along an axis of the thumb for intra-application navigation inputs),
wherein thumb movements in a second direction are associated with inter-application interactions that are not directed to a specific application that is currently being presented at the head-wearable device (FIGS. 7A-8 and [0107]-[0122] describing gestures that are global changes such as volume and brightness at [0036], and [0207]-[0208] describing adjustments of system settings of a device based on a directional input substantially perpendicular to a first direction),
causing presentation, at the head-wearable device, of different application content associated with the application (Faulkner at FIGS. 7A-C and menu 7170 at [0119]-[0122] presented at the HMD, describing navigation within the application therein at [0121]), and of other application content associated with another application instead of the different application content associated with the application (Faulkner at FIGS. 7A-7C and [0119]-[0122] describing icons for launching the corresponding applications and controls within an application upon selection at [0121]). Further, in response to receiving a third indication (Faulkner at FIGS. 7A-7G and [0107]-[0122], tap of thumb on index, specifically 7110), based on data from the biopotential-signal-sensing component (as suggested by Faulkner at [0244] and [0256], wrist-worn sensor detection, in view of Wagner above), that the thumb of the user is providing a press contact against the portion of the phalange of the user in a third direction that is different than the first direction and the second direction (Faulkner at FIG. 7B and [0107]-[0122], tap of thumb on index and specifically the 7110 direction, further at [0208]), causing selection of a portion of the other application content associated with the other application (Faulkner at FIGS. 7A-C and [0107]-[0122], and further at [0207]-[0208] describing selection in a list of items based on a directional input of the thumb different from a first and second direction and producing a third result).
Before the effective filing date, it would have been obvious to a person of ordinary skill in the art to modify the biopotential input sensor detection of Wagner to incorporate the thumb movement input determination and outputted display as disclosed by Faulkner because the references are within the same field of endeavor, namely, input devices using finger gesture input. The motivation to combine these references would have been to improve intuitive interfacing with display elements and achieve greater efficiency (see Faulkner at least at [0007] and [0207]). Therefore, a person of ordinary skill in the art would have been motivated to combine the prior art to achieve the claimed invention, and there would have been a reasonable expectation of success.
Regarding claim 2, Wagner in view of Faulkner discloses the non-transitory, computer-readable storage medium of claim 1 (see above), wherein the first direction is substantially perpendicular to the second direction (see FIG. 11B [0146] and further FIGS. 14A and 14D described at [0159]-[0162], noting that the arrows in FIG. 11B are perpendicular as illustrated).
Regarding claim 4, Wagner in view of Faulkner discloses the non-transitory, computer-readable storage medium of claim 1 (see above), wherein the other application content is caused to be presented with a focus selector directed to a selectable user interface element within the other application content (Faulkner at FIG. 7B and item 7190 and selection indicator 7198 and [0122]), and the selection of the portion of the other application content associated with the other application includes an operation associated with the selectable user interface element (Faulkner at FIG. 7A-7F and [0107]-[0122] tap of thumb on index or other fingers to produce separate outputs accordingly, such as selecting within an application, further at [0207]-[0209]).
Regarding claim 5, Wagner in view of Faulkner discloses the non-transitory, computer-readable storage medium of claim 4 (see above), wherein the instructions, when executed by the electronic device, further cause performance of operations for: while the other application content is being presented: in conjunction with detecting the third indication that the thumb of the user is providing the press contact (Faulkner at FIGS. 7A-7G [0120]-[0122]), causing an adjustment to a visual characteristic used to present the selectable user interface element as part of the other application content currently being displayed to the user (Faulkner at FIGS. 7A-7G [0120]-[0122] and [0131] the further displayed content).
Regarding claim 6, Wagner in view of Faulkner discloses the non-transitory, computer-readable storage medium of claim 5 (see above), wherein: causing the adjustment to the visual characteristic used to present the selectable user interface element includes causing presentation of a zoomed-in selection interface (see Faulkner at least FIGS. 7A-7G and [0152]-[0156] and [0254] describing zooming in on the selected element); and the zoomed-in selection interface includes selectable user interface elements associated with a set of actions presented within a user interface, including the selectable user interface element that remains in focus within the zoomed-in selection interface (see Faulkner at least FIGS. 7A-7G and [0152]-[0156] and [0254] describing zooming in on the selected element).
Regarding claim 8, Wagner in view of Faulkner discloses the non-transitory, computer-readable storage medium of claim 4 (see above), wherein the instructions, when executed by the electronic device, further cause performance of operations for: while the other application content is being presented, and before causing performance of the operation associated with the selectable user interface element: in response to receiving a fourth indication, via the biopotential-signal-sensing component (taught above by Wagner and also suggested by Faulkner at [0244] and [0256] wrist-worn sensor detection), that the thumb of the user is providing a press contact against another portion of the hand of the user, distinct from the portion (see Faulkner FIG. 7A, 7138-7166 [0110]-[0113]), causing performance of a secondary operation, distinct from the operation, associated with the selectable user interface element (see Faulkner FIG. 7A, 7138-7166 [0110]-[0114]).
Regarding claim 9, Wagner in view of Faulkner discloses the non-transitory, computer-readable storage medium of claim 4 (see above), wherein the instructions, when executed by the electronic device, further cause performance of operations for: while the other application content is being caused to be presented to the user: in response to receiving a fifth indication (see Faulkner FIG. 7A, 7138-7166 [0110]-[0113]), based on a sensor distinct from the biopotential-signal-sensing component (taught above by Wagner and also suggested by Faulkner at [0244] and [0256], wrist-worn sensor detection), of a translational movement of the hand, causing movement of the focus selector to be directed to a different selectable user interface element, distinct from the selectable user interface element, within the other application content (see Faulkner at least FIGS. 7A-7G and [0148] and [0152]-[0156] and [0254] describing movements of the hand as potential additional input in addition to the thumb).
Regarding claim 16, it is similar in scope to claim 1 above, the only difference being that claim 16 is directed to a method of using thumb-based gestures to navigate within and between applications (see Wagner methods of use at FIGS. 15-16, further illustrating thumb-based gestures at least at FIGS. 11A-11F and FIGS. 14A-D generally). Therefore, claim 16 is similarly analyzed and rejected as claim 1 above.
Regarding claim 17, it is similar in scope to claim 2 above; therefore, claim 17 is similarly analyzed and rejected as claim 2.
Regarding claim 19, it is similar in scope to claim 1 above, the only difference being that claim 19 is directed to a wrist-wearable device (Wagner at FIGS. 1A-1B and [0077], flexible user interface 10, and FIGS. 11-13 and [0151]-[0154], smart wrist straps 165), comprising: one or more processors (Wagner FIGS. 2-3, signal processor [0083]); and memory (Wagner at FIGS. 2-3, 4A with flash memory 24 at [0084]) including instructions (Wagner at [0139]) that, when executed by the one or more processors, perform the steps of claim 1. Therefore, claim 19 is similarly analyzed and rejected as claim 1 above.
Regarding claim 20, it is similar in scope to claim 2 above; therefore, claim 20 is similarly analyzed and rejected as claim 2.
Regarding claim 21, it is similar in scope to claim 4 above; therefore, claim 21 is similarly analyzed and rejected as claim 4.
Regarding claim 22, it is similar in scope to claim 5 above; therefore, claim 22 is similarly analyzed and rejected as claim 5.
Regarding claim 23, it is similar in scope to claim 6 above; therefore, claim 23 is similarly analyzed and rejected as claim 6.
Regarding claim 24, it is similar in scope to claim 8 above; therefore, claim 24 is similarly analyzed and rejected as claim 8.
Regarding claim 25, it is similar in scope to claim 9 above; therefore, claim 25 is similarly analyzed and rejected as claim 9.
Regarding claim 26, it is similar in scope to claim 4 above; therefore, claim 26 is similarly analyzed and rejected as claim 4.
Regarding claim 27, it is similar in scope to claim 5 above; therefore, claim 27 is similarly analyzed and rejected as claim 5.
Regarding claim 28, it is similar in scope to claim 6 above; therefore, claim 28 is similarly analyzed and rejected as claim 6.
Regarding claim 29, it is similar in scope to claim 8 above; therefore, claim 29 is similarly analyzed and rejected as claim 8.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Yan et al., US 12,158,992 B1: Abstract describing sensing thumb movements using neuromuscular signals for input on a user interface;
Huang et al., US 2023/0359320 A1: FIGS. 18A-20 and [0096]-[0122] and [0240]-[0250] describing thumb movement input gestures using an EMG sensor;
Whitmire et al., US 2024/0019938 A1: FIGS. 15A-16B and [0378]-[0428] with various gesture inputs including thumb gesture inputs using a neuromuscular-signal sensor on the wrist of the user.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SARVESH J. NADKARNI whose telephone number is (571)270-7562. The examiner can normally be reached 8AM-5PM M-F.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Benjamin C. Lee can be reached at (571)272-2963. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SARVESH J NADKARNI/Examiner, Art Unit 2629