Prosecution Insights
Last updated: April 19, 2026
Application No. 18/114,629

Gesture Tutorial for a Finger-Wearable Device
Current Status: Final Rejection (§103)

Filed: Feb 27, 2023
Examiner: STORK, KYLE R
Art Unit: 2128
Tech Center: 2100 — Computer Architecture & Software
Assignee: Apple Inc.
OA Round: 4 (Final)

Grant Probability: 64% (Moderate)
Expected OA Rounds: 5-6
Time to Grant: 4y 0m
With Interview: 92%

Examiner Intelligence

Career Allow Rate: 64% (554 granted / 865 resolved; +9.0% vs TC avg)
Interview Lift: +28.3% (strong), measured across resolved cases with interview
Typical Timeline: 4y 0m average prosecution; 51 applications currently pending
Career History: 916 total applications across all art units

Statute-Specific Performance

Statute   Rate     vs TC Avg
§101      14.9%    -25.1%
§103      58.5%    +18.5%
§102      12.1%    -27.9%
§112       6.1%    -33.9%

Black line = Tech Center average estimate. Based on career data from 865 resolved cases.
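One way to sanity-check this table: if each "vs TC avg" figure is a plain percentage-point difference (an assumption; the dashboard does not define the delta), the Tech Center baseline implied by each row can be recovered directly. A minimal sketch:

```python
# Recover the implied Tech Center baseline from the statute-specific table,
# assuming each "vs TC avg" figure is a plain percentage-point difference
# (an assumption; the dashboard does not define the delta).

ROWS = [  # (statute, examiner rate %, delta vs TC avg in points)
    ("§101", 14.9, -25.1),
    ("§103", 58.5, +18.5),
    ("§102", 12.1, -27.9),
    ("§112",  6.1, -33.9),
]

for statute, rate, delta in ROWS:
    tc_avg = rate - delta  # examiner rate minus delta gives the baseline
    print(f"{statute}: examiner {rate:4.1f}% vs TC avg {tc_avg:.1f}%")
```

Under that assumption, every row implies the same 40.0% baseline, which squares with the chart drawing a single black line for the Tech Center estimate.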

Office Action (§103, Final)
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This final office action is in response to the amendment filed 24 November 2025. Claims 1-15 and 17-22 are pending. Claims 1, 19, and 20 are independent claims.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-3, 7-15, 20, and 22 remain rejected under 35 U.S.C. 103 as being unpatentable over Martin et al. (WO 2019/229698, published 5 December 2019, hereafter Martin) in view of Marvin et al. (US 11423798, filed 22 March 2018, hereafter Marvin) and further in view of Chatterjee et al. (Gaze+Gesture: Expressive, Precise, and Targeted Free-Space Interactions, 2015, hereafter Chatterjee).
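As context for the claim mappings that follow: the selection mechanism the examiner repeatedly cites from Martin's paragraph 0110 amounts to a ray-object intersection test, where the pointing vector intersecting the object's coordinates is what satisfies the claimed proximity threshold. The sketch below is a minimal illustration of that reading, not Martin's actual implementation; the spherical object bound, the function name, and the sample values are all assumptions.

```python
# Minimal sketch of the examiner's reading of Martin para. 0110: a pointing
# vector is cast from the controller, and the "proximity threshold" is met
# when that vector intersects the space occupied by the virtual object.
# The spherical bound and all names/values here are illustrative assumptions.
import numpy as np

def gesture_selects(origin, direction, obj_center, obj_radius):
    """True when the pointing ray passes within obj_radius of obj_center."""
    d = direction / np.linalg.norm(direction)
    # Closest approach of the ray to the object's center, clamped to t >= 0
    t = max(0.0, float(np.dot(obj_center - origin, d)))
    closest = origin + t * d
    return float(np.linalg.norm(obj_center - closest)) <= obj_radius

# Detecting the intersection places the object in a "selected state".
if gesture_selects(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                   np.array([0.05, 0.0, 2.0]), 0.1):
    state = "selected"
```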
As per independent claim 1, Martin discloses a method, comprising: at an electronic device with one or more processors, a non-transitory memory, a display, and a communication interface provided to communicate with a finger-wearable device (paragraph 0047):

displaying, on the display, first instructional content that is associated with a first gesture, wherein the first instructional content includes a first object (paragraph 0103);

obtaining finger manipulation data from the finger-wearable device via the communication interface (paragraph 0073);

determining an engagement score that characterizes a level of user engagement with respect to the first object (paragraph 0104);

determining that the finger-wearable device performs the first gesture based on a function of the finger manipulation data (paragraph 0073);

in response to determining that the finger-wearable device performs the first gesture: determining that the engagement score indicates a respective location of the finger-wearable device satisfies a proximity threshold distance with respect to the first object (paragraph 0110: Here, a pointing controller is used to define a pointing vector in three-dimensional space. This pointing vector comprises a line from the pointing controller to a cursor, and a virtual object (first object) is selected when the pointing vector and the virtual object intersect. This intersection between the pointing vector and the coordinates occupied by the virtual object defines the proximity threshold. Once the intersection is detected, the object may be selected and placed into a "selected state" for manipulation); and

in accordance with a determination that the engagement score satisfies an engagement criterion, displaying, on the display, an indication indicating that the first gesture is directed to the first object to select the first object for manipulation (paragraph 0110, per the intersection mechanism explained above).

Martin fails to specifically disclose wherein the engagement score is based at least in part on eye tracking data from an eye tracker. However, Marvin, which is analogous to the claimed invention because it is directed toward measuring engagement via eye tracking, discloses an engagement score based at least in part on a sensor measuring eye tracking (Figure 7, item 700; column 7, lines 37-52: Here, engagement tracking of an inmate (user) is scored based upon a sensor tracking inmate (user) eye movement). It would have been obvious to one of ordinary skill in the art at the time of the applicant's effective filing date to have combined Marvin with Martin, with a reasonable expectation of success, as it would have enabled tracking multiple parameters, including eye movement, to determine engagement (Marvin: column 7, lines 49-52). This would have provided a more accurate engagement score than a single-parameter engagement score.

Further, Martin fails to specifically disclose determining that the engagement score satisfies at least one engagement criterion during performance of the first gesture.
However, Chatterjee, which is analogous to the claimed invention because it is directed toward combining gaze and gesture inputs, discloses determining that the engagement score satisfies at least one engagement criterion during performance of the first gesture (pages 133-134: Here, during a target acquisition phase, it is determined whether the engagement criterion (accuracy above a threshold value) is met to allow gesture actions to be performed on the selected target items (Section 4.1)). It would have been obvious to one of ordinary skill in the art at the time of the applicant's effective filing date to have combined Chatterjee with Martin-Marvin, with a reasonable expectation of success, as it would have allowed for applying actions associated with gestures to the object (Chatterjee: page 133).

As per dependent claim 2, Martin, Marvin, and Chatterjee disclose the limitations similar to those in claim 1, and the same rejection is incorporated herein. Martin further discloses wherein the finger manipulation data includes positional data, the method further comprising: identifying the respective location that corresponds to the finger-wearable device based on a function of the positional data (paragraph 0110).

As per dependent claim 3, Martin, Marvin, and Chatterjee disclose the limitations similar to those in claim 1, and the same rejection is incorporated herein. Martin discloses, in response to determining that the respective location satisfies the proximity threshold with respect to the first object, setting a user handedness value based on a function of the finger manipulation data (paragraph 0111).

As per dependent claim 7, Martin and Marvin disclose the limitations similar to those in claim 1, and the same rejection is incorporated herein. Martin discloses wherein the finger-wearable device is not viewable on the display (Figure 6E: Here, the finger-wearable device is placed on a user's finger and the device itself is not shown in the virtual space).

As per dependent claim 8, Martin, Marvin, and Chatterjee disclose the limitations similar to those in claim 1, and the same rejection is incorporated herein. Martin discloses wherein the content includes a representation of the finger-wearable device and wherein the representation of the finger-wearable device is based on the finger manipulation data (Figures 6A-6E; paragraph 0110: Here, a visual indicator is used to represent the finger-wearable device. Based upon user actions, such as pinching, the visual indicator selects a virtual object).

As per dependent claim 9, Martin, Marvin, and Chatterjee disclose the limitations similar to those in claim 1, and the same rejection is incorporated herein. Martin discloses: detecting that the finger-wearable device is proximate to the electronic device (paragraph 0047); and in response to detecting that the finger-wearable device is proximate to the electronic device, pairing the electronic device with the finger-wearable device and displaying the first instructional content (paragraphs 0110-0111).

As per dependent claim 10, Martin, Marvin, and Chatterjee disclose the limitations similar to those in claim 1, and the same rejection is incorporated herein. Martin discloses wherein the finger manipulation data includes sensor data associated with one or more sensors integrated in the finger-wearable device (Figure 8, item 812).
As per dependent claim 11, Martin, Marvin, and Chatterjee disclose the limitations similar to those in claim 10, and the same rejection is incorporated herein. Martin discloses wherein the sensor data includes positional data output from one or more positional sensors integrated in the finger-wearable device (Figure 10; paragraph 0115).

As per dependent claim 12, Martin, Marvin, and Chatterjee disclose the limitations similar to those in claim 1, and the same rejection is incorporated herein. Martin discloses wherein the sensor data includes contact intensity data output from a contact intensity sensor integrated in the finger-wearable device (Figure 8, item 812).

As per dependent claim 13, Martin, Marvin, and Chatterjee disclose the limitations similar to those in claim 1, and the same rejection is incorporated herein. Martin discloses: detecting a gesture change request associated with a second gesture (Figures 6A-6E; paragraph 0110); and in response to detecting the gesture change request, replacing the first instructional content with second instructional content, wherein the second instructional content is associated with the second gesture that is different from the first gesture, and wherein the second instructional content includes a second object (Figures 6A-6E; paragraph 0110: Here, a first gesture may be selection of an object. A change of gesture, such as dragging to move the object along the x-axis or scrolling to move the object along the z-axis, is received. This gesture is different than the gesture to select the object).

As per dependent claim 14, Martin, Marvin, and Chatterjee disclose the limitations similar to those in claim 13, and the same rejection is incorporated herein. Martin discloses wherein detecting the gesture change request is based on a function of the finger manipulation data (Figures 6A-6E; paragraph 0110: Here, the gesture to select an object is achieved by a user clicking their fingers together (Figure 6C). The second gesture is achieved by one of the user moving their hand (Figure 6D) or a user scrolling (Figure 6E)).

As per dependent claim 15, Martin, Marvin, and Chatterjee disclose the limitations similar to those in claim 1, and the same rejection is incorporated herein. Martin discloses obtaining, from the finger-wearable device via the communication interface, status information associated with the finger-wearable device (paragraph 0066). Martin fails to specifically disclose displaying, on the display, a status indicator indicating the status information associated with the finger-wearable device. However, the examiner takes official notice that it was notoriously well-known in the art at the time of the applicant's effective filing date to display a status indicator indicating status information. For example, cell phones display a status indicator that indicates the battery status of the device. It would have been obvious to one of ordinary skill in the art at the time of the applicant's effective filing date to have combined this well-known feature with Martin, with a reasonable expectation of success, as it would have allowed a user to view one or more status items related to the device. This would have facilitated better user interaction with the device.

As per dependent claim 17, Martin, Marvin, and Chatterjee disclose the limitations similar to those in claim 1, and the same rejection is incorporated herein. Martin discloses wherein the electronic device corresponds to a head-mountable device (HMD) (paragraph 0043).
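Of the mappings above, claims 13-14 describe a simple content swap: a gesture change request replaces the first instructional content and its object with second instructional content tied to a different gesture (select versus drag or scroll, on the examiner's reading of Figures 6A-6E). A hypothetical sketch of that behavior follows; the data structure and gesture names are invented for illustration and are not taken from Martin or the application.

```python
# Hypothetical sketch of the claims 13-14 behavior as mapped onto Martin
# Figs. 6A-6E: a gesture change request swaps the displayed instructional
# content for content tied to a different gesture. All names are invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class InstructionalContent:
    gesture: str  # the gesture this tutorial teaches
    obj: str      # the object the user practices on

TUTORIALS = {
    "select": InstructionalContent("select", "first object"),
    "drag":   InstructionalContent("drag",   "second object"),  # x-axis move
    "scroll": InstructionalContent("scroll", "second object"),  # z-axis move
}

def handle_gesture_change_request(current: InstructionalContent,
                                  requested: str) -> InstructionalContent:
    """Replace the displayed content when a different gesture is requested."""
    replacement = TUTORIALS[requested]
    return replacement if replacement.gesture != current.gesture else current

content = handle_gesture_change_request(TUTORIALS["select"], "drag")
assert content.obj == "second object"
```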
As per dependent claim 18, Martin, Marvin, and Chatterjee disclose the limitations similar to those in claim 1, and the same rejection is incorporated herein. Martin discloses, in response to determining that the finger-wearable device performs the first gesture: in accordance with a determination that the engagement score does not satisfy the engagement criterion, foregoing displaying the indication (paragraphs 0109-0110).

With respect to claims 19 and 20, these claims recite limitations substantially similar to those in claim 1. Claims 19 and 20 are similarly rejected.

As per dependent claim 22, Martin, Marvin, and Chatterjee disclose the limitations similar to those in claim 1, and the same rejection is incorporated herein. Martin discloses an indicator of a user gaze based upon eye tracking data (Figure 6A, item 602). Martin fails to specifically disclose determining that the engagement score satisfies at least one engagement criterion during performance of the gesture, or a focus indicator in the form of a ring. However, Chatterjee, which is analogous to the claimed invention because it is directed toward combining gaze and gesture inputs, discloses determining that the engagement score satisfies at least one engagement criterion during performance of the first gesture (pages 133-134: Here, during a target acquisition phase, it is determined whether the engagement criterion (accuracy above a threshold value) is met to allow gesture actions to be performed on the selected target items (Section 4.1)). It would have been obvious to one of ordinary skill in the art at the time of the applicant's effective filing date to have combined Chatterjee with Martin-Marvin, with a reasonable expectation of success, as it would have allowed for applying actions associated with gestures to the object (Chatterjee: page 133). Finally, the examiner takes official notice that the use of various focus indicator shapes, such as a ring, was notoriously well-known in the art at the time of the applicant's effective filing date. It would have been obvious to one of ordinary skill in the art at the time of the applicant's effective filing date to have combined this well-known feature with Martin-Marvin-Chatterjee, with a reasonable expectation of success, as it would have allowed for displaying a ring of focus instead of a pointing vector, thereby displaying the totality of the area engaged by a user instead of a single point.

Claim 4 remains rejected under 35 U.S.C. 103 as being unpatentable over Martin, Marvin, and Chatterjee, and further in view of Chen et al. (US 2019/0236344, published 1 August 2019, hereafter Chen).

As per dependent claim 4, Martin, Marvin, and Chatterjee disclose the limitations similar to those in claim 3, and the same rejection is incorporated herein. Martin discloses wherein the electronic device includes an extremity tracking system that outputs extremity tracking data (paragraphs 0110-0111). Martin fails to specifically disclose wherein the extremity tracking data is indicative of one or more locations that respectively correspond to one or more user extremities, and wherein setting the user handedness value is based on a comparison between the respective location that corresponds to the finger-wearable device and the one or more locations that respectively correspond to the one or more user extremities.
However, Chen, which is analogous to the claimed invention because it is directed toward determining user handedness, discloses wherein the extremity tracking data is indicative of one or more locations that respectively correspond to one or more user extremities, and wherein setting the user handedness value is based on a comparison between the respective location that corresponds to the finger-wearable device and the one or more locations that respectively correspond to the one or more user extremities (paragraph 0003). It would have been obvious to one of ordinary skill in the art at the time of the applicant's effective filing date to have combined Chen with Martin, with a reasonable expectation of success, as it would have enabled setting the handedness of a user to more accurately provide user inputs.

Claims 5-6 remain rejected under 35 U.S.C. 103 as being unpatentable over Martin, Marvin, and Chatterjee, and further in view of Heo et al. (KR 20200081529, published 8 July 2020, hereafter Heo).

As per dependent claim 5, Martin, Marvin, and Chatterjee disclose the limitations similar to those in claim 1, and the same rejection is incorporated herein. Martin discloses determining that the engagement score satisfies the engagement criterion based on the input (paragraphs 0110-0111). Martin fails to specifically disclose receiving an untethered input via the first input device. However, Heo, which is analogous to the claimed invention because it is directed toward identifying hand gesture inputs, discloses receiving an untethered input via the first input device (paragraphs 0012-0013). It would have been obvious to one of ordinary skill in the art at the time of the applicant's effective filing date to have combined Heo with Martin, with a reasonable expectation of success, as it would have enabled independent, untethered hand movement. This would have provided a user with a greater range of movement, as they would not be tethered.

As per dependent claim 6, Martin, Marvin, Chatterjee, and Heo disclose the limitations similar to those in claim 5, and the same rejection is incorporated herein. Martin further discloses determining that the engagement score satisfies the engagement criterion based on the first location satisfying a proximity threshold with respect to the first object (paragraphs 0110-0111). Additionally, Heo discloses identifying a first location based on a function of the untethered input (paragraphs 0012-0013). It would have been obvious to one of ordinary skill in the art at the time of the applicant's effective filing date to have combined Heo with Martin, with a reasonable expectation of success, as it would have enabled independent, untethered hand movement. This would have provided a user with a greater range of movement, as they would not be tethered.

Claim 21 remains rejected under 35 U.S.C. 103 as being unpatentable over Martin, Marvin, and Chatterjee, and further in view of Cederlund et al. (US 10025381, patented 17 July 2018, hereafter Cederlund).

As per dependent claim 21, Martin, Marvin, and Chatterjee disclose the limitations similar to those in claim 1, and the same rejection is incorporated herein. Martin fails to specifically disclose determining that the engagement score satisfies the engagement criterion based on the eye tracking data indicating that a gaze of a user is directed to the first object while the finger-wearable device performs the first gesture.
However, Cederlund, which is analogous to the claimed invention because it is directed toward identifying an object that is the target of the user gaze and gesture, discloses determining that the engagement score satisfies the engagement criterion based on the eye tracking data indicating that a gaze of a user is directed to the first object while the finger-wearable device performs the first gesture (Claim 1: Here, user input from gaze tracking and gesture controls is used to identify a target object). It would have been obvious to one of ordinary skill in the art at the time of the applicant's effective filing date to have combined Cederlund with Martin-Marvin, with a reasonable expectation of success, as it would have allowed for executing a user manipulation of an object based upon both gaze and gesture controls (Cederlund: Abstract). This would have provided improved selection of target objects by incorporating both gaze and gesture parameters in order to more accurately identify the target object.

Response to Arguments

Applicant's arguments filed 24 November 2025 have been fully considered but they are not persuasive. The applicant argues that the prior art fails to disclose "determining that the engagement score indicates a respective location of the finger-wearable device satisfies a proximity threshold distance with respect to the first object; and in accordance with a determination that the engagement score satisfies the engagement criterion, displaying, on the display, an indication indicating that the first gesture was directed to the first object to select the first object for manipulation" (page 10). The examiner respectfully disagrees.

Martin discloses determining that the engagement score indicates a respective location of the finger-wearable device satisfies a proximity threshold distance with respect to the first object (paragraph 0110). Specifically, a pointing controller is used to define a pointing vector in three-dimensional space. This pointing vector comprises a line from the pointing controller to a cursor, and a virtual object (first object) is selected when the pointing vector and the virtual object intersect. This intersection between the pointing vector and the coordinates occupied by the virtual object defines the proximity threshold. Once the intersection is detected, the object may be selected and placed into a "selected state" for manipulation. In this instance, the first gesture is a gesture to align the pointing vector and the virtual object; responsive to this first gesture, the object is placed into a selected state for additional manipulation.

Additionally, the examiner acknowledges that Martin fails to disclose the engagement score based at least in part on eye tracking data from an eye tracker. However, Chatterjee, which is analogous to the claimed invention because it is directed toward combining gaze and gesture inputs, discloses determining that the engagement score satisfies at least one engagement criterion during performance of the first gesture (pages 133-134). During a target acquisition phase, it is determined whether the engagement criterion (accuracy above a threshold value) is met to allow gesture actions to be performed on the selected target items (Section 4.1). For these reasons, this argument is not persuasive.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: Josephson et al. (US 2019/0370545) discloses a selectable object active area predicted with a threshold degree of certainty based upon a distance proximity (paragraph 0023).

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to KYLE R STORK, whose telephone number is (571) 272-4130. The examiner can normally be reached 8am - 2pm; 4pm - 6pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Omar Fernandez Rivas, can be reached at (571) 272-2589. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/KYLE R STORK/
Primary Examiner, Art Unit 2128
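Taken together, the rejection's theory of claim 1 is that Martin's pointing-vector intersection supplies the proximity test, Marvin supplies eye tracking as an input to the engagement score, and Chatterjee supplies the requirement that the criterion be satisfied during performance of the gesture (with the indication otherwise forgone, per the claim 18 discussion). A minimal sketch of that combined gating logic; the weighting scheme and threshold are purely illustrative assumptions, not taken from any of the references.

```python
# Hedged sketch of the gating logic the rejection attributes to the
# Martin-Marvin-Chatterjee combination for claim 1. The blend weights and
# threshold below are illustrative assumptions only.

GAZE_WEIGHT, PROXIMITY_WEIGHT = 0.6, 0.4   # assumed blend of the two signals
ENGAGEMENT_CRITERION = 0.7                 # assumed threshold

def engagement_score(gaze_on_object: float, pointer_proximity: float) -> float:
    """Blend normalized gaze dwell (0-1, eye tracking per Marvin) with
    pointer proximity (0-1, pointing vector per Martin)."""
    return GAZE_WEIGHT * gaze_on_object + PROXIMITY_WEIGHT * pointer_proximity

def should_display_indication(gesture_in_progress: bool,
                              gaze_on_object: float,
                              pointer_proximity: float) -> bool:
    # Per the cited Chatterjee model, the criterion must be satisfied while
    # the gesture is being performed; otherwise the indication is forgone.
    score = engagement_score(gaze_on_object, pointer_proximity)
    return gesture_in_progress and score >= ENGAGEMENT_CRITERION
```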

Prosecution Timeline

Feb 27, 2023: Application Filed
Feb 13, 2024: Response after Non-Final Action
Aug 23, 2024: Non-Final Rejection (§103)
Nov 21, 2024: Interview Requested
Dec 09, 2024: Examiner Interview Summary
Dec 09, 2024: Applicant Interview (Telephonic)
Dec 17, 2024: Response Filed
Feb 28, 2025: Final Rejection (§103)
May 08, 2025: Applicant Interview (Telephonic)
May 09, 2025: Examiner Interview Summary
Jun 05, 2025: Request for Continued Examination
Jun 10, 2025: Response after Non-Final Action
Jun 23, 2025: Non-Final Rejection (§103)
Nov 19, 2025: Interview Requested
Nov 24, 2025: Response Filed
Dec 09, 2025: Applicant Interview (Telephonic)
Jan 16, 2026: Examiner Interview Summary
Feb 25, 2026: Final Rejection (§103) (current)

Precedent Cases

Applications granted by this same examiner with similar technology, based on the examiner's 5 most recent grants:

Patent 12585935: EXECUTION BEHAVIOR ANALYSIS TEXT-BASED ENSEMBLE MALWARE DETECTOR (granted Mar 24, 2026; 2y 5m to grant)
Patent 12585937: SYSTEMS AND METHODS FOR DEEP LEARNING ENHANCED GARBAGE COLLECTION (granted Mar 24, 2026; 2y 5m to grant)
Patent 12585869: RECOMMENDATION PLATFORM FOR SKILL DEVELOPMENT (granted Mar 24, 2026; 2y 5m to grant)
Patent 12579454: PROVIDING EXPLAINABLE MACHINE LEARNING MODEL RESULTS USING DISTRIBUTED LEDGERS (granted Mar 17, 2026; 2y 5m to grant)
Patent 12579412: SPIKE NEURAL NETWORK CIRCUIT INCLUDING SELF-CORRECTING CONTROL CIRCUIT AND METHOD OF OPERATION THEREOF (granted Mar 17, 2026; 2y 5m to grant)

Study what changed in these applications to get past this examiner.

Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 64%
With Interview: 92% (+28.3%)
Median Time to Grant: 4y 0m
PTA Risk: High

Based on 865 resolved cases by this examiner. Grant probability derived from career allow rate.
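On how the headline numbers fit together: the 64% figure is the raw career allow rate (554 of 865 resolved cases), and the 92% with-interview projection is consistent with simply adding the +28.3 point interview lift, though the page does not state its exact model. A quick consistency check under that assumption:

```python
# Quick consistency check of the dashboard's headline projections, assuming
# (the page does not say) that the interview lift is additive in percentage
# points on top of the career allow rate.

granted, resolved = 554, 865
career_allow_rate = 100 * granted / resolved
print(f"Career allow rate: {career_allow_rate:.1f}%")      # 64.0%

interview_lift = 28.3                                      # percentage points
with_interview = career_allow_rate + interview_lift
print(f"Projected with interview: {with_interview:.0f}%")  # 92%
```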
