Prosecution Insights
Last updated: April 19, 2026
Application No. 19/220,044

KEY FUNCTION EXECUTION METHOD AND APPARATUS, DEVICE, AND STORAGE MEDIUM

Non-Final OA §DP
Filed: May 27, 2025
Examiner: ABEBE, SOSINA
Art Unit: 2626
Tech Center: 2600 (Communications)
Assignee: Tencent Technology (Shenzhen) Company Limited
OA Round: 1 (Non-Final)
Grant Probability: 73% (Favorable)
Expected OA Rounds: 1-2
Estimated Time to Grant: 3y 0m
Grant Probability With Interview: 91%

Examiner Intelligence

Career Allow Rate: 73% (332 granted / 457 resolved; +10.6% vs Tech Center average)
Interview Lift: +18.5% allowance lift in resolved cases with an interview
Typical Timeline: 3y 0m average prosecution; 16 applications currently pending
Career History: 473 total applications across all art units
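As a quick consistency check, the headline figures above follow directly from the raw counts on this page (a Python sketch; the rounding behavior is an assumption about how the dashboard displays values):

```python
# All inputs are figures quoted on this page.
granted, resolved = 332, 457           # career outcomes for this examiner
allow_rate = granted / resolved        # 0.7265... -> displayed as 73%
tc_delta = 10.6                        # percentage points above the TC average
tc_avg = allow_rate * 100 - tc_delta   # implied Tech Center average allow rate

print(round(allow_rate * 100))         # 73
print(round(tc_avg, 1))                # ~62.0
```

So the "+10.6% vs TC avg" badge implies a Tech Center average allow rate of roughly 62%.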

Statute-Specific Performance

§101: 2.0% (-38.0% vs TC avg)
§103: 59.6% (+19.6% vs TC avg)
§102: 25.4% (-14.6% vs TC avg)
§112: 5.8% (-34.2% vs TC avg)
Tech Center averages are estimates. Based on career data from 457 resolved cases.
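The four deltas above are mutually consistent with a single flat Tech Center baseline of about 40% per statute. This is an inference from the numbers on this page, not a documented fact about how the estimate was built; a sketch:

```python
# Figures quoted above: examiner's statute-specific rates and "vs TC avg" deltas.
examiner = {"101": 2.0, "103": 59.6, "102": 25.4, "112": 5.8}
delta = {"101": -38.0, "103": 19.6, "102": -14.6, "112": -34.2}

# examiner% - delta recovers the baseline each delta was measured against.
implied_baseline = {k: round(examiner[k] - delta[k], 1) for k in examiner}
print(implied_baseline)  # every statute maps to 40.0
```

A uniform 40.0% baseline for all four statutes suggests the "Tech Center average estimate" is a single flat reference line rather than a per-statute average.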

Office Action

§DP
DETAILED ACTION

This is a first office action in response to Application No. 19/220,044, originally filed on 05/27/2025, in which claims 1-20 are presented for examination.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159.
See MPEP §§ 706.02(l)(1)-706.02(l)(3) for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b). The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.

Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of U.S. Patent No. 12,340,083. Although the claims at issue are not identical, the claims are not patentably distinct from each other because the patent and the current application are claiming common subject matter, as shown in the following comparison of current Application No. 19/220,044 with Patent No. 12,340,083:

Application claim 1: A key function execution method, applied to a key function execution system, the method comprising: displaying a virtual keyboard, the virtual keyboard comprising a first region and a second region, the first region and the second region respectively comprising at least two keys, the first region corresponding to a left hand, and the second region corresponding to a right hand; displaying, in response to detecting that a hand is in a target gesture, a cursor at a first position according to a biometric feature of the target gesture, the biometric feature indicating whether the hand is the left hand or the right hand and at least one finger of the hand forming the target gesture, the first position being in a region corresponding to the biometric feature in the virtual keyboard, the first position being a position corresponding to the biometric feature; displaying, in response to the hand making a movement while keeping the target gesture, that the cursor moves with the movement of the hand; and executing, in response to a change in a gesture of the hand, a function of a key corresponding to a second position of the virtual keyboard, the second position being a position where the cursor is located when the gesture of the hand changes.
Patent claim 1: A key function execution method, applied to a key function execution system, the method comprising: displaying a virtual keyboard, the virtual keyboard comprising a first region and a second region, the first region and the second region each comprising at least two keys, the first region and the second region each comprising a plurality of subregions, the first region corresponding to a left hand, and the second region corresponding to a right hand, each of the plurality of subregions corresponding to a respective non-thumb finger; displaying, in response to detecting that a hand is in a target gesture, a cursor at a first position of the virtual keyboard according to a biometric feature of the target gesture, the biometric feature indicating the target gesture is formed by a non-thumb finger of the hand and a thumb of the hand, the first position being in a subregion of a region, the region corresponding to the hand, the subregion corresponding to the non-thumb finger; displaying, in response to the hand making a movement while keeping the target gesture, that the cursor moves with the movement of the hand; and executing, in response to a change in a gesture of the hand, a function of a key corresponding to a second position of the virtual keyboard, the second position being a position where the cursor is located when the gesture of the hand changes.

Application claim 2: The method according to claim 1, wherein the displaying a cursor at a first position according to a biometric feature of the target gesture, the first position being located in a region corresponding to the biometric feature in the virtual keyboard comprises: displaying, in response to the biometric feature indicating that the hand is the left hand, the cursor at a target position in the first region in the virtual keyboard; and displaying, in response to the biometric feature indicating that the hand is the right hand, the cursor at a target position in the second region in the virtual keyboard.
Patent claim 2: The method according to claim 1, wherein the displaying the cursor at the first position of the virtual keyboard according to the biometric feature of the target gesture, the first position being located in the subregion of the region corresponding to the biometric feature comprises: displaying, in response to the biometric feature indicating that the hand is the left hand, the cursor at a target position in the first region in the virtual keyboard; or displaying, in response to the biometric feature indicating that the hand is the right hand, the cursor at a target position in the second region in the virtual keyboard.

Application claim 3: The method according to claim 1, wherein the displaying a cursor at a first position according to a biometric feature of the target gesture comprises: displaying, in response to the biometric feature indicating that the hand is the left hand and the target gesture is formed by a first finger and a thumb of the hand, the cursor at a target position in a subregion corresponding to the first finger in the first region; and displaying, in response to the biometric feature indicating that the hand is the right hand and the target gesture is formed by a second finger and a thumb of the hand, the cursor at a target position in a subregion corresponding to the second finger in the second region.
Patent claim 3: The method according to claim 1, wherein the displaying the cursor at the first position of the virtual keyboard according to the biometric feature of the target gesture comprises: displaying, in response to the biometric feature indicating that the hand is the left hand and the target gesture is formed by a first non-thumb finger and the thumb of the hand, the cursor at a target position in a first subregion corresponding to the first non-thumb finger in the first region; or displaying, in response to the biometric feature indicating that the hand is the right hand and the target gesture is formed by a second non-thumb finger and the thumb of the hand, the cursor at a target position in a second subregion corresponding to the second non-thumb finger in the second region.

Application claim 4: The method according to claim 2, wherein the target position is a central position of a region.

Patent claim 4: The method according to claim 3, wherein the target position is a central position of the first subregion or the second subregion.

Application claim 5: The method according to claim 1, wherein the displaying, in response to the hand making a movement while keeping the target gesture, that the cursor moves with the movement of the hand comprises: acquiring, in response to the hand making the movement while keeping the target gesture, displacement of the hand; determining a third position according to the first position and the displacement of the hand; and displaying that the cursor moves from the first position to the third position.
Patent claim 5: The method according to claim 1, wherein the displaying, in response to the hand making a movement while keeping the target gesture, that the cursor moves with the movement of the hand comprises: acquiring, in response to the hand making the movement while keeping the target gesture, displacement of the hand; determining a third position according to the first position and the displacement of the hand; and displaying that the cursor moves from the first position to the third position.

Application claim 6: The method according to claim 1, wherein the executing a function of a key corresponding to a second position of the virtual keyboard comprises any one of the following: inputting, in response to the key corresponding to the second position being a character key, a character represented by the key corresponding to the second position into an input box or a character display region of the virtual keyboard; deleting, in response to the key corresponding to the second position being a delete key, a last character in the input box or character display region of the virtual keyboard; displaying, in response to the key corresponding to the second position being a new line key, that an input cursor in the input box of the virtual keyboard changes to next line; and taking, in response to the key corresponding to the second position being a confirm key, content inputted in the input box of the virtual keyboard as target content, and canceling the display of the virtual keyboard.
Patent claim 6: The method according to claim 1, wherein the executing a function of a key corresponding to a second position of the virtual keyboard comprises any one of the following: inputting, in response to the key corresponding to the second position being a character key, a character represented by the key corresponding to the second position into an input box or a character display region of the virtual keyboard; deleting, in response to the key corresponding to the second position being a delete key, a last character in the input box or character display region of the virtual keyboard; displaying, in response to the key corresponding to the second position being a new line key, that an input cursor in the input box of the virtual keyboard changes to next line; and taking, in response to the key corresponding to the second position being a confirm key, content inputted in the input box of the virtual keyboard as target content, and canceling the display of the virtual keyboard.

Application claim 7: The method according to claim 1, wherein the method further comprises: displaying, in response to a character combination inputted in the character display region of the virtual keyboard has a corresponding candidate non-English character, the candidate non-English character within a target range of the character display region.

Patent claim 7: The method according to claim 1, wherein the method further comprises: displaying, in response to a character combination inputted in the character display region of the virtual keyboard has a corresponding candidate non-English character, the candidate non-English character within a target range of the character display region.
Application claim 8: The method according to claim 1, wherein the virtual keyboard further comprises at least one virtual control; and the method further comprises: displaying, in response to detecting that the hand is not in the target gesture and the hand moves, that the cursor moves with the movement of the hand; and executing, in response to detecting that the hand makes a target action and the cursor is located on a target virtual control in the at least one virtual control, a function corresponding to the target virtual control.

Patent claim 8: The method according to claim 1, wherein the virtual keyboard further comprises at least one virtual control; and the method further comprises: displaying, in response to detecting that the hand is not in the target gesture and the hand moves, that the cursor moves with the movement of the hand; and executing, in response to detecting that the hand makes a target action and the cursor is located on a target virtual control in the at least one virtual control, a function corresponding to the target virtual control.

Application claim 9: The method according to claim 8, wherein the executing a function corresponding to the target virtual control comprises: switching, in response to the target virtual control being a switch control for uppercase input and lowercase input, a character displayed in the virtual keyboard between an uppercase input character and a lowercase input character; switching, in response to the target virtual control being a switch control for symbol input and letter input, the character displayed in the virtual keyboard between a letter and a symbol; and switching, in response to the target virtual control being a switch control for Chinese input and English input, a character input mode of the virtual keyboard between Chinese input and English input.
Patent claim 9: The method according to claim 8, wherein the executing a function corresponding to the target virtual control comprises: switching, in response to the target virtual control being a switch control for uppercase input and lowercase input, a character displayed in the virtual keyboard between an uppercase input character and a lowercase input character; switching, in response to the target virtual control being a switch control for symbol input and letter input, the character displayed in the virtual keyboard between a letter and a symbol; and switching, in response to the target virtual control being a switch control for Chinese input and English input, a character input mode of the virtual keyboard between Chinese input and English input.

Application claim 10: The method according to claim 9, wherein the executing a function corresponding to the target virtual control further comprises: updating display content of the target virtual control, the updated display content being consistent with the switching of the character in the virtual keyboard or the switching of the character input mode.

Patent claim 10: The method according to claim 9, wherein the executing a function corresponding to the target virtual control further comprises: updating display content of the target virtual control, the updated display content being consistent with the switching of the character in the virtual keyboard or the switching of the character input mode.
Application claim 11: A key function execution system, the system comprising an electronic device, a gesture tracking sensor, and a display device; the gesture tracking sensor and the display device being connected to the electronic device respectively; wherein: the display device is configured to display a virtual keyboard, the virtual keyboard comprising a first region and a second region, the first region and the second region respectively comprising at least two keys, the first region corresponding to a left hand, and the second region corresponding to a right hand; the gesture tracking sensor is configured to detect that a hand is in a target gesture; the display device is further configured to display, in response to detecting that the hand is in the target gesture, a cursor at a first position according to a biometric feature of the target gesture, the biometric feature indicating whether the hand is the left hand or the right hand and at least one finger of the hand forming the target gesture, the first position being located in a region corresponding to the biometric feature in the virtual keyboard, the first position being a position corresponding to the biometric feature; the gesture tracking sensor is further configured to detect that the hand makes a movement while keeping the target gesture; the display device is further configured to display, in response to the hand making a movement while keeping the target gesture, that the cursor moves with the movement of the hand; and the electronic device is configured to execute, in response to a change in a gesture of the hand, a function of a key corresponding to a second position of the virtual keyboard, the second position being a position where the cursor is located when the gesture of the hand changes.
Patent claim 11: A key function execution system, the system comprising an electronic device, a gesture tracking sensor, and a display device; the gesture tracking sensor and the display device being connected to the electronic device respectively; wherein: the display device is configured to display a virtual keyboard, the virtual keyboard comprising a first region and a second region, the first region and the second region each comprising at least two keys, the first region and the second region each comprising a plurality of subregions, the first region corresponding to a left hand, and the second region corresponding to a right hand, each of the plurality of subregions corresponding to a respective non-thumb finger; the gesture tracking sensor is configured to detect that a hand is in a target gesture; the display device is further configured to display, in response to detecting that the hand is in the target gesture, a cursor at a first position of the virtual keyboard according to a biometric feature of the target gesture, the biometric feature indicating the target gesture is formed by a non-thumb finger of the hand and a thumb of the hand, the first position being located in a subregion of a region, the region corresponding to the hand, the subregion corresponding to the non-thumb finger; the gesture tracking sensor is further configured to detect that the hand makes a movement while keeping the target gesture; the display device is further configured to display, in response to the hand making a movement while keeping the target gesture, that the cursor moves with the movement of the hand; and the electronic device is configured to execute, in response to a change in a gesture of the hand, a function of a key corresponding to a second position of the virtual keyboard, the second position being a position where the cursor is located when the gesture of the hand changes.
Application claim 12: The system according to claim 11, wherein the display device is further configured to: display, in response to the biometric feature indicating that the hand is the left hand, the cursor at a target position in the first region in the virtual keyboard; and display, in response to the biometric feature indicating that the hand is the right hand, the cursor at a target position in the second region in the virtual keyboard.

Patent claim 12: The system according to claim 11, wherein the display device is further configured to: display, in response to the biometric feature indicating that the hand is the left hand, the cursor at a target position in the first region in the virtual keyboard; or display, in response to the biometric feature indicating that the hand is the right hand, the cursor at a target position in the second region in the virtual keyboard.

Application claim 13: The system according to claim 11, wherein the display device is further configured to: display, in response to the biometric feature indicating that the hand is the left hand and the target gesture is formed by a first finger and a thumb of the hand, the cursor at a target position in a subregion corresponding to the first finger in the first region; and display, in response to the biometric feature indicating that the hand is the right hand and the target gesture is formed by a second finger and a thumb of the hand, the cursor at a target position in a subregion corresponding to the second finger in the second region.
Patent claim 13: The system according to claim 11, wherein the display device is further configured to: display, in response to the biometric feature indicating that the hand is the left hand and the target gesture is formed by a first non-thumb finger and the thumb of the hand, the cursor at a target position in a first subregion corresponding to the first non-thumb finger in the first region; or display, in response to the biometric feature indicating that the hand is the right hand and the target gesture is formed by a second non-thumb finger and the thumb of the hand, the cursor at a target position in a second subregion corresponding to the second non-thumb finger in the second region.

Application claim 14: The system according to claim 12, wherein the target position is a central position of a region.

Patent claim 14: The system according to claim 13, wherein the target position is a central position of the first subregion or the second subregion.

Application claim 15: The system according to claim 11, wherein: the gesture tracking sensor is configured to acquire, in response to the hand making the movement while keeping the target gesture, displacement of the hand; the electronic device is further configured to determine a third position according to the first position and the displacement of the hand; and the display device is further configured to display that the cursor moves from the first position to the third position.

Patent claim 15: The system according to claim 11, wherein: the gesture tracking sensor is configured to acquire, in response to the hand making the movement while keeping the target gesture, displacement of the hand; the electronic device is further configured to determine a third position according to the first position and the displacement of the hand; and the display device is further configured to display that the cursor moves from the first position to the third position.
Application claim 16: The system according to claim 11, wherein the electronic device is further configured to perform any one of the following: inputting, in response to the key corresponding to the second position being a character key, a character represented by the key corresponding to the second position into an input box or a character display region of the virtual keyboard; deleting, in response to the key corresponding to the second position being a delete key, a last character in the input box or character display region of the virtual keyboard; displaying, in response to the key corresponding to the second position being a new line key, that an input cursor in the input box of the virtual keyboard changes to next line; and taking, in response to the key corresponding to the second position being a confirm key, content inputted in the input box of the virtual keyboard as target content, and canceling the display of the virtual keyboard.

Patent claim 16: The system according to claim 11, wherein the electronic device is further configured to perform any one of the following: inputting, in response to the key corresponding to the second position being a character key, a character represented by the key corresponding to the second position into an input box or a character display region of the virtual keyboard; deleting, in response to the key corresponding to the second position being a delete key, a last character in the input box or character display region of the virtual keyboard; displaying, in response to the key corresponding to the second position being a new line key, that an input cursor in the input box of the virtual keyboard changes to next line; and taking, in response to the key corresponding to the second position being a confirm key, content inputted in the input box of the virtual keyboard as target content, and canceling the display of the virtual keyboard.
Application claim 17: The system according to claim 11, wherein the display device is further configured to: display, in response to a character combination inputted in the character display region of the virtual keyboard has a corresponding candidate non-English character, the candidate non-English character within a target range of the character display region.

Patent claim 17: The system according to claim 11, wherein the display device is further configured to: display, in response to a character combination inputted in the character display region of the virtual keyboard has a corresponding candidate non-English character, the candidate non-English character within a target range of the character display region.

Application claim 18: The system according to claim 11, wherein: the virtual keyboard further comprises at least one virtual control; the display device is further configured to display, in response to detecting that the hand is not in the target gesture and the hand moves, that the cursor moves with the movement of the hand; and the electronic device is further configured to execute, in response to detecting that the hand makes a target action and the cursor is located on a target virtual control in the at least one virtual control, a function corresponding to the target virtual control.

Patent claim 18: The system according to claim 11, wherein: the virtual keyboard further comprises at least one virtual control; the display device is further configured to display, in response to detecting that the hand is not in the target gesture and the hand moves, that the cursor moves with the movement of the hand; and the electronic device is further configured to execute, in response to detecting that the hand makes a target action and the cursor is located on a target virtual control in the at least one virtual control, a function corresponding to the target virtual control.
Application claim 19: The system according to claim 11, wherein the display device is a virtual reality (VR) display device or a screen display.

Patent claim 19: The system according to claim 11, wherein the display device is a virtual reality (VR) display device or a screen display.

Application claim 20: A non-transitory computer-readable storage medium, the storage medium storing at least one computer program, the at least one computer program being loaded and executed by a processor to implement: displaying a virtual keyboard, the virtual keyboard comprising a first region and a second region, the first region and the second region respectively comprising at least two keys, the first region corresponding to a left hand, and the second region corresponding to a right hand; displaying, in response to detecting that a hand is in a target gesture, a cursor at a first position according to a biometric feature of the target gesture, the biometric feature indicating whether the hand is the left hand or the right hand and at least one finger of the hand forming the target gesture, the first position being in a region corresponding to the biometric feature in the virtual keyboard, the first position being a position corresponding to the biometric feature; displaying, in response to the hand making a movement while keeping the target gesture, that the cursor moves with the movement of the hand; and executing, in response to a change in a gesture of the hand, a function of a key corresponding to a second position of the virtual keyboard, the second position being a position where the cursor is located when the gesture of the hand changes.
Patent claim 20: A non-transitory computer-readable storage medium, the storage medium storing at least one computer program, the at least one computer program being loaded and executed by a processor to implement: displaying a virtual keyboard, the virtual keyboard comprising a first region and a second region, the first region and the second region each comprising at least two keys, the first region and the second region each comprising a plurality of subregions, the first region corresponding to a left hand, and the second region corresponding to a right hand, each of the plurality of subregions corresponding to a respective non-thumb finger; displaying, in response to detecting that a hand is in a target gesture, a cursor at a first position of the virtual keyboard according to a biometric feature of the target gesture, the biometric feature indicating the target gesture is formed by a non-thumb finger of the hand and a thumb of the hand, the first position being in a subregion of a region, the region corresponding to the hand, the subregion corresponding to the non-thumb finger; displaying, in response to the hand making a movement while keeping the target gesture, that the cursor moves with the movement of the hand; and executing, in response to a change in a gesture of the hand, a function of a key corresponding to a second position of the virtual keyboard, the second position being a position where the cursor is located when the gesture of the hand changes.

Contact Information

8. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Sosina Abebe, whose telephone number is (571) 270-7929. The examiner can normally be reached Monday through Friday, 9:00-5:30. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Temesghen Ghebretinsae, can be reached at (571) 272-3017. The fax phone number for the organization where this application or proceeding is assigned is 703-872-9306.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

/S.A/
Examiner, Art Unit 2626

/TEMESGHEN GHEBRETINSAE/
Supervisory Patent Examiner, Art Unit 2626
1/27/26
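For readers skimming the dense claim language above, the method recited in application claim 1 can be summarized as an event loop: a target gesture places a cursor in the keyboard region matching the hand that formed it, hand movement moves the cursor, and a change of gesture executes the key under the cursor. The sketch below is purely illustrative; every name, the two-region layout, and the one-dimensional cursor are hypothetical simplifications, not Tencent's implementation.

```python
# Toy two-region virtual keyboard: first region = left hand, second = right hand.
REGION = {"left": ["q", "w", "e"], "right": ["u", "i", "o"]}

class KeyboardSession:
    def __init__(self):
        self.cursor = None   # (hand, key index) while the target gesture is held
        self.output = []     # key functions executed so far

    def gesture_detected(self, hand):
        # Cursor appears in the region matching the biometric feature
        # (which hand formed the gesture), starting at that region's center.
        self.cursor = (hand, len(REGION[hand]) // 2)

    def hand_moved(self, steps):
        # While the gesture is held, the cursor tracks hand movement (clamped).
        hand, idx = self.cursor
        self.cursor = (hand, max(0, min(len(REGION[hand]) - 1, idx + steps)))

    def gesture_released(self):
        # On a change of gesture, execute the key at the cursor's position.
        hand, idx = self.cursor
        self.output.append(REGION[hand][idx])
        self.cursor = None

s = KeyboardSession()
s.gesture_detected("right")  # cursor at center of the right-hand region ("i")
s.hand_moved(1)              # cursor follows the hand to "o"
s.gesture_released()         # key function executes
print(s.output)              # ['o']
```

The patent's claim 1 narrows this flow: the gesture must be a non-thumb finger pinching the thumb, and the cursor lands in the subregion assigned to that specific finger rather than anywhere in the hand's region.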

Prosecution Timeline

May 27, 2025: Application Filed
Jan 24, 2026: Non-Final Rejection (§DP)
Feb 11, 2026: Interview Requested
Feb 24, 2026: Applicant Interview (Telephonic)
Mar 04, 2026: Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12578821: TOUCH SENSING DEVICE HAVING MALFUNCTION PREVENTION FUNCTION (2y 5m to grant; granted Mar 17, 2026)
Patent 12578815: TOUCH-CONTROL DISPLAY PANEL AND DISPLAY APPARATUS (2y 5m to grant; granted Mar 17, 2026)
Patent 12572209: TACTILE-FEEDBACK MODULE AND DRIVING METHOD THEREOF, AND TACTILE-FEEDBACK DEVICE (2y 5m to grant; granted Mar 10, 2026)
Patent 12566515: ARCHITECTURE FOR DIFFERENTIAL DRIVE AND SENSE TOUCH TECHNOLOGY (2y 5m to grant; granted Mar 03, 2026)
Patent 12554356: TOUCH DEVICE, TOUCH SYSTEM INCLUDING THE SAME, AND DRIVING METHOD OF THE TOUCH DEVICE (2y 5m to grant; granted Feb 17, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 73%
With Interview: 91% (+18.5%)
Median Time to Grant: 3y 0m
PTA Risk: Low
Based on 457 resolved cases by this examiner. Grant probability derived from career allow rate.
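The "91% with interview" figure appears to be the unrounded career allow rate plus the reported interview lift. This derivation is an inference from the numbers on this page, not a documented formula:

```python
# Inputs quoted on this page: career counts and the interview lift.
base = 100 * 332 / 457   # unrounded career allow rate, about 72.6%
lift = 18.5              # percentage-point interview lift for this examiner

with_interview = base + lift  # about 91.1, displayed as 91%
print(round(with_interview))  # 91
```

Note the displayed 73% base plus 18.5 would round to 92, so the page evidently adds the lift before rounding.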
