Prosecution Insights
Last updated: April 19, 2026
Application No. 18/634,253

METHODS AND SYSTEMS FOR TOUCHSCREEN DEVICE INTERACTIONS USING A VIRTUAL JOYSTICK

Non-Final OA (§102, §103)
Filed: Apr 12, 2024
Examiner: SILVERMAN, SETH ADAM
Art Unit: 2172
Tech Center: 2100 — Computer Architecture & Software
Assignee: Huawei Technologies Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 73% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 4m
With Interview: 88%

Examiner Intelligence

Career Allow Rate: 73% (above average; 327 granted / 449 resolved; +17.8% vs TC avg)
Interview Lift: +14.8% on resolved cases with interview (moderate, ~+15%)
Typical Timeline: 2y 4m avg prosecution; 47 applications currently pending
Career History: 496 total applications across all art units

Statute-Specific Performance

§101: 8.9% (-31.1% vs TC avg)
§103: 58.5% (+18.5% vs TC avg)
§102: 20.1% (-19.9% vs TC avg)
§112: 9.4% (-30.6% vs TC avg)
Tech Center averages are estimates • Based on career data from 449 resolved cases

Office Action

Rejections: §102, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statements (IDS) submitted on 4/12/2024 and 9/17/2025 were filed before the first Office action. The submissions are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.

Claim Rejection Notes

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-4, 6-15, and 17-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Westerman et al. (US 20170344213 A1, published 11/30/2017).
Claim 1: Westerman teaches a method at an electronic device, the method comprising: receiving a touch input detected at or near a display of the electronic device (as long as the two or more fingers remain touching the sensor panel, the fingers can be moved around to effect position control on the object or cursor [Westerman, 0006]); generating touch contact information including a centroid and a contact shape of the touch input (the tip of velocity vector can be coincident with the calculated centroid of the patch generated by the finger [Westerman, 0009]. One or more additional fingers are touched down on the sensor panel while the one or more already-touching fingers remain in contact with the sensor panel [Westerman, 0067]); determining, by a model, a speed and a direction associated with an interactive element on the display, based on the touch contact information (the speed and direction of the scrolling or dragging can be initially established by the speed and direction of the touching fingers at the time the motion continuation mode was invoked [Westerman, 0008]); and controlling the interactive element on the display, based on the speed and the direction (as long as the two or more fingers remain touching the sensor panel, the fingers can be moved around to effect position control on the object or cursor [Westerman, 0006]). Claims 12 and 20, sharing similar elements as claim 1, are likewise rejected.

Claim 2: Westerman teaches the method of claim 1. Westerman further teaches wherein the touch input corresponds to contact with the display (when using a touch screen, a user typically makes a selection on the display screen by pointing directly to objects (such as graphical user interface (GUI) objects) displayed on the screen (usually with a stylus or finger) [Westerman, 0004]). Claim 13, sharing similar elements as claim 2, is likewise rejected.

Claim 3: Westerman teaches the method of claim 1.
Westerman further teaches wherein the touch input corresponds to input by a finger of a user in contact with the display (when using a touch screen, a user typically makes a selection on the display screen by pointing directly to objects (such as graphical user interface (GUI) objects) displayed on the screen (usually with a stylus or finger) [Westerman, 0004]). Claim 14, sharing similar elements as claim 3, is likewise rejected.

Claim 4: Westerman teaches the method of claim 1. Westerman further teaches further comprising: prior to receiving the touch input: activating a virtual joystick on the display of the electronic device, wherein a user interface (UI) controller of the electronic device is configured to control the interactive element in response to a user interaction with the virtual joystick (when the motion continuation mode is invoked, a virtual control ring or joystick can be generated to provide enhanced motion continuation capabilities. The virtual control ring can be used as a joystick to navigate within a document, photo, web page, e-mail list, address book, calendar, game, and the like, especially on small touchscreens [Westerman, 0009]). Claim 15, sharing similar elements as claim 4, is likewise rejected.

Claim 6: Westerman teaches the method of claim 1. Westerman further teaches wherein generating the touch contact information comprises: receiving a raw capacitive signal associated with the touch input; generating a high-resolution contact image based on the raw capacitive signal; and determining the contact shape and the centroid based on the high-resolution contact image (the touch sensing device can be based on sensing technologies including but not limited to capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, optical sensing, and/or the like. Furthermore, the touch sensing means can be based on single point sensing or multipoint sensing [Westerman, 0039]).
Claim 17, sharing similar elements as claim 6, is likewise rejected.

Claim 7: Westerman teaches the method of claim 1. Westerman further teaches wherein the touch input is a first touch input corresponding to an anchor position on the display, the method further comprising: receiving a sequence of further touch inputs; generating further touch contact information including at least one further contact shape or at least one further centroid, based on the sequence of further touch inputs; comparing at least one of the centroid of the first touch input with the at least one further centroid or the contact shape of the first touch input with the at least one further contact shape, to determine a displacement from the anchor position, based on the comparison; determining, by the model, an updated speed and an updated direction associated with the interactive element on the display, based on the displacement; and controlling the interactive element on the display, based on the updated speed and the updated direction (In general, the farther a centroid of touch generated by the touching finger is located from the null, the greater the velocity (up to some maximum velocity), and the closer to the null, the slower the velocity [Westerman, 0073]. Note that point 910 of velocity vector 906 can be coincident with the calculated centroid of patch 912 generated by finger 900. Virtual control ring 904 can follow finger 900 whether it becomes stationary after the motion continuation mode is invoked or continues to move in direction 902, or in other embodiments can remain stationary whether finger 900 continues to move or remains stationary [Westerman, 0074]. As long as the finger centroid remains forward of the null and the vector points forward, forward motion continues. If finger 900 continues to move backward to a position shown in FIG. 9c (i.e. directly over null 908), velocity vector 906 can disappear, indicating no forward velocity.
In other words, if the finger centroid is pulled back to the null, the vector can shrink down to zero, and motion can stop [Westerman, 0075]). Claim 18, sharing similar elements as claim 7, is likewise rejected.

Claim 8: Westerman teaches the method of claim 1. Westerman further teaches wherein controlling the interactive element on the display comprises: providing the speed and the direction to a user interface (UI) controller of the electronic device for controlling the interactive element on the display (the velocity vector within the virtual control ring can act as a joystick, with the velocity and direction of the motion continuation being controllable by a finger as it moves within the control ring [Westerman, 0076]). Claim 19, sharing similar elements as claim 8, is likewise rejected.

Claim 9: Westerman teaches the method of claim 7. Westerman further teaches wherein the interactive element is a cursor (the fingers can be placed down anywhere on a sensor panel, and a cursor can appear near or under the fingers [Westerman, 0006]).

Claim 10: Westerman teaches the method of claim 7. Westerman further teaches wherein the interactive element is a menu (viewing a menu [Westerman, 0033]).

Claim 11: Westerman teaches the method of claim 1. Westerman further teaches wherein the display is a capacitive touch-sensitive display (the touch sensing device can be based on sensing technologies including but not limited to capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, optical sensing, and/or the like [Westerman, 0039]).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 5 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Westerman et al. (US 20170344213 A1, published 11/30/2017), in view of Marsden (US 20210405870 A1, published 12/30/2021).

Claim 5: Westerman teaches the method of claim 4. Westerman does not teach wherein the user interaction with the virtual joystick corresponds to a micromovement of the user's finger in contact with the display, the micromovement comprising at least one of: a rolling movement; a rocking movement; or a pivoting movement. However, Marsden teaches wherein the user interaction with the virtual joystick corresponds to a micromovement of the user's finger in contact with the display, the micromovement comprising at least one of: a rolling movement; a rocking movement; or a pivoting movement (allowing a user to roll a single finger over the virtual keyboard [Marsden, 0335]). Therefore, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the virtual joystick displayed on a touchscreen invention of Westerman to include the micromovement feature of Marsden. One would have been motivated to make this modification to deliver more entertaining animated gestures to keep users engaged with an otherwise generic touchscreen interface. Claim 16, sharing similar elements as claim 5, is likewise rejected.
Additional References

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. The following references include virtual joysticks displayed on a touchscreen: Uradnik et al. (US 20200050337 A1, published 2/13/2020); Yu (US 20230271091 A1, published 8/31/2023).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SETH A SILVERMAN, whose telephone number is (571) 272-9783. The examiner can normally be reached Mon-Thur, 8AM-4PM MST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Adam Queler, can be reached at (571) 272-4140. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Seth A Silverman/
Primary Examiner, Art Unit 2172
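The motion-continuation mechanic the rejection maps onto claims 1 and 7 (speed grows with the centroid's distance from a "null" point, is zero at the null, and is capped at a maximum, with direction given by the offset vector [Westerman, 0073]) can be sketched in a few lines. This is an illustrative reading of the quoted paragraphs only, not Westerman's or the applicant's actual implementation; the function name, `radius`, and `max_speed` are assumptions:

```python
import math

def joystick_velocity(centroid, null, max_speed=1.0, radius=50.0):
    """Map a touch centroid's offset from the null point to (speed, direction).

    Speed scales linearly with distance from the null, capped at
    max_speed once the centroid is `radius` or more away; at the
    null itself the speed is zero and motion stops.
    """
    dx = centroid[0] - null[0]
    dy = centroid[1] - null[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return 0.0, (0.0, 0.0)            # centroid at the null: no motion
    speed = min(dist / radius, 1.0) * max_speed
    direction = (dx / dist, dy / dist)    # unit vector toward the centroid
    return speed, direction
```

For example, a centroid 30 px to the right of the null (with the defaults above) yields a speed of 0.6 in direction (1.0, 0.0), and pulling the finger back onto the null shrinks the speed to zero, matching Westerman [0075].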

Prosecution Timeline

Apr 12, 2024
Application Filed
Jan 15, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12587581
SYSTEMS, METHODS, AND MEDIA FOR CAUSING AN ACTION TO BE PERFORMED ON A USER DEVICE
2y 5m to grant · Granted Mar 24, 2026
Patent 12579201
INFORMATION PROCESSING SYSTEM
2y 5m to grant · Granted Mar 17, 2026
Patent 12578200
NAVIGATIONAL USER INTERFACES
2y 5m to grant · Granted Mar 17, 2026
Patent 12572269
PERFORMING A CONTROL OPERATION BASED ON MULTIPLE TOUCH POINTS
2y 5m to grant · Granted Mar 10, 2026
Patent 12572261
SPATIAL NAVIGATION AND CREATION INTERFACE
2y 5m to grant · Granted Mar 10, 2026
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 73%
With Interview: 88% (+14.8%)
Median Time to Grant: 2y 4m
PTA Risk: Low

Based on 449 resolved cases by this examiner. Grant probability is derived from the career allow rate.
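The projection figures are internally consistent with the examiner's career counts; a quick sanity check, assuming grant probability is simply granted/resolved and the interview figure adds the +14.8% lift to that baseline (the vendor's actual model may differ):

```python
# Assumed derivations of the panel's headline numbers, not the
# analytics vendor's actual methodology.
granted, resolved = 327, 449
allow_rate = granted / resolved                 # career allow rate

print(round(allow_rate * 100))                  # 73 -> "Grant Probability"

interview_lift = 0.148                          # "+14.8% Interview Lift"
print(round((allow_rate + interview_lift) * 100))  # 88 -> "With Interview"
```

Both rounded values match the dashboard (73% and 88%), so the displayed probabilities appear to be straight arithmetic on the 327/449 record rather than a separate model.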
