Prosecution Insights
Last updated: April 19, 2026
Application No. 18/422,288

SYSTEMS AND METHODS FOR ELECTRONIC GAME CONTROL AND GAME CONTROLLER CONFIGURATIONS WITH EMG SENSING

Non-Final OA: §102, §112

Filed: Jan 25, 2024
Examiner: THAI, XUAN MARIAN
Art Unit: 3715
Tech Center: 3700 (Mechanical Engineering & Manufacturing)
Assignee: Sony Interactive Entertainment Inc.
OA Round: 1 (Non-Final)

Grant Probability: 2% (At Risk)
Expected OA Rounds: 1-2
Time to Grant: 3y 11m
Grant Probability with Interview: 8%

Examiner Intelligence

Career Allow Rate: 2% (grants only 2% of cases; 4 granted / 175 resolved; -67.7% vs TC avg)
Interview Lift: +5.9% (moderate, ~+6%) across resolved cases with interview
Typical Timeline: 3y 11m avg prosecution; 28 applications currently pending
Career History: 203 total applications across all art units

Statute-Specific Performance

§101: 22.3% (-17.7% vs TC avg)
§103: 37.0% (-3.0% vs TC avg)
§102: 17.7% (-22.3% vs TC avg)
§112: 18.8% (-21.2% vs TC avg)

Tech Center average is an estimate. Based on career data from 175 resolved cases.
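One sanity check on the per-statute figures: each reported rate minus its delta works out to the same implied Tech Center average of 40.0%. A minimal sketch of that check (figures taken from the table above; the single 40% TC average is inferred from the arithmetic, not stated in the source):

```python
# Figures from the Statute-Specific Performance table above.
rates  = {"§101": 22.3, "§103": 37.0, "§102": 17.7, "§112": 18.8}
deltas = {"§101": -17.7, "§103": -3.0, "§102": -22.3, "§112": -21.2}

# Each statute's implied Tech Center average is its rate minus its delta.
implied_tc_avg = {s: round(rates[s] - deltas[s], 1) for s in rates}
print(implied_tc_avg)  # every statute implies the same 40.0% TC average
```

The fact that all four deltas reconcile to one number suggests the dashboard benchmarks every statute against a single TC-wide average rather than per-statute baselines.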

Office Action

Rejections: §102, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 4, 5, 10-12 and 17-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claims 5, 12, 19 and 20 recite the limitation "the user control signal". There is insufficient antecedent basis for this limitation in the claims.

Claims 4, 10, 11, 17 and 18 recite the limitation "wherein detecting a user control signal," which appears to refer to the "detecting an EMG user control signal" in independent claims 1, 6 and 13. It is not clear whether "a user control signal" and "an EMG user control signal" refer to the same control signal, which renders the claim limitations indefinite.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ballagas et al. (US 2022/0057865), hereinafter Ballagas.

Regarding claims 1-5, please refer to the claim rejections of claims 13, 15-17 and 19. Regarding claims 6-12, please refer to the claim rejections of claims 13-19.

Regarding claim 13, Ballagas discloses a system (Fig. 2) comprising: a gaming device ([0026], "The application program 240 on the computer system 206 can be for various purposes. As one example, the application program 240 can be a gaming application program", and Fig. 2); and a game controller ([0027], "There can be various implementations of the user controller 204 and the biometric device 208", and Fig. 2); a strap including at least one electromyography (EMG) sensor (Figs. 3A-4B, and [0028], "In this example, the user is shown with two biometric devices 208 with one biometric device 208 on each arm, wrist, and/or forearm to extract biometric sensing data (e.g., EMG signals)"); a game controller body supporting at least one input control (Fig. 3A, 204); and a controller housed in the game controller body electrically coupled to the at least one input control, wherein the controller is configured to receive output of at least one electromyography (EMG) sensor for a user of an electronic game, the output including at least one EMG signal for the user ([0027], "The user controller 204 can have various selectable options 220 on the front, back, side, etc, which are accessible by the user while holding the user controller 204 in the user's hand. The biometric device 208 is illustrated as a wrist strap or biometric sensing bracelet. The biometric device 208 can be operatively connected to the user controller 204 via connection 230. Although a wired connection is shown in FIG. 3A, the connection 230 could be a wireless connection. The biometric device 208 can have numerous biometric sensors 202 on an inner surface of the biometric device 208 to be pressed against the user's arm or wrist, depicted in FIG. 3B. FIG. 3B illustrates an example of the biometric device 208 disconnected from the user controller 204. The biometric device 208 can have a connector 302 for coupling to the user controller 204"); detect an EMG user control signal in the at least one EMG signal for the user; and output the EMG user control signal for control of electronic game presentation to the gaming device ([0027], "In FIGS. 3A and 3B, the prediction module 102A in user controller 204 receives the user's biometric sensing data from biometric device 208 of the user targeting selectable options 220 and predicts the targeted selectable options 220 from the biometric sensing data before the user presses any of the targeted selectable options 220 (e.g., before pressing the buttons on the user controllers). The prediction module 102A and/or command application program 222 can send a preemptive command signal corresponding to each one of the targeted selectable options 220 to the computer system 206, prior to the user selecting any one of the targeted selectable options 220").

Regarding claim 14, Ballagas discloses the system of claim 13, wherein the at least one EMG sensor includes a plurality of EMG sensor elements retained by the strap, wherein the strap is coupled to the game controller body and configured to secure the game controller to the user, and wherein the strap includes at least one electrical connection for each of the EMG sensor elements and an interface of the game controller body (Figs. 3A and 3B, [0027], "The user controller 204 can have various selectable options 220 on the front, back, side, etc, which are accessible by the user while holding the user controller 204 in the user's hand. The biometric device 208 is illustrated as a wrist strap or biometric sensing bracelet. The biometric device 208 can be operatively connected to the user controller 204 via connection 230. Although a wired connection is shown in FIG. 3A, the connection 230 could be a wireless connection. The biometric device 208 can have numerous biometric sensors 202 on an inner surface of the biometric device 208 to be pressed against the user's arm or wrist, depicted in FIG. 3B. FIG. 3B illustrates an example of the biometric device 208 disconnected from the user controller 204. The biometric device 208 can have a connector 302 for coupling to the user controller 204").

Regarding claim 15, Ballagas discloses the system of claim 13, wherein the output of the at least one EMG sensor includes electrical activity detected from a user wrist and associated with at least one of a hand movement and finger movement of a user ([0027], "In FIGS. 3A and 3B, the prediction module 102A in user controller 204 receives the user's biometric sensing data from biometric device 208 of the user targeting selectable options 220 and predicts the targeted selectable options 220 from the biometric sensing data before the user presses any of the targeted selectable options 220 (e.g., before pressing the buttons on the user controllers). The prediction module 102A and/or command application program 222 can send a preemptive command signal corresponding to each one of the targeted selectable options 220 to the computer system 206, prior to the user selecting any one of the targeted selectable options 220" and [0030], "At block 502, the predication module 102A, 102B can detect new biometric sensor data (e.g., EMG signals) indicating motion (such as, e.g., finger motion related to muscles of the user) from the biometric device 208. At block 504, the prediction module 102A, 102B may use controller sensors 224 on the user controller 204 to detect current finger position. The prediction module 102A, 102B can receive proximity data from the controller sensors 224 of the user's position of the fingers relative to selectable options 220 on the user controller 204").

Regarding claim 16, Ballagas discloses the system of claim 13, wherein the output of the at least one EMG sensor includes electrical activity associated with user EMG data for at least one of a button press, directional pad input, analog stick motion, finger gesture, wrist motion, and controller movement ([0030], "Turning to FIG. 5, a flowchart 500 is depicted for using biometric sensing data to preemptively propagate a prediction selection (e.g., button press) of a targeted selectable option prior to the user actually pressing a button. At block 502, the predication module 102A, 102B can detect new biometric sensor data (e.g., EMG signals) indicating motion (such as, e.g., finger motion related to muscles of the user) from the biometric device 208").

Regarding claim 17, Ballagas discloses the system of claim 13, wherein detecting a user control signal includes correlating at least a portion of the EMG signal to recorded EMG data for a user game controller action ([0017], "The predefined model module 110 can use the matched biometric sensing data to determine/predict selectable options on a user controller corresponding to the labels, classes, and/or categories before receiving an actual response or selection of the selectable option from the user. The model module 110 may receive previous history of the previous selectable option (e.g., key press, joystick movement, etc.) on the user controller from a feedback module 124, and the location of the previous selectable option can be used as guide with biometric sensing data to the next selectable option").

Regarding claim 18, Ballagas discloses the system of claim 13, wherein detecting a user control signal includes using a machine learning model to detect and identify the user control signal from the EMG signal ([0014], "Statistical, predictive, and/or machine learning based techniques can be utilized to train the predictive model module 110. The model module 110 can employ supervised learning, unsupervised learning, reinforcement learning, deep learning, and/or any other techniques. Examples of machine learning processes or techniques executed by the predictive model module 110 can include, but are not limited to, linear regression, logistic regression, linear discriminant analysis, classification and regression trees, naïve Bayes, K-nearest neighbors, learning vector quantization, support vector machines, bagging and random forest, boosting and Adaboost, etc. Other machine learning processes or techniques employed by the predictive model module 110 can include, but are not limited to, artificial neural networks (ANN), nonparametric Gaussian process (GP) regressor, etc.").

Regarding claim 19, Ballagas discloses the system of claim 13, further comprising detecting user activation of the game controller after detecting the EMG user control signal and generating controller output based on the user activation, and wherein outputting the user control signal includes output of the user control signal to a game controller prior to generating the controller output based on the user activation ([0017], "The predefined model module 110 can use the matched biometric sensing data to determine/predict selectable options on a user controller corresponding to the labels, classes, and/or categories before receiving an actual response or selection of the selectable option from the user").

Regarding claim 20, Ballagas discloses the system of claim 13, wherein the gaming device is configured to control presentation of electronic game data, and wherein the gaming device is configured to generate game data for a first game state based on the user control signal detected in the at least one EMG signal for the user ([0025], "The preemptive command (or control signal) can cause an application program 240 on the computer system 206 to change state based on the prediction by the prediction module 102A, 102B. In some implementations, the command application program 222 and/or portions of the command application program 222 can be on the computer system 206 which is illustrated as a dashed box. As such, the command application program 222 on computer system 206 can replicate the preemptive command and cause the preemptive command to change the state of the application program 240 on the computer system 206 in cases when the prediction module 102A or the prediction module 102B predicts the selectable option 220 to be selected"), and wherein the gaming device is configured to generate game data for a second game state in response to the user control signal and expiration of a user controller activation window ([0017], "The previous selectable option(s) can have a weight that reinforces the matched label and/or changes the matched label when a threshold is reached. The weighting can be based on an elapsed time from the previous selectable option(s) made by the user, where a shorter time interval has a greater weight than a longer time interval between the previously selectable option by the user on the user controller and the predicted selectable option. The predictive model module 110 is output the predicted selectable option(s) as predicted user input based on the biometric sensing data and the previously selected selectable option made by the used on the user controller, prior to the user selecting the selectable option(s) on the user controller").

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to YINGCHUAN ZHANG whose telephone number is (571) 272-1375. The examiner can normally be reached 8:00 - 4:30 M-F.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Xuan Thai, can be reached at (571) 272-7147. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/YINGCHUAN ZHANG/
Primary Examiner, Art Unit 3715

Prosecution Timeline

Jan 25, 2024: Application Filed
Dec 03, 2025: Non-Final Rejection, §102 and §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12551797: VIRTUAL OBJECT CONTROL METHOD AND APPARATUS, DEVICE, AND STORAGE MEDIUM (2y 5m to grant; granted Feb 17, 2026)
Patent 8657605: VIRTUAL TESTING AND INSPECTION OF A VIRTUAL WELDMENT (2y 5m to grant; granted Feb 25, 2014)
Patent 8398404: SYSTEM AND METHOD FOR ELEVATED SPEED FIREARMS TRAINING (2y 5m to grant; granted Mar 19, 2013)
Patent (number unavailable): Video display of high contrast graphics for newborns and infants (granted)
Patent (number unavailable): Device including a lens array (granted)

Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 2%
With Interview: 8% (+5.9%)
Median Time to Grant: 3y 11m
PTA Risk: Low

Based on 175 resolved cases by this examiner. Grant probability derived from career allow rate.
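A minimal sketch of how these projection figures reconcile, assuming (as the note above suggests) that the base grant probability is simply the career allow rate of 4/175, and that the interview figure adds the +5.9% lift as percentage points. Both assumptions are inferred from the dashboard numbers, not documented:

```python
# Hypothetical reconstruction of the dashboard's projection arithmetic.
granted, resolved = 4, 175   # examiner's career record (from Examiner Intelligence)
interview_lift_pp = 5.9      # percentage-point lift with interview

base_prob = 100 * granted / resolved            # career allow rate, in percent
with_interview = base_prob + interview_lift_pp  # assumed additive, in points

print(f"{base_prob:.0f}%")       # grant probability, displayed as 2%
print(f"{with_interview:.0f}%")  # with interview, displayed as 8%
```

The rounded outputs (2% and 8%) match the displayed figures, which is consistent with the lift being applied additively to the raw 2.3% allow rate rather than to the rounded 2%.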
