Prosecution Insights
Last updated: April 19, 2026
Application No. 18/361,589

MULTI-MODAL COMPUTER GAME FEEDBACK USING CONVERSATIONAL DIGITAL ASSISTANT

Final Rejection — §103

Filed: Jul 28, 2023
Examiner: MCCULLOCH JR, WILLIAM H
Art Unit: 3715
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Sony Interactive Entertainment Inc.
OA Round: 3 (Final)

Grant Probability: 54% (Moderate)
Expected OA Rounds: 4-5
Time to Grant: 3y 5m
With Interview: 87%

Examiner Intelligence

Career Allow Rate: 54% (330 granted / 614 resolved; -16.3% vs TC avg)
Interview Lift: +33.3% among resolved cases with an interview (strong)
Typical Timeline: 3y 5m avg prosecution; 32 currently pending
Career History: 646 total applications across all art units

Statute-Specific Performance

§101: 22.6% (-17.4% vs TC avg)
§103: 27.7% (-12.3% vs TC avg)
§102: 21.3% (-18.7% vs TC avg)
§112: 15.8% (-24.2% vs TC avg)

TC averages are estimates. Based on career data from 614 resolved cases.
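The statute-specific figures above can be sanity-checked with a little arithmetic. A minimal sketch, assuming each "vs TC avg" delta is simply the examiner's rate minus the Tech Center average in percentage points (the variable names are illustrative, not from any real API):

```python
# Back out the implied Tech Center average from each examiner rate
# and its "vs TC avg" delta (delta = examiner_rate - tc_avg).
examiner_rates = {"101": 22.6, "103": 27.7, "102": 21.3, "112": 15.8}
deltas_vs_tc = {"101": -17.4, "103": -12.3, "102": -18.7, "112": -24.2}

implied_tc_avg = {
    statute: round(rate - deltas_vs_tc[statute], 1)
    for statute, rate in examiner_rates.items()
}

print(implied_tc_avg)
```

Under that assumption, all four statutes imply the same 40.0% baseline, which suggests the deltas are measured against a single Tech Center average rather than per-statute averages.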

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. This application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid. Applicant's submission filed on 1/15/2026 has been entered.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 10, 12, 13, 16, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over US 2021/0407165 to Bentell et al. (hereinafter Bentell) in view of US 2020/0101383 to Hwang et al. (hereinafter Hwang).
Regarding claims 10 and 18, Bentell discloses a method, and system therefor, comprising: executing a computer game (computer game client application; ¶¶ [0024], [0096-0098], [0114]); receiving verbal input of a user, the verbal input related to the execution of the computer game (voice command user input for requesting media relating to the game; ¶¶ [0024-0026], [0085], [0092-0098], [0107-0110]); executing a model to process the verbal input (capture audio and perform a voice recognition (model) process; ¶¶ [0055], [0085]); based on the processing of the verbal input, determining a non-verbal response related to the execution of the computer game (visual feedback animations (non-verbal response) related to the input requesting access/management of the game; ¶¶ [0023-0026], [0038-0039], [0060-0065], [0095-0098]), the non-verbal response being a response other than execution of a command within the computer game (updating a dynamic feedback indication indicating the recognition, and thinking/processing state of the device request (command) before indicating completion/execution of the command; ¶¶ [0004-0006], [0054-0056], [0059]); and based on the determination, outputting the non-verbal response (the determination updates the display of the response; ¶¶ [0004-0006], [0054-0056], [0059]).

Further regarding claims 10 and 18, Bentell teaches the invention substantially as described, but lacks in explicitly teaching receiving game state data indicating a current state of the computer game and executing a model to process the game state data, and that the verbal inputs related to the execution of the computer game are received during gameplay. In a related disclosure, Hwang teaches a method and apparatus for recognizing a game command from text data or voice data input by a user, generating game action sequence data using the extracted game command element and game action data, and executing the generated game action sequence data (abstract).
More particularly, Hwang explains that the “game command recognition apparatus refers to an apparatus that is configured to recognize and process a game command input from the user when the user plays a game using the user terminal 130,” wherein “in response to a game command input from the user through a text or voice input, the game command recognition apparatus may recognize the input game command and may execute a game control corresponding to the recognized game command” (¶ [0042]). Hwang additionally teaches that “the game action data may include information on each of states in gameplay and at least one game action available in each state, may represent a connection relationship between game actions performable on each game screen, …and may be updated together in response to updating of a game program, and game commands executable in a current state may be identified based on graph information represented in the game action data” (¶ [0017]). This demonstrates that the voice commands are recognized during gameplay and in relation to the execution of the computer game based on the state of the game (i.e., the game state). Hwang additionally teaches benefits achieved by the invention, including that “the user may readily play a game without a need to directly execute the game command through the game control,” “design cost of a game command recognition system may decrease since there is no need to design a separate game command for each stage of the user interface,” and “a personalized game command may be configured” (¶ [0042]). 
As such, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify the system of Bentell to include receiving verbal inputs of a user during gameplay related to the execution of the computer game and using the game state data, as taught by Hwang, in order to provide the advantages of readily playing a game without direct commands through a typical game controller, reduced design cost, and/or personalized game commands.

Regarding claim 12, the combination of Bentell and Hwang teaches or suggests queueing a user request for execution at an indeterminate time in future gameplay, wherein the non-verbal response indicates that the user request has been registered for execution at the indeterminate time in future gameplay rather than immediate execution; and at the indeterminate time in future gameplay, executing the user request (the response indicates a user command has been recognized and is thinking/processing the request before an indication of execution/completion after (in the future) the processing has completed in ¶¶ [0005-0006], [0054-0055], [0059-0061] of Bentell; see also discussion of game command inputs that encompass a command, an entity, and a number of iterations, the number of iterations being future inputs because they are carried out in series at an indeterminate time in future gameplay, in at least ¶ [0060] of Hwang).

Regarding claim 13, the combination of Bentell and Hwang teaches or suggests wherein the non-verbal response is a binary response (a success or fail result (binary response); ¶¶ [0057-0059] of Bentell).

Regarding claim 16, the combination of Bentell and Hwang teaches or suggests wherein the model is executed outside of a game engine used to execute the computer game (the model is an external (executed outside) voice command processing listening device separate from a game server in ¶¶ [0024], [0082-0085], [0108], [0114-0115] of Bentell).

Claims 1-4, 9, 11, and 21 are rejected under 35 U.S.C. 103 as being unpatentable over Bentell and Hwang in view of US 2023/0237277 to Reza et al. (hereinafter Reza).

Regarding claims 1, 11, and 21, the combination of Bentell and Hwang teaches the invention substantially as described above, but fails to explicitly disclose a large language model. In a related disclosure, Reza discloses a large language model (large dataset language models boost language application performance for recognizing game controller voice commands; ¶¶ [0002-0003], [0025], [0079], [0088], [0131]). It would have been obvious to one of ordinary skill before the effective filing date to modify the invention of Bentell and Hwang to include a large language model as taught by Reza in order to improve language recognition performance of the system.

Regarding claim 2, the combination of Bentell, Hwang, and Reza teaches or suggests wherein the non-verbal response indicates that a user request has been registered for execution in future gameplay rather than immediate execution (the response indicates a user command has been recognized and is thinking/processing the request before an indication of execution/completion after (in the future) the processing has completed in ¶¶ [0005-0006], [0054-0055], [0059-0061] of Bentell; see also discussion of game command inputs that encompass a command, an entity, and a number of iterations, the number of iterations being future inputs because they are carried out in series at a future time, in at least ¶ [0060] of Hwang).

Regarding claim 3, the combination of Bentell, Hwang, and Reza teaches or suggests wherein the non-verbal response comprises a nonverbal visual response (visual animation; ¶¶ [0005], [0029] of Bentell).
Regarding claim 4, the combination of Bentell, Hwang, and Reza teaches or suggests wherein the non-verbal visual response comprises one or more of: presenting a screen border in a particular color, illuminating a button on a computer game controller, illuminating a light on a peripheral device different from the computer game controller (displaying an icon shape line in a distinct color on a display (screen border in a particular color) or displaying a color (illuminating a light) on a separate speaker/non-main external presentation device (peripheral device different); ¶¶ [0022], [0040-0042], [0085] of Bentell).

Regarding claim 9, the combination of Bentell, Hwang, and Reza teaches or suggests wherein the non-verbal response is a first output (a process initiating state output; ¶¶ [0004-0005], [0040] of Bentell), wherein the verbal input comprises a request to be alerted when a predetermined game action occurs, wherein the non-verbal response comprises an acknowledgement of the request (a media/game processing access (predetermined game action) voice command (the verbal input) includes/is followed by a wake word (request to be alerted) when a confirmation/listening state for listening to/initiating the access command is provided (occurs) and a confirmation (acknowledgement) feedback is provided; ¶¶ [0004-0005], [0024], [0040], [0055-0057], [0095-0098] of Bentell), and wherein the at least one processor assembly is programmed with instructions to: subsequent to outputting the non-verbal response, identify the game action as occurring (dynamically change (subsequent) the response to indicate the game action process is thinking/processing; ¶¶ [0005], [0040], [0055] of Bentell); and based on identification of the game action as occurring, rendering a second output different from the first output, the second output establishing an alert in conformance with the request (indicating (rendering) indication (identification) of the action processing (occurring) as a distinct second state (output) providing a confirmation/feedback reflecting (in conformance) the request; ¶¶ [0026], [0055-0057], [0059], [0079-0080] of Bentell).

Claims 14-15 are rejected under 35 U.S.C. 103 as being unpatentable over Bentell, Hwang, and Reza in view of US 2021/0398402 to Richards et al. (hereinafter Richards).

Regarding claim 14, the combination of Bentell, Hwang, and Reza teaches or suggests the invention substantially as described above, but fails to explicitly disclose the non-verbal response comprises actuating a first button on a computer game controller for "yes" and/or actuating a second button on the computer game controller for "no". In a related disclosure, Richards discloses the non-verbal response comprises actuating a first button on a computer game controller for "yes" and/or actuating a second button on the computer game controller for "no" (driving/actuating control circuitry for haptic game buttons response/feedback for yes/no process actions; ¶¶ [0096-0102], [0116], [0122], [0154], [0161], [0164] of Richards). It would have been obvious to one of ordinary skill before the effective filing date to modify the combination of Bentell, Hwang, and Reza to include the non-verbal response comprises actuating a first button on a computer game controller for "yes" and/or actuating a second button on the computer game controller for "no" as taught by Richards in order to provide additional feedback to the user to ensure commands are successfully executed.

Regarding claim 15, the combination of Bentell, Hwang, Reza, and Richards teaches or suggests actuating the first and second buttons comprises one or more of: vibrating the respective button relative to other portions of the computer game controller [and] illuminating the respective button (vibrating the respective haptic buttons; ¶¶ [0096-0102], [0122], [0154], [0164] of Richards).

Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over Bentell and Hwang in view of US 2019/0308097 to Yamano et al. (hereinafter Yamano).

Regarding claim 17, the combination of Bentell and Hwang teaches the invention substantially as described above, but fails to explicitly disclose based on a game context associated with the computer game, determining the non-verbal response. In a related disclosure, Yamano teaches based on a game context associated with the computer game, determining the non-verbal response (sending a vibration command depending on a game situation; ¶¶ [0050], [0081]). It would have been obvious to one of ordinary skill before the effective filing date to modify the invention of Bentell and Hwang to include based on a game context associated with the computer game, determining the non-verbal response as taught by Yamano in order to beneficially provide enhanced interaction between the user and the system.

Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Bentell and Hwang in view of US 2016/0062489 to Li (hereinafter "Li").

Regarding claim 19, the combination of Bentell and Hwang teaches or suggests the invention substantially as described above, but fails to explicitly disclose a suggestion to re-map buttons on a computer game controller from a first configuration to a second configuration. Li discloses a suggestion to re-map buttons on a computer game controller from a first configuration to a second configuration (a prompt to switch/translate (re-map) game controller buttons to different modes/configurations and settings to adjust operation as needed by the user; ¶¶ [0055], [0058], [0105], [0154], [0285]). It would have been obvious to one of ordinary skill before the effective filing date to modify the invention of Bentell and Hwang to include a suggestion to re-map buttons on a computer game controller from a first configuration to a second configuration as taught by Li for the advantage of providing customized controller input settings.
Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Bentell and Hwang in view of US 2021/0373676 to Jorasch et al. (hereinafter Jorasch).

Regarding claim 20, the combination of Bentell and Hwang teaches or suggests the invention substantially as described above, but fails to explicitly disclose the verbal input comprises a user question asking which input element on a computer game controller is usable to provide a particular game command, and wherein the non-verbal response comprises one or more of: illumination of the input element usable to provide the particular game command, vibration of the input element usable to provide the particular game command. In a related disclosure, Jorasch teaches wherein the verbal input comprises a user question asking which input element on a computer game controller is usable to provide a particular game command (verbal commands from a user asking/requesting game controller key and button coaching/tips to reduce user frustration; ¶¶ [0284], [0394], [0851], [1458-1459], [2298-2299], [2351-2353]), and wherein the non-verbal response comprises one or more of: illumination of the input element usable to provide the particular game command, vibration of the input element usable to provide the particular game command (buttons/keys vibrating/pulsating and modifying lighting color/intensity to provide a tutorial; ¶¶ [0394], [0992], [1458-1459], [1651]).
It would have been obvious to one of ordinary skill before the effective filing date to modify the invention of Bentell and Hwang to include the verbal input comprises a user question asking which input element on a computer game controller is usable to provide a particular game command, and wherein the non-verbal response comprises one or more of: illumination of the input element usable to provide the particular game command, vibration of the input element usable to provide the particular game command, as taught by Jorasch, in order to reduce user frustration and provide better quality of interaction.

Claims 5-6 are rejected under 35 U.S.C. 103 as being unpatentable over Bentell, Hwang, and Reza in view of US 2014/0223462 to Aimone et al. (hereinafter "Aimone").

Regarding claim 5, the combination of Bentell, Hwang, and Reza teaches or suggests the invention substantially as described above, but lacks in explicitly teaching wherein the non-verbal response comprises a nonverbal tactile response. In a related disclosure, Aimone discloses the non-verbal response comprises a nonverbal tactile response (a tactile presentation feedback component of a game controller; ¶ [0129]). It would have been obvious to one of ordinary skill before the effective filing date to modify the invention of Bentell, Hwang, and Reza to include the non-verbal response comprises a nonverbal tactile response as taught by Aimone in order to provide additional modes of feedback.
Regarding claim 6, the combination of Bentell, Hwang, Reza, and Aimone teaches or suggests the non-verbal tactile response comprises dynamically providing feedback, to indicate a direction indicated in the verbal input, a computer game controller in the direction and/or at a particular location corresponding to the direction (dynamic feedback indicating a voice command (direction) of the input by dynamically changing by a game server (controller)/output interface a direction in accordance with/reflecting the input; ¶¶ [0004-0006], [0024-0026], [0039] of Bentell), and Aimone further discloses the non-verbal tactile response comprises vibrating (vibrating feedback of a game controller indicating a command (direction) received by speech input; ¶¶ [0129], [0230], [0402] of Aimone).

Claims 7-8 are rejected under 35 U.S.C. 103 as being unpatentable over Bentell, Hwang, and Reza in view of US 2008/0144134 to Ahmed et al. (hereinafter "Ahmed").

Regarding claim 7, the combination of Bentell, Hwang, and Reza teaches or suggests the invention as described above, but lacks in explicitly teaching wherein the non-verbal response comprises a nonverbal audible response. In a related disclosure, Ahmed discloses the non-verbal response comprises a nonverbal audible response (a sound feedback indication (response) to verbal user input; ¶¶ [0038], [0039], [0044]). It would have been obvious to one of ordinary skill before the effective filing date to modify the invention of Bentell, Hwang, and Reza to include the non-verbal response comprises a nonverbal audible response as taught by Ahmed in order to provide additional forms of feedback to the user.
Regarding claim 8, the combination of Bentell, Hwang, Reza, and Ahmed teaches or suggests wherein the non-verbal audible response comprises a first predetermined sound corresponding to "yes" and/or a second predetermined sound corresponding to "no" (selecting different sounds indicating a respective positive (yes)/negative (no) response to the verbal request input; ¶¶ [0037], [0039], [0041], [0044] of Ahmed).

Response to Arguments

Applicant's arguments filed 1/15/2026 have been fully considered but they are not persuasive. The majority of the Remarks section addresses the prior art individually, rather than taking the references in combination as provided in the grounds of rejection. In response to Applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986).

On page 10 of the Remarks, Applicant asserts that Hwang teaches only one passage that mentions game state, namely at ¶ [0068]. The Examiner respectfully disagrees. Hwang discusses data representing the state of the game at numerous points throughout the reference. For purposes of illustration, the Examiner relies on ¶ [0017]. Hwang explains, inter alia, that the game command recognition apparatus employs the game action data, which may include information on each of the states in gameplay and at least one game action available in each state. Taken in context of the reference, this data reads on the claimed "game state data" because the game action data informs the model about the types of inputs that could possibly be made by the player in a given situation. As such, the amendments to the independent claims are not sufficient to overcome the teachings of Hwang in combination with the other references of record.
In response to Applicant's argument on page 10 that there is no teaching, suggestion, or motivation to combine the references, the Examiner recognizes that obviousness may be established by combining or modifying the teachings of the prior art to produce the claimed invention where there is some teaching, suggestion, or motivation to do so found either in the references themselves or in the knowledge generally available to one of ordinary skill in the art. See In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988); In re Jones, 958 F.2d 347, 21 USPQ2d 1941 (Fed. Cir. 1992); and KSR International Co. v. Teleflex, Inc., 550 U.S. 398, 82 USPQ2d 1385 (2007). In this case, Bentell and Hwang both relate to the goal of accepting voice inputs to computer systems that are traditionally operated by pressing buttons. Bentell discusses a broad class of applications that may be executed, including video games, and Hwang is focused specifically on video games. While Hwang teaches that a pretrained neural network-based game command element extraction model may be used to extract the game command elements (¶ [0071]), Hwang does not specify that the type of model used is a large language model (LLM). Reza discusses the benefits of using an LLM, which may be seen as an improvement on previous models. Therefore, the combination of references provided by the grounds of rejection would have been obvious to one of ordinary skill in the art before the effective filing date.

Conclusion

All claims are identical to or patentably indistinct from, or have unity of invention with, claims in the application prior to the entry of the submission under 37 CFR 1.114 (that is, restriction (including a lack of unity of invention) would not be proper), and all claims could have been finally rejected on the grounds and art of record in the next Office action if they had been entered in the application prior to entry under 37 CFR 1.114.
Accordingly, THIS ACTION IS MADE FINAL even though it is a first action after the filing of a request for continued examination and the submission under 37 CFR 1.114. See MPEP § 706.07(b). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to WILLIAM H MCCULLOCH whose telephone number is (571) 272-2818. The examiner can normally be reached M-F 9:30-5:30. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, David Lewis, can be reached at 571-272-7673. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users.
To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/WILLIAM H MCCULLOCH JR/
Primary Examiner, Art Unit 3715

Prosecution Timeline

Jul 28, 2023
Application Filed
Jun 12, 2025
Non-Final Rejection — §103
Sep 16, 2025
Response Filed
Oct 10, 2025
Final Rejection — §103
Jan 15, 2026
Request for Continued Examination
Feb 17, 2026
Response after Non-Final Action
Feb 25, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12582911
DISPLAY METHOD AND APPARATUS FOR VIRTUAL VEHICLE, DEVICE, AND STORAGE MEDIUM
2y 5m to grant Granted Mar 24, 2026
Patent 12582910
COMPUTER SYSTEM, GAME SYSTEM, AND GAME PROGRESS CONTROL METHOD
2y 5m to grant Granted Mar 24, 2026
Patent 12582915
STORAGE MEDIUM, GAME APPARATUS, GAME SYSTEM, AND GAME PROCESSING METHOD
2y 5m to grant Granted Mar 24, 2026
Patent 12582870
ESTIMATING SPIN RATE AND AXIS OF A BALL USING DEEP LEARNING
2y 5m to grant Granted Mar 24, 2026
Patent 12576343
COMMUNICATION SYSTEM
2y 5m to grant Granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 4-5
Grant Probability: 54%
With Interview: 87% (+33.3%)
Median Time to Grant: 3y 5m
PTA Risk: High
Based on 614 resolved cases by this examiner. Grant probability derived from career allow rate.
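The footnote says the grant probability is derived from the career allow rate. A minimal sketch of the arithmetic that reproduces the headline figures, assuming the "with interview" number is simply the base rate plus the interview lift in percentage points (an assumption about this dashboard's methodology, not a documented formula):

```python
# Career allow rate = granted / resolved, shown above as 330 / 614.
granted, resolved = 330, 614
allow_rate = round(100 * granted / resolved, 1)  # percentage, one decimal

# Assumed: interview-adjusted probability = base rate + lift (pct points).
interview_lift = 33.3
with_interview = round(allow_rate + interview_lift, 1)

print(allow_rate, with_interview)  # 53.7 (displayed as 54%), 87.0
```

This matches the displayed 54% and 87% once the base rate is rounded to a whole percentage for display.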
