Prosecution Insights
Last updated: April 19, 2026
Application No. 18/669,643

Picture Display

Non-Final OA (§101, §102)

Filed: May 21, 2024
Examiner: HSU, RYAN
Art Unit: 3715
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Tencent Technology (Shenzhen) Company Limited
OA Round: 1 (Non-Final)
Grant Probability: 57% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 3y 8m
With Interview: 75%

Examiner Intelligence

Career Allow Rate: 57% (347 granted / 613 resolved; -13.4% vs TC avg)
Interview Lift: +18.5% (resolved cases with interview vs. without)
Typical Timeline: 3y 8m avg prosecution; 55 currently pending
Career History: 668 total applications across all art units

Statute-Specific Performance

§101: 30.6% (-9.4% vs TC avg)
§103: 29.6% (-10.4% vs TC avg)
§102: 16.8% (-23.2% vs TC avg)
§112: 14.4% (-25.6% vs TC avg)

TC avg = Tech Center average estimate. Based on career data from 613 resolved cases.
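The per-statute deltas all point at a single Tech Center baseline: in each row, the examiner's rate minus its (negative) delta recovers the same estimate. A minimal sketch of that check, assuming the displayed figures are percentage points on the same scale (the page does not state the exact metric):

```python
# Recover the Tech Center average behind each statute row.
# Assumption: each row shows (examiner rate, delta vs TC average),
# both in percentage points, so TC average = rate - delta.
rows = {
    "101": (30.6, -9.4),
    "103": (29.6, -10.4),
    "102": (16.8, -23.2),
    "112": (14.4, -25.6),
}
tc_avg = {statute: round(rate - delta, 1) for statute, (rate, delta) in rows.items()}
# Every statute resolves to the same 40.0 baseline estimate.
```

That the four rows agree suggests the deltas were computed against one common Tech Center figure rather than per-statute baselines.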

Office Action

§101, §102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Status

Claims 1-20 are pending.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a grouping of abstract ideas without significantly more. The claims, as exemplified by independent Claim 1, recite limitations directed to a grouping of abstract ideas such as:

1. A method comprising:
initiating an interactive virtual environment;
establishing a first spawn point for a first virtual object in a first round of an adversarial interactive game; - certain method of organizing human activity;
granting a right to control a first virtual camera in the interactive virtual environment to the first virtual object during the first round based on the first virtual object having won a previous round of the adversarial interactive game; - certain method of organizing human activity such as managing a social activity including rules and/or instructions;
placing the first virtual camera at a location in the virtual environment based on the first spawn point; - certain method of organizing human activity;
controlling the first round of the interactive game based on user inputs received from a plurality of user terminals, wherein each terminal corresponds to a different user account and controls a different virtual object in the interactive game; - certain method of organizing human activity including rules and/or instructions;
causing a virtual environment picture to be displayed on a first user terminal corresponding to the first virtual object, wherein the virtual environment picture corresponds to a real-time view of the virtual environment based on a then current location of the first virtual object in the virtual environment; and
causing, in response to user input originating from the first user terminal and indicating a picture switching operation, a virtual camera picture to be displayed on the first user terminal, wherein the virtual camera picture corresponds to a view based on the location and a lens direction of the first virtual camera.

The limitations, as underlined and exemplified by independent Claim 1, are found to recite a certain method of organizing human activity because they recite steps and/or instructions for managing an adversarial interactive game. For at least these reasons, the claims are found to recite a grouping of abstract ideas under Step 2A-prong 1.

This judicial exception is not integrated into a practical application because the additional limitations, such as: “initiating an interactive virtual environment;” “causing a virtual environment picture to be displayed on a first user terminal corresponding to the first virtual object, wherein the virtual environment picture corresponds to a real-time view of the virtual environment based on a then current location of the first virtual object in the virtual environment;” and “causing, in response to user input originating from the first user terminal and indicating a picture switching operation, a virtual camera picture to be displayed on the first user terminal, wherein the virtual camera picture corresponds to a view based on the location and a lens direction of the first virtual camera,” are found to merely invoke a computer as a tool to implement the abstract idea, perform insignificant extra solution activity, and/or provide a technological environment in which to perform the abstract idea (see MPEP 2106.05(f)-(h)).
For at least these reasons, the additional limitations of the claims, as exemplified by independent Claim 1, are not found to integrate the claim into a practical application under Step 2A-prong 2.

The claims, as exemplified by independent Claim 1, recite additional elements such as: “a first user terminal” that, when viewed individually and/or as a whole, is not sufficient to amount to significantly more than the judicial exception because it amounts to mere instructions to invoke a computer as a tool to implement the abstract idea, perform insignificant extra solution activity, and/or provide a technological environment to perform the abstract idea (see MPEP 2106.05(f)-(h)). It follows that the additional elements are similar to the court’s decision in Alice v. CLS, wherein the additional element is not sufficient to show an improvement to computer functionality and/or a different field that provides significantly more than the abstract idea. Claim 1 does not recite additional elements that amount to significantly more under Step 2B.

With respect to independent Claims 16 and 20, the claims recite substantially the same subject matter that has been analyzed above. The analysis of independent Claim 1 is incorporated herein. The differences in the claims are directed to additional elements that are recited by independent Claims 16 and 20. For instance, independent Claim 16 recites “One or more non-transitory computer readable media storing computer readable instructions”, “a processor” and “a data processing system”. Independent Claim 20 recites “A data processing system comprising a processor and memory storing computer readable instructions”, which does not change or alter the analysis from independent Claim 1 because these additional elements, when viewed individually and/or as a collection of elements, amount to highly-generalized, commercially available computer components that perform their ordinary and general functions, similar to Alice v. CLS.

It follows that these additional elements do not amount to significantly more but are analogous to reciting a computer as a tool to implement the abstract idea, insignificant extra solution activity, and/or providing a technological environment in which to perform the abstract idea (see MPEP 2106.05(f)-(h)). For at least these reasons, independent Claims 16 and 20 are found to be directed to a grouping of abstract ideas without significantly more.

Regarding dependent claims 2-15 and 17-19, the limitations have been reviewed and analyzed and were found to further recite limitations directed to at least one of: a grouping of abstract ideas (see MPEP 2106.04(a)); mere instructions to apply the exception (see MPEP 2106.05(f)); insignificant extra solution activity; and/or a technological environment in which to perform the abstract idea. For at least these reasons, claims 1-20 are found to be directed to a grouping of abstract ideas without significantly more.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-4, 6-8, 16-18, and 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by the game Battlefield 2, herein “Battlefield 2”, as evidenced by [revive project]’s “Battlefield 2 Commander Gameplay”, retrieved from YouTube <Battlefield 2 Commander Gameplay - Nice Team effort! Dragon Valley>, posted May 20, 2021, hereinafter “Commander Gameplay”; and Battlefield Wiki’s “Battlefield 2”, retrieved via Wayback Machine, June 27, 2022; “Commander (Feature)”, retrieved via Wayback Machine, June 28, 2022; “Commander View”, retrieved via Wayback Machine, December 08, 2019; and “Control Point”, retrieved via Wayback Machine, June 25, 2022.

Regarding claim 1, Battlefield 2 discloses a method comprising:
initiating an interactive virtual environment (see Commander Gameplay @0:00-1:00, wherein the multiplayer game Battlefield 2 initiates an interactive virtual environment);
establishing a first spawn point for a first virtual object in a first round of an adversarial interactive game (see “Control Point”, pg. 1-4 of Battlefield Wiki, wherein control points serve as spawn points);
granting a right to control a first virtual camera in the interactive virtual environment to the first virtual object during the first round based on the first virtual object having won a previous round of the adversarial interactive game (see “Commander (Feature)”, pg. 1-5 of Battlefield Wiki, wherein the Commander role is a right, provided to the senior-most player in terms of “experience” — which is analogous to the first virtual object having won a previous round of the adversarial interactive game — to control a first virtual camera);
placing the first virtual camera at a location in the virtual environment based on the first spawn point (see Commander Gameplay @0:00-1:00; “Commander (Feature)”, pg. 1-5; “Control Point”, pg. 1-4; and “Commander View”, pg. 30-33 of Battlefield Wiki);
controlling the first round of the interactive game based on user inputs received from a plurality of user terminals (see Commander Gameplay 0:09-0:59; “Battlefield 2” of Battlefield Wiki, wherein the first round is a round of Battlefield 2), wherein each terminal corresponds to a different user account and controls a different virtual object in the interactive game (see “Battlefield 2” of Battlefield Wiki, pg. 1-8, wherein the multiplayer game mode provides factions made up of squads using different user accounts on the terminals that control different virtual objects (e.g., characters, vehicles, units));
causing a virtual environment picture to be displayed on a first user terminal corresponding to the first virtual object (see Commander Gameplay 0:09-0:59; “Commander View”, pg. 30-33), wherein the virtual environment picture corresponds to a real-time view of the virtual environment based on a then current location of the first virtual object in the virtual environment (see Commander Gameplay; “Commander View” of Battlefield Wiki, pg. 30-33, wherein the picture switching allows the satellite view camera to switch to the real-time camera in the Commander View); and
causing, in response to user input originating from the first user terminal and indicating a picture switching operation (see Commander Gameplay 0:09-0:59; “Commander View”, pg. 30-33), a virtual camera picture to be displayed on the first user terminal, wherein the virtual camera picture corresponds to a view based on the location and a lens direction of the first virtual camera (see Commander Gameplay 0:09-1:00; “Commander View” of Battlefield Wiki, pg. 30-33, wherein the picture switching operation allows user inputs to switch the in-game camera between a satellite view and a real-time camera located some distance above the ground).
Regarding claim 16, Battlefield 2 discloses one or more non-transitory computer readable media storing computer readable instructions which, when executed by a processor, configure a data processing system to perform (see Commander Gameplay @0:00-1:00; “Battlefield 2” via Battlefield Wiki, wherein Battlefield 2 is a computer game stored and executed on a system that configures a data processing system to perform the game):
initiating an interactive virtual environment (see Commander Gameplay @0:00-1:00);
establishing a first spawn point for a first virtual object in a first round of an adversarial interactive game (see “Control Point”, pg. 1-4 of Battlefield Wiki);
granting a right to control a first virtual camera in the interactive virtual environment to the first virtual object during the first round based on the first virtual object having won a previous round of the adversarial interactive game (see “Commander (Feature)”, pg. 1-5 of Battlefield Wiki);
placing the first virtual camera at a location in the virtual environment based on the first spawn point (see Commander Gameplay @0:00-1:00; “Commander (Feature)”, pg. 1-5; “Control Point”, pg. 1-4; and “Commander View”, pg. 30-33 of Battlefield Wiki);
controlling the first round of the interactive game based on user inputs received from a plurality of user terminals (see Commander Gameplay 0:09-0:59; “Battlefield 2” of Battlefield Wiki), wherein each terminal corresponds to a different user account and controls a different virtual object in the interactive game (see “Battlefield 2” of Battlefield Wiki, pg. 1-8);
causing a virtual environment picture to be displayed on a first user terminal corresponding to the first virtual object, wherein the virtual environment picture corresponds to a real-time view of the virtual environment based on a then current location of the first virtual object in the virtual environment; and
causing, in response to user input originating from the first user terminal and indicating a picture switching operation, a virtual camera picture to be displayed on the first user terminal, wherein the virtual camera picture corresponds to a view based on the location and a lens direction of the first virtual camera.

Regarding claim 20, Battlefield 2 discloses a data processing system comprising: a processor; and memory storing computer readable instructions which, when executed by the processor, configure the data processing system to perform (see Battlefield 2; Commander Gameplay, which is presented on a data processing computing system that executes the game using a processor and a memory storing instructions for the game, presented on the display):
initiating an interactive virtual environment (see Commander Gameplay @0:00-1:00);
establishing a first spawn point for a first virtual object in a first round of an adversarial interactive game (see “Control Point”, pg. 1-4 of Battlefield Wiki);
granting a right to control a first virtual camera in the interactive virtual environment to the first virtual object during the first round based on the first virtual object having won a previous round of the adversarial interactive game (see “Commander (Feature)”, pg. 1-5 of Battlefield Wiki);
placing the first virtual camera at a location in the virtual environment based on the first spawn point (see Commander Gameplay @0:00-1:00; “Commander (Feature)”, pg. 1-5);
controlling the first round of the interactive game based on user inputs received from a plurality of user terminals (see Commander Gameplay 0:09-0:59; “Battlefield 2” of Battlefield Wiki), wherein each terminal corresponds to a different user account and controls a different virtual object in the interactive game (see “Battlefield 2” of Battlefield Wiki, pg. 1-8);
causing a virtual environment picture to be displayed on a first user terminal corresponding to the first virtual object, wherein the virtual environment picture corresponds to a real-time view of the virtual environment based on a then current location of the first virtual object in the virtual environment (see Commander Gameplay; “Commander View” of Battlefield Wiki, pg. 30-33); and
causing, in response to user input originating from the first user terminal and indicating a picture switching operation (see Commander Gameplay 0:09-0:59; “Commander View”, pg. 30-33), a virtual camera picture to be displayed on the first user terminal, wherein the virtual camera picture corresponds to a view based on the location and a lens direction of the first virtual camera (see Commander Gameplay; “Commander View” of Battlefield Wiki, pg. 30-33).

Regarding claims 2 and 17, Battlefield 2 discloses the method according to claim 1 and the computer readable medium according to claim 16.
Battlefield 2 further discloses wherein, after causing the virtual camera picture to be displayed, the method further comprises:
receiving user input originating from the first terminal requesting a marking operation on a second virtual object within a field of view of the first virtual camera (see Commander Gameplay 0:09-1:30, wherein the user input from the Commander requests “spotting”, “scanning”, and commanding assets, and issues orders, zooms in, and/or changes the camera view);
causing marking information corresponding to the second virtual object to be displayed on a second user terminal corresponding to an account controlling a third virtual object, wherein the third virtual object is on a same team as the first virtual object (see Commander Gameplay 0:09-1:30; Commander (Feature), pg. 1-5; Battlefield 2, pg. 1-5, wherein the Commander gains the ability to mark information on the map, such as to spot, scan, deliver resources, and issue orders to other members of the squad, such as providing assets and resources and ordering specific objectives that are displayed to the other users in the squad).

Regarding claims 3 and 18, Battlefield 2 discloses the method of claim 1 and the computer readable media according to claim 16. Battlefield 2 further discloses wherein the instructions further configure the data processing system to perform:
placing a plurality of virtual cameras based on the first spawn point (see Control Point, pg. 1-4 of Battlefield Wiki), wherein the plurality of virtual cameras comprises the first virtual camera, wherein the granting comprises granting the first virtual object the right to control the plurality of the virtual cameras (see Commander (Feature), pg. 1-5; Commander View, pg. 30-33, wherein the granting of the Commander View is a right to control the plurality of virtual cameras on the map);
wherein the first virtual camera is selected from the plurality of virtual cameras based on a usage status of each of the plurality of virtual cameras (see Commander Gameplay 0:09-1:00; Control Point, pg. 1-4, wherein the home base is a team’s initial spawn point, selected where the Commander role is present (e.g., a usage status of the players); Commander View, pg. 30-33).

Regarding claim 4, Battlefield 2 discloses the method according to claim 1, further comprising: placing a plurality of virtual cameras based on the first spawn point, wherein the plurality of virtual cameras comprises the first virtual camera (see Commander Gameplay 0:09-1:00; Control Point, pg. 1-4, wherein the control points of the virtual cameras are placed upon spawn points, including the first spawn point, and the control points comprise the first virtual camera).

Regarding claim 6, Battlefield 2 discloses the method according to claim 4, further comprising: receiving further user input originating from the first terminal indicating the camera switching operation causing display on the first terminal of a second camera picture captured by a second virtual camera of the plurality of virtual cameras, wherein the second virtual camera and the first virtual camera have different fields of view within the virtual environment (see Commander Gameplay 0:09-1:30, wherein the first virtual camera is a satellite view of the interactive gaming environment and the second virtual camera is a real-time view of the ground level, which are different fields of view).
Regarding claim 7, Battlefield 2 discloses the method according to claim 6, further comprising: determining the second virtual camera from the plurality of virtual cameras according to the one or more selection criteria (see Commander Gameplay 0:09-1:00, wherein the second virtual camera location is placed in a real-time view above the ground in accordance with the spawn points; Control Point, pg. 1-5, wherein the cameras are determined based on control points/flag posts on the map).

Regarding claim 8, Battlefield 2 discloses the method according to claim 4, wherein the one or more selection criteria are based on one or more of: a distance between the first virtual object and the corresponding virtual camera; a distance between a second virtual object and the corresponding virtual camera; and/or a user configured criterion (see Commander Gameplay 1:00-1:30, wherein the second input is a user configured criterion for a zoom-in camera view of the control point in the map).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RYAN HSU, whose telephone number is (571) 272-7148. The examiner can normally be reached Monday - Friday, 10:00-6:00 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Dmitry Suhol, can be reached at (571) 272-4430. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/RYAN HSU/
EXAMINER, Art Unit 3715

Prosecution Timeline

May 21, 2024: Application Filed
Feb 04, 2026: Non-Final Rejection — §101, §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12567302: INDEPENDENTLY RANDOMLY DETERMINED SYMBOL PATTERN SET ASSOCIATED WITH SYMBOL DISPLAY POSITIONS (2y 5m to grant; granted Mar 03, 2026)
Patent 12567304: ELECTRONIC GAMING MACHINE HAVING A TRANSMISSIVE DISPLAY DEVICE AND REELS THAT INCLUDE SYMBOLS WITH FILLABLE SUB-SYMBOLS (2y 5m to grant; granted Mar 03, 2026)
Patent 12539468: AI STREAMER WITH FEEDBACK TO AI STREAMER BASED ON SPECTATORS (2y 5m to grant; granted Feb 03, 2026)
Patent 12542025: MULTIPLE INSTRUMENT SHEET MUSIC EMPLOYED FOR SYMBOL GENERATION AND DISPLAY IN GAMING ENVIRONMENTS (2y 5m to grant; granted Feb 03, 2026)
Patent 12515123: GAME CONTROLLER SYSTEM AND RELATED METHODS (2y 5m to grant; granted Jan 06, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 57%
With Interview: 75% (+18.5%)
Median Time to Grant: 3y 8m
PTA Risk: Low

Based on 613 resolved cases by this examiner. Grant probability derived from career allow rate.
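The headline projections can be reproduced from the raw counts shown in the examiner profile. A minimal sketch, assuming the grant probability is simply granted/resolved and the interview lift is additive in percentage points (both assumptions, inferred from the displayed figures rather than stated by the page):

```python
# Assumed derivation of the headline projections from raw counts.
granted, resolved = 347, 613          # examiner's career totals
base = granted / resolved             # career allow rate, about 0.566
interview_lift = 0.185                # +18.5 pp, assumed additive
with_interview = base + interview_lift

print(f"Grant probability: {base:.0%}")            # matches the displayed 57%
print(f"With interview:    {with_interview:.0%}")  # matches the displayed 75%
```

The additive model reproduces the page exactly (56.6% + 18.5 pp = 75.1%, shown as 75%), but a real estimate would condition on cases that actually held an interview rather than adding a flat offset.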
