Prosecution Insights
Last updated: April 19, 2026
Application No. 18/734,934

PROCESSING INFORMATION FOR VIRTUAL ENVIRONMENT

Non-Final OA §102
Filed: Jun 05, 2024
Examiner: LEICHLITER, CHASE E
Art Unit: 3715
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Tencent Technology (Shenzhen) Company Limited
OA Round: 1 (Non-Final)

Grant Probability: 64% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 3y 4m
With Interview: 88%

Examiner Intelligence

Grants 64% of resolved cases.

Career Allow Rate: 64% (428 granted / 666 resolved; -5.7% vs TC avg)
Interview Lift: +24.0% (strong; allow rate among resolved cases with an interview)
Avg Prosecution: 3y 4m typical timeline; 38 currently pending
Total Applications: 704 across all art units
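The headline figures in this panel reduce to simple arithmetic over the examiner's resolved docket. A minimal sketch of that derivation, using only the counts shown above (no real data source is queried; the displayed 64%/88% are rounded):

```python
# Career allow rate: granted / resolved, using the counts shown above.
granted, resolved = 428, 666
allow_rate = granted / resolved               # ~0.643, displayed as 64%

# Interview-adjusted probability: baseline rate plus the +24.0% lift.
interview_lift = 0.24
with_interview = allow_rate + interview_lift  # ~0.883, displayed as 88%

print(f"Career allow rate: {allow_rate:.0%}")     # 64%
print(f"With interview:    {with_interview:.0%}") # 88%
```

This also explains the "With Interview: 88%" card elsewhere on the page: it is simply the 64% career rate plus the +24.0% lift, rounded.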

Statute-Specific Performance

§101: 24.6% (-15.4% vs TC avg)
§103: 26.2% (-13.8% vs TC avg)
§102: 27.5% (-12.5% vs TC avg)
§112: 12.7% (-27.3% vs TC avg)

Deltas are measured against a Tech Center average estimate. Based on career data from 666 resolved cases.
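Each delta above is the examiner's statute-specific rate minus the Tech Center average estimate, so the TC average can be recovered from the pairs shown. A quick check (values copied directly from the figures above) shows all four deltas are consistent with a single ~40% TC average estimate:

```python
# (examiner rate %, delta vs TC avg %) per statute, from the figures above.
stats = {
    "§101": (24.6, -15.4),
    "§103": (26.2, -13.8),
    "§102": (27.5, -12.5),
    "§112": (12.7, -27.3),
}

for statute, (rate, delta) in stats.items():
    # delta = rate - tc_avg, hence tc_avg = rate - delta
    tc_avg = rate - delta
    print(f"{statute}: examiner {rate:.1f}%, implied TC avg {tc_avg:.1f}%")
```

Every statute implies the same 40.0% baseline, suggesting the tool applies one Tech Center-wide estimate rather than per-statute averages.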

Office Action

§102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim(s) 1-20 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by ZHI (CN111803941) (henceforth, “ZHI”).

Regarding claims 1, 13, and 20, ZHI teaches a method and apparatus for processing information, comprising processing circuitry configured to: generating a virtual environment image for a virtual environment that includes a virtual character carrying out an action in the virtual environment (Para. 4 and Para. 8 of English translation); determining a perspective angle for displaying a heads-up display (HUD) control with the virtual environment image based on a character state of the virtual character and a region of the virtual environment image for the HUD control, the HUD control providing user interface (UI) information for controlling the virtual character, the perspective angle being an included angle between the HUD control and an imaging plane of the virtual environment image (Para. 8 and Figs. 2, 3a-b, and 5); and displaying the virtual environment image with the HUD control in the region of the virtual environment image according to the perspective angle (Para. 8 and Figs. 2, 3a-b, and 5).
Regarding claims 2 and 14, ZHI teaches changing the perspective angle of the HUD control from a first perspective angle to a second perspective angle when the character state of the virtual character changes from a first state to a second state, the first perspective angle corresponding to the first state, and the second perspective angle corresponding to the second state (e.g., starting state and second position in Para. 51).

Regarding claims 3 and 15, ZHI teaches the first state is a walking state, and the second state is a running state, the changing the perspective angle comprises: changing the perspective angle of the HUD control from the first perspective angle to the second perspective angle when the character state of the virtual character changes from the walking state to the running state, the second perspective angle being greater than the first perspective angle (e.g., movement speed in Para. 8, 10, and 41 and running in Para. 55).

Regarding claims 4 and 16, ZHI teaches the determining the perspective angle comprises: changing the perspective angle of the HUD control when an attribute parameter of the character state of the virtual character changes (e.g., direction and speed in Para. 45-53).

Regarding claims 5 and 17, ZHI teaches the changing the perspective angle of the HUD control comprises: increasing the perspective angle of the HUD control when a speed or an acceleration of the virtual character increases in the virtual environment (Para. 41 and Para. 45-53).

Regarding claims 6 and 18, ZHI teaches the changing the perspective angle comprises: decreasing the perspective angle of the HUD control when a speed or an acceleration of the virtual character decreases in the virtual environment (e.g., decrease in Para. 64-65).
Regarding claims 7 and 19, ZHI teaches varying, when the character state of the virtual character is a third state, a plurality of perspective angles of the HUD control for respectively displaying in a plurality of virtual environment images dynamically in a range (Para. 67).

Regarding claim 8, ZHI teaches the third state is an attacked state, and the method further comprises: displaying, when the character state of the virtual character in the virtual environment is the attacked state, the plurality of virtual environment images with the plurality of perspective angles of the HUD control changing in a periodically shaking manner (Para. 67).

Regarding claim 9, ZHI teaches the determining the perspective angle further comprises: determining a first perspective angle for displaying a first HUD control with the virtual environment image based on the character state of the virtual character and a first region of the virtual environment image for the first HUD control; and determining a second perspective angle for displaying a second HUD control with the virtual environment image based on the character state of the virtual character and a second region of the virtual environment image for the second HUD control (e.g., multiple display icons in Para. 59 and HUD UIs in Para. 34).

Regarding claim 10, ZHI teaches the HUD control comprises an operation control, and the method further comprises: determining, in response to a trigger operation of the operation control, the perspective angle for displaying the HUD control with the virtual environment image according to the operation control (e.g., button movement in Para. 66).
Regarding claim 11, ZHI teaches the operation control comprises a first type of operation control, and the determining the perspective angle further comprises: changing, in response to the trigger operation of the first type of operation control, the perspective angle for displaying the HUD control to a corresponding perspective angle of the first type of operation control (Para. 66).

Regarding claim 12, ZHI teaches the operation control comprises a second type of operation control, and the determining the perspective angle further comprises: varying corresponding perspective angles of the HUD control for displaying with a plurality of virtual environment images dynamically in a range in response to a trigger operation of the second type of operation control (e.g., multiple display icons in Para. 59 and HUD UIs in Para. 34 and button movements in Para. 66).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant’s disclosure and is listed on the attached Notice of References Cited.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHASE E LEICHLITER whose telephone number is (571)270-7109. The examiner can normally be reached Monday-Friday (9-5). Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, David Lewis, can be reached at (571)272-7673. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users.
To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CHASE E LEICHLITER/
Primary Examiner, Art Unit 3715

Prosecution Timeline

Jun 05, 2024
Application Filed
Feb 07, 2026
Non-Final Rejection — §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597313: WAGERING ON EVENTS IN A STREAMING ENVIRONMENT
Granted Apr 07, 2026 (2y 5m to grant)

Patent 12592119: MESSAGE DRIVEN GAMING SYSTEMS AND PROCESSES
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12589299: GRAPHICS RENDERING APPARATUS AND METHOD
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12582905: INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12551784: TACTILE OVERLAY FOR TOUCH SCREEN VIRTUAL GAME CONTROLLER COUPLED TO EXTERNAL DISPLAY
Granted Feb 17, 2026 (2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 64%
With Interview: 88% (+24.0%)
Median Time to Grant: 3y 4m
PTA Risk: Low

Based on 666 resolved cases by this examiner. Grant probability derived from career allow rate.
