DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Zhi (CN 111803941) (hereinafter "ZHI").
Regarding claims 1, 13, and 20, ZHI teaches a method and an apparatus for processing information, the apparatus comprising processing circuitry configured to perform:
generating a virtual environment image for a virtual environment that includes a virtual character carrying out an action in the virtual environment (Para. 4 and Para. 8 of English translation);
determining a perspective angle for displaying a heads-up display (HUD) control with the virtual environment image based on a character state of the virtual character and a region of the virtual environment image for the HUD control, the HUD control providing user interface (UI) information for controlling the virtual character, the perspective angle being an included angle between the HUD control and an imaging plane of the virtual environment image (Para. 8 and Figs. 2, 3a-b, and 5); and
displaying the virtual environment image with the HUD control in the region of the virtual environment image according to the perspective angle (Para. 8 and Figs. 2, 3a-b, and 5).
Regarding claims 2 and 14, ZHI teaches changing the perspective angle of the HUD control from a first perspective angle to a second perspective angle when the character state of the virtual character changes from a first state to a second state, the first perspective angle corresponding to the first state, and the second perspective angle corresponding to the second state (e.g., starting state and second position in Para. 51).
Regarding claims 3 and 15, ZHI teaches that the first state is a walking state and the second state is a running state, and that the changing the perspective angle comprises: changing the perspective angle of the HUD control from the first perspective angle to the second perspective angle when the character state of the virtual character changes from the walking state to the running state, the second perspective angle being greater than the first perspective angle (e.g., movement speed in Para. 8, 10, and 41 and running in Para. 55).
Regarding claims 4 and 16, ZHI teaches that the determining the perspective angle comprises: changing the perspective angle of the HUD control when an attribute parameter of the character state of the virtual character changes (e.g., direction and speed in Para. 45-53).
Regarding claims 5 and 17, ZHI teaches that the changing the perspective angle of the HUD control comprises: increasing the perspective angle of the HUD control when a speed or an acceleration of the virtual character increases in the virtual environment (Para. 41 and Para. 45-53).
Regarding claims 6 and 18, ZHI teaches that the changing the perspective angle comprises: decreasing the perspective angle of the HUD control when a speed or an acceleration of the virtual character decreases in the virtual environment (e.g., decrease in Para. 64-65).
Regarding claims 7 and 19, ZHI teaches varying, when the character state of the virtual character is a third state, a plurality of perspective angles of the HUD control for respectively displaying in a plurality of virtual environment images dynamically in a range (Para. 67).
Regarding claim 8, ZHI teaches that the third state is an attacked state, and that the method further comprises: displaying, when the character state of the virtual character in the virtual environment is the attacked state, the plurality of virtual environment images with the plurality of perspective angles of the HUD control changing in a periodically shaking manner (Para. 67).
Regarding claim 9, ZHI teaches that the determining the perspective angle further comprises: determining a first perspective angle for displaying a first HUD control with the virtual environment image based on the character state of the virtual character and a first region of the virtual environment image for the first HUD control; and determining a second perspective angle for displaying a second HUD control with the virtual environment image based on the character state of the virtual character and a second region of the virtual environment image for the second HUD control (e.g., multiple display icons in Para. 59 and HUD UIs in Para. 34).
Regarding claim 10, ZHI teaches that the HUD control comprises an operation control, and that the method further comprises: determining, in response to a trigger operation of the operation control, the perspective angle for displaying the HUD control with the virtual environment image according to the operation control (e.g., button movement in Para. 66).
Regarding claim 11, ZHI teaches that the operation control comprises a first type of operation control, and that the determining the perspective angle further comprises: changing, in response to the trigger operation of the first type of operation control, the perspective angle for displaying the HUD control to a corresponding perspective angle of the first type of operation control (Para. 66).
Regarding claim 12, ZHI teaches that the operation control comprises a second type of operation control, and that the determining the perspective angle further comprises: varying corresponding perspective angles of the HUD control for displaying with a plurality of virtual environment images dynamically in a range in response to a trigger operation of the second type of operation control (e.g., multiple display icons in Para. 59, HUD UIs in Para. 34, and button movements in Para. 66).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant’s disclosure and is listed on the attached Notice of References Cited.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHASE E LEICHLITER whose telephone number is (571)270-7109. The examiner can normally be reached Monday-Friday (9-5).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, David Lewis, can be reached at (571) 272-7673. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in the USA or Canada) or 571-272-1000.
/CHASE E LEICHLITER/Primary Examiner, Art Unit 3715