Prosecution Insights
Last updated: April 19, 2026
Application No. 18/645,248

VIRTUAL OBJECT CONTROL METHOD AND APPARATUS, TERMINAL, AND STORAGE MEDIUM

Non-Final OA (§101, §102)
Filed
Apr 24, 2024
Examiner
LANEAU, RONALD
Art Unit
3715
Tech Center
3700 — Mechanical Engineering & Manufacturing
Assignee
Tencent Technology (Shenzhen) Company Limited
OA Round
1 (Non-Final)
Grant Probability: 88% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 4m
With Interview: 98%

Examiner Intelligence

Career Allow Rate: 88%, above average (1306 granted / 1483 resolved; +18.1% vs TC avg)
Interview Lift: +9.8% (moderate, roughly +10%), based on resolved cases with an interview
Typical Timeline: 2y 4m average prosecution; 32 applications currently pending
Career History: 1515 total applications across all art units
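The headline figures above are simple ratios over the examiner's career counts. As a quick check, a minimal sketch using the numbers shown on this page (it assumes the +9.8% interview lift is simply additive, which matches the displayed 98%):

```python
# Sketch: reproduce the headline examiner statistics from the raw counts
# shown above. Counts come from this page; rounding is assumed.

granted = 1306   # applications granted by this examiner
resolved = 1483  # total resolved (granted + abandoned)

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")        # ~88.1%, shown as 88%

interview_lift = 0.098  # +9.8% lift reported for cases with an interview
with_interview = allow_rate + interview_lift         # assumed additive
print(f"With interview:    {with_interview:.1%}")    # ~97.9%, shown as 98%
```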

Statute-Specific Performance

§101: 35.2% (-4.8% vs TC avg)
§103: 17.1% (-22.9% vs TC avg)
§102: 16.9% (-23.1% vs TC avg)
§112: 9.4% (-30.6% vs TC avg)
Tech Center averages are estimates. Based on career data from 1483 resolved cases.

Office Action

§101 §102
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 USC § 101 because the claimed invention is directed to non-statutory subject matter.

Subject Matter Eligibility Standard

When considering subject matter eligibility under 35 U.S.C. 101, it must be determined whether the claim is directed to one of the four statutory categories of invention, i.e., process, machine, manufacture, or composition of matter. If the claim does fall within one of the statutory categories, it must then be determined whether the claim is directed to a judicial exception (i.e., law of nature, natural phenomenon, or abstract idea), and if so, it must additionally be determined whether the claim is a patent-eligible application of the exception. If an abstract idea is present in the claim, any element or combination of elements in the claim must be sufficient to ensure that the claim amounts to significantly more than the abstract idea itself. Examples of abstract ideas include fundamental economic practices; certain methods of organizing human activities; an idea itself; and mathematical relationships/formulas. Alice Corporation Pty. Ltd. v. CLS Bank International, et al., 573 U.S. (2014).

Analysis

Based upon consideration of all of the relevant factors with respect to the claim as a whole, claims 1, 9, and 17 are held to claim an abstract idea, and are therefore rejected as ineligible subject matter under 35 U.S.C. 101. The rationale for this finding is explained below. Claims 1, 9, and 17 are rejected under 35 U.S.C.
101 because the claimed invention is directed to an abstract idea without significantly more. The claim recites “determining a second posture to which the virtual object is to be switched from a first posture of the virtual object according to the first posture and the attribute information.”

The limitations of “displaying a user interface (UI), the UI comprising an operation control configured to control a virtual object to switch between different postures in a virtual environment; receiving a touch operation signal corresponding to the operation control by a user of the computing device; obtaining attribute information of the touch operation signal; determining a second posture to which the virtual object is to be switched from a first posture of the virtual object according to the first posture and the attribute information; and switching the virtual object from the first posture to the second posture, wherein the first posture is a current posture of the virtual object in the virtual environment,” as drafted, describe a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components. That is, other than reciting “a processor,” nothing in the claim element precludes the step from practically being performed in the mind. For example, but for the “a processor” language, “displaying, receiving, obtaining, determining, and switching” in the context of this claim encompasses determining a second posture to which the virtual object is to be switched from a first posture of the virtual object according to the first posture and the attribute information.

If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components, then it falls within the “Mental Processes” grouping of abstract ideas. Accordingly, the claim recites an abstract idea.
The additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element of using a processor to perform both the ranking and determining steps amounts to no more than mere instructions to apply the exception using a generic computer component. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept. The claim is not patent eligible.

Claim Rejections - 35 USC § 102

4. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

5. Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Fan Yourui (CN108970112A). Fan Yourui was cited in the IDS received on 07/12/23.

As per claim 1, Yourui discloses a method for controlling a virtual object in a virtual environment performed by a computing device, the method comprising: displaying a user interface (UI), the UI comprising an operation control configured to control a virtual object to switch between different postures in a virtual environment (a selective posture adjustment interface (see fig.
3) and a first object being an object in a virtual scene displayed by a client); receiving a touch operation signal corresponding to the operation control by a user of the computing device (a player clicks an operation button such as “squatting”, “going prone” or “jumping” in a client); obtaining attribute information of the touch operation signal (a player clicks an operation button such as “squatting”, “going prone” or “jumping” in a client); determining a second posture to which the virtual object is to be switched from a first posture of the virtual object according to the first posture and the attribute information (see abstract); and switching the virtual object from the first posture to the second posture, wherein the first posture is a current posture of the virtual object in the virtual environment (a terminal determines a second action in response to a first operation and the second action is used for driving, by a second part of a first object, a first part to adjust a holding posture to a target item).

As per claim 2, Yourui discloses the method according to claim 1, wherein the attribute information comprises at least one of the following: a touch duration, a touch pressure, a quantity of times of touch, and a swipe direction (see fig. 1).
As per claim 3, Yourui discloses the method according to claim 1, wherein the determining a second posture to which the virtual object is to be switched from a first posture of the virtual object according to the first posture and the attribute information, comprises: when the first posture is a standing posture, determining the second posture as a squatting posture when the attribute information is first attribute information, and determining the second posture as a prone posture when the attribute information is second attribute information; when the first posture is a squatting posture, determining the second posture as a standing posture when the attribute information is the first attribute information, and determining the second posture as a prone posture when the attribute information is the second attribute information; and when the first posture is a prone posture, determining the second posture as a squatting posture when the attribute information is the first attribute information, and determining the second posture as a standing posture when the attribute information is the second attribute information (see abstract; a selective posture adjustment interface (see fig. 3) and a first object being an object in a virtual scene displayed by a client).

As per claim 4, Yourui discloses the method according to claim 1, wherein the method further comprises: when the first posture is a running posture, determining that the second posture to which the virtual object is to be switched is a sliding tackle posture (see figs. 1-3).
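The claim 3 limitations quoted above amount to a small posture-transition table. A minimal sketch of that table, for illustration only (the "first"/"second" attribute labels and the function name are hypothetical conveniences, not the applicant's implementation):

```python
# Illustrative sketch of the posture-switching table recited in claim 3.
# "first"/"second" stand for the first and second attribute information of
# the touch operation signal (e.g. a short tap vs. a long press).

TRANSITIONS = {
    ("standing",  "first"):  "squatting",
    ("standing",  "second"): "prone",
    ("squatting", "first"):  "standing",
    ("squatting", "second"): "prone",
    ("prone",     "first"):  "squatting",
    ("prone",     "second"): "standing",
}

def second_posture(first_posture: str, attribute: str) -> str:
    """Determine the target posture from the current (first) posture and
    the attribute information of the touch operation signal."""
    return TRANSITIONS[(first_posture, attribute)]
```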
As per claim 5, Yourui discloses the method according to claim 1, wherein the method further comprises: after receiving the touch operation signal corresponding to the operation control by the user of the computing device, obtaining scene information corresponding to the virtual object, the scene information being used for indicating a virtual scene in which the virtual object is located; and determining the second posture to which the virtual object is to be switched from the first posture according to the first posture, the attribute information, and the scene information (see figs. 1-3).

As per claim 6, Yourui discloses the method according to claim 1, wherein the operation control comprises a posture icon, and the method further comprises: when the virtual object is switched from the first posture to the second posture, controlling the posture icon to switch from a first display style to a second display style (see fig. 1).

As per claim 8, Yourui discloses the method according to claim 1, wherein the switching the virtual object from the first posture to the second posture comprises: displaying, in the UI, by adjusting a three-dimensional model of the virtual object and a viewing angle of a virtual camera in the virtual environment, a process of switching the virtual object from the first posture to the second posture (see figs. 1-3).

As per claim 9, Yourui discloses a computing device, comprising a processor, a memory that is communicatively connected to the processor via a communication bus, and a plurality of computer programs stored in the memory that, when executed by the processor, cause the computing device to perform a plurality of operations including: displaying a user interface (UI), the UI comprising an operation control configured to control a virtual object to switch between different postures in a virtual environment (a selective posture adjustment interface (see fig.
3) and a first object being an object in a virtual scene displayed by a client); receiving a touch operation signal corresponding to the operation control by a user of the computing device (a player clicks an operation button such as “squatting”, “going prone” or “jumping” in a client); obtaining attribute information of the touch operation signal; determining a second posture to which the virtual object is to be switched from a first posture of the virtual object according to the first posture and the attribute information (a player clicks an operation button such as “squatting”, “going prone” or “jumping” in a client); and switching the virtual object from the first posture to the second posture, wherein the first posture is a current posture of the virtual object in the virtual environment (a terminal determines a second action in response to a first operation and the second action is used for driving, by a second part of a first object, a first part to adjust a holding posture to a target item).

As per claim 10, Yourui discloses the computing device according to claim 9, wherein the attribute information comprises at least one of the following: a touch duration, a touch pressure, a quantity of times of touch, and a swipe direction (see fig. 1).
As per claim 11, Yourui discloses the computing device according to claim 9, wherein the determining a second posture to which the virtual object is to be switched from a first posture of the virtual object according to the first posture and the attribute information, comprises: when the first posture is a standing posture, determining the second posture as a squatting posture when the attribute information is first attribute information, and determining the second posture as a prone posture when the attribute information is second attribute information; when the first posture is a squatting posture, determining the second posture as a standing posture when the attribute information is the first attribute information, and determining the second posture as a prone posture when the attribute information is the second attribute information; and when the first posture is a prone posture, determining the second posture as a squatting posture when the attribute information is the first attribute information, and determining the second posture as a standing posture when the attribute information is the second attribute information (see abstract; a selective posture adjustment interface (see fig. 3) and a first object being an object in a virtual scene displayed by a client).

As per claim 12, Yourui discloses the computing device according to claim 9, wherein the method further comprises: when the first posture is a running posture, determining that the second posture to which the virtual object is to be switched is a sliding tackle posture (see figs. 1-3).
As per claim 13, Yourui discloses the computing device according to claim 9, wherein the method further comprises: after receiving the touch operation signal corresponding to the operation control by the user of the computing device, obtaining scene information corresponding to the virtual object, the scene information being used for indicating a virtual scene in which the virtual object is located; and determining the second posture to which the virtual object is to be switched from the first posture according to the first posture, the attribute information, and the scene information (see figs. 1-3).

As per claim 14, Yourui discloses the computing device according to claim 9, wherein the operation control comprises a posture icon, and the method further comprises: when the virtual object is switched from the first posture to the second posture, controlling the posture icon to switch from a first display style to a second display style (see fig. 1).

As per claim 16, Yourui discloses the computing device according to claim 9, wherein the switching the virtual object from the first posture to the second posture comprises: displaying, in the UI, by adjusting a three-dimensional model of the virtual object and a viewing angle of a virtual camera in the virtual environment, a process of switching the virtual object from the first posture to the second posture (see figs. 1-3).

As per claim 17, Yourui discloses a non-transitory computer-readable storage medium storing a plurality of computer programs, the computer programs, when executed by a processor of a computing device, being configured to perform a plurality of operations including: displaying a user interface (UI), the UI comprising an operation control configured to control a virtual object to switch between different postures in a virtual environment (a selective posture adjustment interface (see fig.
3) and a first object being an object in a virtual scene displayed by a client); receiving a touch operation signal corresponding to the operation control by a user of the computing device; obtaining attribute information of the touch operation signal (a player clicks an operation button such as “squatting”, “going prone” or “jumping” in a client); determining a second posture to which the virtual object is to be switched from a first posture of the virtual object according to the first posture and the attribute information (a player clicks an operation button such as “squatting”, “going prone” or “jumping” in a client); and switching the virtual object from the first posture to the second posture, wherein the first posture is a current posture of the virtual object in the virtual environment (a terminal determines a second action in response to a first operation and the second action is used for driving, by a second part of a first object, a first part to adjust a holding posture to a target item).

As per claim 18, Yourui discloses the non-transitory computer-readable storage medium according to claim 17, wherein the attribute information comprises at least one of the following: a touch duration, a touch pressure, a quantity of times of touch, and a swipe direction (see fig. 1).

As per claim 19, Yourui discloses the non-transitory computer-readable storage medium according to claim 17, wherein the method further comprises: after receiving the touch operation signal corresponding to the operation control by the user of the computing device, obtaining scene information corresponding to the virtual object, the scene information being used for indicating a virtual scene in which the virtual object is located; and determining the second posture to which the virtual object is to be switched from the first posture according to the first posture, the attribute information, and the scene information (see figs. 1-3).
As per claim 20, Yourui discloses the non-transitory computer-readable storage medium according to claim 17, wherein the switching the virtual object from the first posture to the second posture comprises: displaying, in the UI, by adjusting a three-dimensional model of the virtual object and a viewing angle of a virtual camera in the virtual environment, a process of switching the virtual object from the first posture to the second posture (see figs. 1-3).

No references have been found for the following claims:

As per claim 7, the method according to claim 1, wherein the method further comprises: after determining the second posture to which the virtual object is to be switched from the first posture of the virtual object according to the first posture and the attribute information: detecting, according to position information of the virtual object, whether the virtual object satisfies a condition for switching to the second posture; when the virtual object satisfies the condition for switching to the second posture, performing an operation of switching the virtual object from the first posture to the second posture; and when the virtual object does not satisfy the condition for switching to the second posture, controlling the virtual object to maintain the first posture.
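Claim 7 above (one of the claims for which the examiner found no reference) gates the posture switch on a position-dependent condition. A minimal sketch of that control flow, with a wholly hypothetical clearance check standing in for the unspecified condition:

```python
# Sketch of the condition-gated switch in claim 7. The eligibility test is
# position-dependent; the predicate below (vertical clearance) is a
# hypothetical stand-in, not the applicant's actual condition.

def satisfies_condition(position: dict, second_posture: str) -> bool:
    # Hypothetical check: each posture needs some vertical clearance.
    clearance_needed = {"standing": 1.8, "squatting": 1.2, "prone": 0.5}
    return position["clearance"] >= clearance_needed[second_posture]

def try_switch(obj: dict, second_posture: str) -> str:
    """Switch to second_posture only if the object's position information
    allows it; otherwise maintain the first (current) posture."""
    if satisfies_condition(obj["position"], second_posture):
        obj["posture"] = second_posture
    return obj["posture"]
```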
As per claim 15, the computing device according to claim 9, wherein the method further comprises: after determining the second posture to which the virtual object is to be switched from the first posture of the virtual object according to the first posture and the attribute information: detecting, according to position information of the virtual object, whether the virtual object satisfies a condition for switching to the second posture; when the virtual object satisfies the condition for switching to the second posture, performing an operation of switching the virtual object from the first posture to the second posture; and when the virtual object does not satisfy the condition for switching to the second posture, controlling the virtual object to maintain the first posture.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See references cited on PTO form 892.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RONALD LANEAU whose telephone number is (571) 272-6784. The examiner can normally be reached on Mon-Thu 6-4:30 ET. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, David L. Lewis, can be reached at (571) 272-7673. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR.
Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Ronald Laneau/
Primary Examiner, Art Unit 3715

Prosecution Timeline

Apr 24, 2024
Application Filed
Mar 02, 2026
Non-Final Rejection — §101, §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597319: GAMING DEVICE WITH PERSISTENCE CYCLING
Granted Apr 07, 2026 (2y 5m to grant)

Patent 12586444: COMPUTER-IMPLEMENTED SYSTEMS AND METHODS FOR IMPLEMENTING MATRIX-BASED ONLINE GAMING
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12586437: CONTROLLING POWER CONSUMPTION IN GRAPHICS COMPONENTS OF GAMING DEVICES
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12586443: MODIFYING PROGRESSIVE AWARD PARAMETERS
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12586438: LIGHTED GAMING TABLE
Granted Mar 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 88% (98% with interview, +9.8%)
Median Time to Grant: 2y 4m
PTA Risk: Low

Based on 1483 resolved cases by this examiner. Grant probability is derived from the career allow rate.
