DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-14 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter.
Step 1:
I. The claims are drawn to the process, apparatus, and computer device categories.
II. Thus, initially, under Step 1 of the analysis, it is noted that the claims are directed towards eligible categories of subject matter.
Step 2a:
III. Prong 1: Does the claim recite an abstract idea, law of nature, or natural phenomenon?
Representative claim 1 is analyzed below, with the "determining" limitations constituting recitations of an abstract idea.
A method of controlling a virtual object, comprising: determining pose information of a first virtual object in response to the first virtual object being in a target motion state; determining location information of a following anchor point based on the pose information, wherein the following anchor point is used to represent a motion end of a second virtual object associated with the first virtual object under the pose information; and controlling the second virtual object to move based on the location information of the following anchor point.
The identified limitations fall within at least one of the groupings of abstract ideas enumerated in the 2019 PEG:
Mental processes
The steps of determining pose information of the first virtual object, determining location information of a following anchor point based on that pose information, and determining how the second virtual object should move based on that location information are, under their broadest reasonable interpretation, observations, evaluations, and judgments that can practically be performed in the human mind or with pen and paper. These limitations are therefore viewed by the Examiner as reciting a mental process.
Prong 2: Does the claim recite additional elements that integrate the exception into a practical application of the exception?
iii. Although the claims recite additional limitations, such as one or more processors and at least one server, these additional limitations do not integrate the exception into a practical application of the exception. For example, the claims require only generic computing components such as a processor, display, and memory.
iv. These additional limitations do not represent an improvement to the functioning of a computer or to any other technology or technical field (MPEP 2106.05(a)). Nor do they apply the exception using a particular machine (MPEP 2106.05(b)). Furthermore, they do not effect a transformation or reduction of a particular article to a different state or thing (MPEP 2106.05(c)). Rather, these additional limitations amount to an instruction to “apply” the judicial exception using a computer as a tool to perform the abstract idea.
Step 2b:
Under Step 2B, the claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because they amount to conventional and routine computer implementation and mere instructions for implementing the abstract idea on generic computing devices.
For example, the additional elements, whether viewed individually or as a whole, are indistinguishable from conventional computing elements known in the art. Therefore, the claims fail to supply additional elements that yield significantly more than the underlying abstract idea. Viewing the limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology.
For these reasons, the claims are not patent-eligible under 35 U.S.C. § 101.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-14 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Bouazizi et al. (U.S. 2022/0335694).
Regarding claim 1, Bouazizi discloses a method of controlling a virtual object, (“Controller device 254 may be a game controller device including one or more buttons, track pads, or the like for receiving user input”, par. 0117), comprising determining pose information of a first virtual object in response to the first virtual object being in a target motion state, (“Feedback number accessor that describes an uplink stream to which pose information is to be sent”, par. 0136), determining location information of a following anchor point based on the pose information, (“the scene description including data describing a scene anchor and one or more virtual objects in a virtual scene”, par. 0007), wherein the following anchor point is used to represent a motion end of a second virtual object associated with the first virtual object under the pose information, and controlling the second virtual object to move based on the location information of the following anchor point, (“The anchor point may further include various actions that a user may perform, such as movements received via a controller or real-world repositioning of a device worn by the user. The anchor point may also include object identifiers that are shared between a presentation application and the server. Such objects may be used as anchor points. In general, the scene description may include data that relates the scene anchor point to a real-world anchor point, such as a particular location”, par. 0157).
Regarding claim 2, Bouazizi discloses wherein determining pose information of a first virtual object comprises: determining the pose information of the first virtual object in a game map, (“When a user's pose changes, as detected by camera 308 and sensors 310, client device 300 may send pose information updates”, par. 0153).
Regarding claims 3 - 7, Bouazizi discloses wherein determining location information of a following anchor point based on the pose information comprises: determining a following distance corresponding to the first virtual object and/or the second virtual object; determining a following angle based on the pose information; and determining the location information of the following anchor point based on location information in the pose information, the following distance, and the following angle, (“In general, the scene description may include data that relates the scene anchor point to a real-world anchor point, such as a particular location”, par. 0157).
Regarding claim 8, Bouazizi discloses determining location change information of the following anchor point; and updating the second motion information based on the location change information, and controlling the second virtual object to move based on updated second motion information, (“When a user's pose changes, as detected by camera 308 and sensors 310, client device 300 may send pose information updates”, par. 0153).
Regarding claim 9, Bouazizi discloses displaying a preset media resource corresponding to the second virtual object, in a case where the second virtual object is detected being in a target region corresponding to the first virtual object, (fig. 9).
Regarding claim 10, Bouazizi discloses determining a combat target corresponding to the second virtual object in response to the first virtual object being in the target combat state; and controlling the second virtual object to perform a target combat action on the combat target based on location information corresponding to the combat target, (fig. 9).
Regarding claim 11, Bouazizi discloses an apparatus of controlling a virtual object, comprising: a first determination module, (“Controller device 254 may be a game controller device including one or more buttons, track pads, or the like for receiving user input”, par. 0117), configured to determine pose information of a first virtual object in response to the first virtual object being in a target motion state, (“Feedback number accessor that describes an uplink stream to which pose information is to be sent”, par. 0136), determining location information of a following anchor point based on the pose information, (“the scene description including data describing a scene anchor and one or more virtual objects in a virtual scene”, par. 0007), wherein the following anchor point is used to represent a motion end of a second virtual object associated with the first virtual object under the pose information, and controlling the second virtual object to move based on the location information of the following anchor point, (“The anchor point may further include various actions that a user may perform, such as movements received via a controller or real-world repositioning of a device worn by the user. The anchor point may also include object identifiers that are shared between a presentation application and the server. Such objects may be used as anchor points. In general, the scene description may include data that relates the scene anchor point to a real-world anchor point, such as a particular location”, par. 0157).
Regarding claims 12 - 14, Bouazizi discloses a computer device, comprising a processor, a memory, and a bus, wherein the memory stores machine-readable instructions executable by the processor, when the computer device is running, the processor and the memory communicate through the bus, and the machine-readable instructions, when executed by the processor, execute steps of the method of controlling a virtual object, (fig. 1).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ERIC M THOMAS whose telephone number is (571) 272-1699. The examiner can normally be reached 9:00 am - 5:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, David Lewis, can be reached at 571-272-7673. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/E.M.T/Examiner, Art Unit 3715
/DAVID L LEWIS/Supervisory Patent Examiner, Art Unit 3715