Prosecution Insights
Last updated: April 19, 2026
Application No. 18/318,963

VIRTUAL SPACE PROVISION SYSTEM, VIRTUAL SPACE PROVISION METHOD, AND VIRTUAL SPACE PROVISION PROGRAM

Final Rejection — §102, §103
Filed: May 17, 2023
Examiner: MCCULLOCH JR, WILLIAM H
Art Unit: 3715
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Asics Corporation
OA Round: 2 (Final)
Grant Probability: 54% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 3y 5m
With Interview: 87%

Examiner Intelligence

Career Allow Rate: 54% (330 granted / 614 resolved; -16.3% vs TC avg)
Interview Lift: +33.3% among resolved cases with interview (strong)
Typical Timeline: 3y 5m avg prosecution; 32 applications currently pending
Career History: 646 total applications across all art units

Statute-Specific Performance

§101: 22.6% (-17.4% vs TC avg)
§103: 27.7% (-12.3% vs TC avg)
§102: 21.3% (-18.7% vs TC avg)
§112: 15.8% (-24.2% vs TC avg)
Tech Center averages are estimates. Based on career data from 614 resolved cases.

Office Action

§102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-4, 9, 11, and 12 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by US 2003/0078138 to Toyama (hereinafter Toyama).

Regarding claims 1, 11, and 12, Toyama teaches a virtual space provision system, method, and computer readable medium that provides a virtual space in which a character (e.g., a roll-playing [sic: role-playing] character 102 in ¶ 44) moves along a course in the virtual space (e.g., roads around a town in a virtual world or virtual space in ¶ 44 and Fig. 5) based on movement of a user (e.g., user running on a running machine 1), the system comprising: a memory that stores movement course information including information on scenery of the course in which the character moves (e.g., ROM 52 stores various objects such as a running road, buildings, and natural objects forming a virtual space in which the role-playing character 102 moves; see at least ¶ 44); and a processor (e.g., CPU 4 in ¶ 45) configured to function as: an acquisition unit that acquires biometric information indicating a biological state of the user, and acquires exercise amount information indicating an amount of exercise by the user (e.g., speed calculating device 402 calculates the turning speed of the running belt 14 (virtually running speed of the user) based on the cycle of the rotation signal from the rotation sensor 19 in ¶ 50); a computation unit that computes movement quality information on the basis of the biometric and the exercise amount information (e.g., the position calculating device 403 calculates a position from a starting point (running distance), i.e. a current position based on the calculated speed in ¶ 50) and quality of movement of the character on the basis of a difficulty of the course in the virtual space in which the character moves (e.g., the course is determined based on the “degree of game difficulty,” in a game such as a “Survival Running” course set to “Moderate Hell” in ¶ 56 and Fig. 10); and a course controller that controls a situation of the course in which the character moves through by changing a viewpoint in the virtual space on the basis of the movement quality information and the quality of movement of the character (e.g., calculation of a viewpoint position (in this embodiment, changes in the height and the direction of the camera as described later), the calculation in the 3D space for the viewpoint, the calculation to convert a position in the 3D space to a position in a simulated 3D space, the calculation of a light source, writing of an image data to be formed in the video RAM based on the above calculation results in ¶ 44; see also ¶ 57 describing the virtual camera moving according to user running).

Regarding claim 2, Toyama teaches wherein the computation unit computes the movement quality information that evaluates an exercise state of the user and a movement state of the character, and the course controller controls a status of the course on the basis of information of the exercise state of the user and the movement state of the character (e.g., The viewpoint of the virtual camera is controllably moved along the running road at a speed corresponding to the speed information obtained by the speed calculating device 402 based on the rotation signal from the rotation sensor 19 and the position of this viewpoint is relatively moved in accordance with the position information obtained by the position calculating device 403, i.e. the scenery image is relatively moved in a direction opposite from the running direction of the roll-playing character 102. In this way, the running movement is expressed in ¶ 57).
Regarding claim 3, Toyama teaches wherein the course controller changes the viewpoint in the virtual space (see above with respect to ¶¶ 44 and 57) and a light source setting in a predetermined output device that outputs the course and the character on the basis of the movement quality information (e.g., the image processor 311 mainly performs the calculation of a viewpoint position (in this embodiment, changes in the height and the direction of the camera as described later), the calculation in the 3D space for the viewpoint, the calculation to convert a position in the 3D space to a position in a simulated 3D space, the calculation of a light source, writing of an image data to be formed in the video RAM based on the above calculation results in ¶ 44).

Regarding claim 4, Toyama teaches wherein the course controller changes the viewpoint in the virtual space and controls a sound effect output by the output device on the basis of the movement quality information (e.g., image processor 311 performs the calculation of the viewpoint, the calculation of the positions of the characters in the 3D space (of course, the same applies to a two-dimensional space) in relation to the viewpoint, the calculation of the light source, the generation of the sound data in ¶ 45).

Regarding claim 9, Toyama teaches wherein the course controller causes the output device to output a direction of movement to the user on the basis of the course moved by the character in the virtual space (e.g., the scenery image is relatively moved in a direction opposite from the running direction of the roll-playing character 102…In this way, the running movement is expressed; see ¶ 57).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Toyama in view of US 2022/0276710 to Yokohama et al. (hereinafter Yokohama).

Regarding claim 5, Toyama teaches the invention substantially as described above, but lacks in explicitly teaching shoes that are worn by the user and provide predetermined stimulation to feet of the user, wherein the course controller controls a type, size, and time point of stimulation applied to the feet of the user on the basis of the movement quality information and the movement course information. In a related disclosure, Yokohama teaches methods for a tactile-sense presentation device that outputs signals in a time-series feature being visually expressed (abstract). More particularly, Yokohama describes a tactile-sense presentation device 100 is provided as a wearable type, such as shoes (¶ 49), and that the tactile feedback may correspond to video content (¶ 70). Yokohama further teaches that body-part objects, such as feet O3, may experience a particular tactile effect, such as vibration effect object O7 (see at least Fig. 4A and ¶ 120).
Furthermore, as illustrated in an M3 portion in the figures for example, the effect object O7 indicating a temporal transition in tactile sensation is dragged-and-dropped into a section of the “tactile-sense effect” field and the waveform indicating the temporal transition is changed, so that, for example, a variation in strength in tactile sensation can also be designated freely (¶ 127). It would have been obvious to one of ordinary skill in the art before the effective date to modify the system of Toyama to include the time-series tactile output to shoes that follows a video presentation and allows for variations in strength and type, as taught or suggested by Yokohama in order to increase the realism experienced by the user of the exercise device.

Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Toyama in view of US 2017/0361133 to Yu et al. (hereinafter Yu).

Regarding claim 6, Toyama teaches the invention substantially as described above but lacks in explicitly teaching wherein the output device is an aroma diffuser that generates a plurality of types of scents, and the course controller selects a predetermined scent from the plurality of scents on the basis of the movement quality information and generates it from the aroma diffuser. In a related disclosure, Yu teaches a personal entertainment respiratory apparatus that provides air to a user to provide fully immersive entertainment by using a selectively activated release of a sensory particle from a dispenser into the air flow in response to an entertainment triggering signal (abstract). More specifically, the system of Yu may be implemented to adapt the air provided to the user to manipulate the user’s experience in conjunction with a form of entertainment such as a game (¶ 72) or video game (¶ 169).
The system may employ scent cartridges that may be triggered in synchrony with an entertainment scenario, such as scents that resemble the ocean when ocean waves are part of the scenario (¶ 167). It would have been obvious to one of ordinary skill in the art before the effective date to modify the system of Toyama to include the aroma diffusion features of Yu in order to increase the realism experienced by the user of the exercise device.

Claims 7, 8, and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Toyama in view of US 2022/0207119 to Andon et al. (hereinafter Andon).

Regarding claims 7, 8, and 10, Toyama teaches the invention substantially as described above, but lacks in explicitly teaching a purchase unit that allows for a purchase of a non-substitutable token provable accessory to the character. In a related disclosure, Andon teaches video game integration of cryptographically secured digital assets on a distributed blockchain ledger, wherein at least one of the stored object attributes is modified according to an aspect of a digital software application or interaction between a character avatar and the virtual object (abstract). Andon further teaches that the system may generate a smart contract that authenticates ownership of and/or tracks future transaction of the cryptographic digital asset, and that the unique owner ID code may be linked with a cryptocurrency wallet that registered with the distributed blockchain ledger (¶ 19). As shown in Andon Fig. 10, the stored virtual object may be displayed in a video game application. In addition, Andon teaches that acquisition of the digital object may be triggered by performance in a game (e.g., points scored in a game) in ¶ 94 or by purchasing the digital object in at least ¶ 98. Furthermore, the digital object may be modified by achievement of particular levels, rankings, victories, etc. (¶ 99), as well as being sellable between players (above), which reads on the storing of historical information that the character of the user and the accessory associated with the character have been output in another virtual space where another user that is different from the user is capable of moving. It would have been obvious to one of ordinary skill in the art before the effective date to modify the system of Toyama to include the claimed features relating to purchase of a non-substitutable token provable accessory to a game character, as taught or suggested by Andon, in order to allow sports and game enthusiasts access to genuine collectables, as is beneficially taught by Andon (¶¶ 3-6).

Response to Arguments

Applicant's arguments filed 9/29/2025 have been fully considered but they are not persuasive. As an initial matter, the previous grounds of rejection under 35 U.S.C. § 112(b) have been withdrawn in light of the instant claim amendments.

Applicant addresses the rejection of claims as anticipated or obvious over Toyama on pages 6-8 of the Remarks section. More particularly, Applicant contends that Toyama does not teach that the quality of movement of the character is on the basis of the difficulty of the course in the virtual space in which the character moves. Remarks, 7. Applicant asserts that Toyama does not teach or suggest any difficulty of the course in the virtual space and allegedly only teaches various aspects of the virtual world such as a town having many houses. Id.

The Examiner respectfully disagrees. As discussed in the grounds of rejection above, Toyama teaches that the course on which the game character runs is determined based on the “degree of game difficulty.” Toyama ¶ 56. Specifically, the human user may select a game such as a “Survival Running” course, and may set the difficulty to “Moderate Hell.” Id. at ¶ 56 and Fig. 10.
Toyama additionally explains various types of exercise courses that may be chosen by the user, such as “Island Marathon” in which the user virtually runs a relatively long distance together with a trainer, a “Manual Training” course in which an exercise amount can be freely set, and a “Survival Running” course which is an obstacle stage to train reflexes in which displayed obstacles must be avoided by the user. Id. at ¶ 55. As such, the difficulty of the course determines, at least in part, the quality of movement of the virtual character in the virtual space in which the character moves as a result of the real human exercising during the chosen training course. In light of the above analysis, the claimed invention fails to demonstrate patentability over the cited prior art.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to WILLIAM H MCCULLOCH whose telephone number is (571)272-2818. The examiner can normally be reached M-F 9:30-5:30. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, David Lewis, can be reached at 571-272-7673. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/WILLIAM H MCCULLOCH JR/
Primary Examiner, Art Unit 3715

Prosecution Timeline

May 17, 2023
Application Filed
Jun 27, 2025
Non-Final Rejection — §102, §103
Sep 29, 2025
Response Filed
Oct 14, 2025
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12582911 — DISPLAY METHOD AND APPARATUS FOR VIRTUAL VEHICLE, DEVICE, AND STORAGE MEDIUM
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12582910 — COMPUTER SYSTEM, GAME SYSTEM, AND GAME PROGRESS CONTROL METHOD
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12582915 — STORAGE MEDIUM, GAME APPARATUS, GAME SYSTEM, AND GAME PROCESSING METHOD
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12582870 — ESTIMATING SPIN RATE AND AXIS OF A BALL USING DEEP LEARNING
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12576343 — COMMUNICATION SYSTEM
Granted Mar 17, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 54%
With Interview: 87% (+33.3%)
Median Time to Grant: 3y 5m
PTA Risk: Moderate
Based on 614 resolved cases by this examiner. Grant probability derived from career allow rate.
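The note above suggests the headline projections are simple arithmetic on the examiner's career numbers. A minimal sketch of that derivation, assuming the interview lift is additive in percentage points (an inference from the figures shown, not the dashboard's documented formula):

```python
# Examiner's career totals, as reported on this page.
granted, resolved = 330, 614
interview_lift = 33.3  # percentage-point lift reported with interview

# Career allow rate used as the baseline grant probability.
allow_rate = granted / resolved * 100

print(round(allow_rate))                   # 54 -> the 54% grant probability
print(round(allow_rate + interview_lift))  # 87 -> the 87% "with interview" figure
```

Both printed values match the dashboard's 54% and 87%, which is consistent with the additive-lift reading.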
