Prosecution Insights
Last updated: April 19, 2026
Application No. 18/500,026

GAME SUITES

Non-Final OA: §102, §103
Filed: Nov 01, 2023
Examiner: HYLINSKI, STEVEN J
Art Unit: 3715
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Kilburn Live LLC
OA Round: 1 (Non-Final)
Grant Probability: 75% (Favorable)
Expected OA Rounds: 1-2
Median Time to Grant: 2y 11m
With Interview: 93%

Examiner Intelligence

Career Allow Rate: 75% (688 granted / 912 resolved; +5.4% vs TC avg) — grants above average
Interview Lift: +17.6% (strong) among resolved cases with an interview
Typical Timeline: 2y 11m average prosecution; 30 applications currently pending
Career History: 942 total applications across all art units

Statute-Specific Performance

§101: 10.7% (-29.3% vs TC avg)
§103: 40.1% (+0.1% vs TC avg)
§102: 30.3% (-9.7% vs TC avg)
§112: 9.9% (-30.1% vs TC avg)
Tech Center averages are estimates. Based on career data from 912 resolved cases.
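The per-statute deltas above can be cross-checked against the examiner's rates. A minimal sketch, assuming each delta is a simple percentage-point difference from the Tech Center average (this is recovered arithmetic, not the tool's published methodology):

```python
# Sanity check on the statute table: if delta = examiner rate - TC average,
# then TC average = examiner rate - delta. Assumes percentage-point deltas.
rates = {
    "§101": (10.7, -29.3),
    "§103": (40.1, +0.1),
    "§102": (30.3, -9.7),
    "§112": (9.9, -30.1),
}
for statute, (rate, delta) in rates.items():
    tc_avg = round(rate - delta, 1)
    print(f"{statute}: examiner {rate}% vs implied TC avg {tc_avg}%")
```

Notably, every row implies the same 40.0% baseline, suggesting the deltas are taken against a single TC-wide figure rather than per-statute averages.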

Office Action

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-10 and 14-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by US 2021/0337284 A1 to Person et al.

Re claim 1, Person teaches:

a game bay including one or more walls, a ceiling, and a floor that at least partially enclose a gaming area; The abstract describes that Person is directed to an “interactive multimedia structure [that] may include a plurality of walls on a majority of which an immersive environment is displayed”. For further discussion of the interactive multimedia structure 100 comprising walls and a floor, see [0039]-[0040], [0062]. Additionally, regarding the multimedia structure comprising a ceiling, [0048] describes that “other shapes of the interactive multimedia structure” include, as shown in Fig. 1D and Fig. 1E, “a geodesic dome-shaped, or cave-shaped” or “a pyramid shaped interactive multimedia structure 100.” Domes, caves, and pyramids all have top surfaces that meet the limitation of being a ceiling. Considering Fig. 1D, any of the triangles above the first row of triangles can be interpreted as a ceiling.

a plurality of user input interfaces configured to detect input from one or more users within the game suite; The abstract continues that “A user may interact with the immersive environment through manipulation of a user input device, which may be tracked by one or more systems associated with the interactive media structure. Based on the manipulation, the one or more systems may modify the immersive environment displayed on the majority of the walls.” [0037]: “At least one user may interact with the immersive environment through manipulation of a user input device, which may be tracked by one or more systems associated with the interactive multimedia structure. Based on the manipulation, the one or more systems may modify the immersive environment displayed on the majority of the walls. In certain embodiments, the one or more systems may process interaction information of the user with the one or more user input devices to adaptively and intelligently modify the immersive environment.” [0041] describes an embodiment comprising “a virtual representation of a user input device (e.g., a virtual ball that is a virtual representation of a physical ball used by a user in the interactive multimedia structure 100)”. [0058] includes a discussion of additional embodiments and details of input devices.

a projection mapping system configured to provide visual game content to a plurality of the one or more walls, the ceiling, or the floor; [0044]: “The interactive multimedia structure 100 may be associated with a multimedia output system … The multimedia output system may include one or more computing devices (e.g., one or more apparatuses 10 of FIG. 8). The multimedia output system may generate one or more two-dimensional or three-dimensional images that form the immersive environment. The multimedia output system may modify the two-dimensional or three-dimensional images, for example, based on user interaction with a user input device. After generating or modifying the immersive environment, the multimedia output system may output the immersive environment. For example, the multimedia output system may include one or more display screens or interactive display tiles mounted on, or that form, the wall surfaces 104, may include one or more projectors arranged to project on to the wall surfaces 104, and/or the like.”

and a gaming entertainment system configured to execute a game within the game suite including: obtaining input from the plurality of user input interfaces; and providing the visual game content to the projection mapping system responsive to the detected input from the plurality of user input interfaces. The abstract further states that, “Based on the manipulation, the one or more systems may modify the immersive environment displayed on the majority of the walls … the one or more systems may process interaction information of the user with the one or more user input devices to adaptively and intelligently modify the immersive environment.” Embodiments of this immersive multimedia structure include, in [0038], “a basketball themed game” that simulates playing in “Madison Square Garden.” Another game embodiment is wherein, see [0042], participants build a physical structure in an interactive area of the multimedia structure accompanied by rendered content. [0050] also describes tournament-style e-Sports games or cooperative or competitive skills challenge genres of games. [0059] contemplates that multimedia experiences supported by the multimedia structure can include “games, skills challenges … a time-based challenge, an accuracy-based challenge” or the like. [0062]-[0073] describe additional embodiments of games, input devices and manipulations, and visual content mapped to the surfaces of the immersive environment responsive to game progress.
Re claim 2, [0050] describes that networked multimedia structures may be operated so that “users of different interactive multimedia structures 100 may be matched to play a competitive or cooperative game or skills challenge based on data related to their past uses of the interactive multimedia structures 100 (e.g., the data may be processed using a machine learning model, and the users matched based on user skill level as determined by the machine learning model).” Automated user-matched game selection meets the claimed selection and execution of such a game. [0059] describes that the interactive multimedia structure provides games, skills challenges, and the like. [0065] describes an illustrative scenario of a user choosing to start a game by kicking a ball from a starting location.

Re claims 3-5, the disclosure of Person includes extensive discussion of a spectator area. See: [0045], [0048], [0057], [0059], [0060], [0063], [0069], [0071], [0074], [0081]. Regarding spectator areas having displays that display game content, [0045] describes that “spectator area 108 may further include a user device 112” that may, among other interactive functions, “display scores or ranking for players or teams participating in the game or skills challenge, may provide account access to accounts of users…”

Re claim 6, [0044] describes that, “After generating or modifying the immersive environment, the multimedia output system may output the immersive environment. For example, the multimedia output system may include one or more display screens or interactive display tiles mounted on, or that form, the wall surfaces 104, may include one or more projectors arranged to project on to the wall surfaces 104, and/or the like.” [0045] describes that “spectator area 108 may further include a user device 112” that may, among other interactive functions, “display scores or ranking for players or teams participating in the game or skills challenge, may provide account access to accounts of users…”

Re claims 7-9, [0003], [0008], and [0036] describe that interactive multimedia structure game participants may make inputs sensed by a “motion sensing device”. [0077]-[0078] describe that embodiments of the invention may leverage LiDAR devices for motion sensing of physical objects including hockey sticks, swords, racquets, and the like.

Re claim 10, [0050] describes that interactive multimedia structures 100 can receive input by users “speaking into a microphone (e.g., to provide voice or speech to text”. [0058] describes, “voice commands and comments of the user during the multimedia experience may be used to determine levels of player excitement. Additionally… a sound (e.g., a voice command), and/or a biometric (e.g., a fingerprint, a voice characteristics, and/or the like) can be used to input commands to the interactive multimedia structure 100, and can be used in place of other user input devices discussed herein … The interactive multimedia structure 100 may provide translation operations (e.g., may translate voice commands from one language to another, may translate voice or text communications between interactive multimedia structures 100, may output speech or display text in various languages, and/or the like).”

Re claim 14, refer to Fig. 1A, noting how the spectator lounge 108 is coupled to a game bay comprising walls 104. [0045] describes that the spectator area 108 can view the interior of the wall surfaces 104 as well as a display of a computing device.

Re claim 15, [0042] describes that the interactive multimedia structure may be configured to output special effects including touch, smell, motion, water, wind, smoke, flash, and heat, synchronized with the multimedia experience that [0040] admits comprises three-dimensional images. Any of the effects can be interpreted as a fourth dimension.

Re claim 16, a broadest reasonable interpretation of the limitation “adjust a positioning” is that images projected on the walls can be updated responsive to user input(s) and changed game state, see [0044].

Re claim 17, [0071] describes that interactive multimedia structure 100 can interact with users through “speakers”. Because there are no particular technical specification(s) or structural configuration(s) recited in the claim to define the meaning of “surround audio”, the audio in Person emitted from speakers within an immersive environment meets this limitation.

Re claim 18, [0006] describes that “one or more other interactive multimedia structures may be located within a same interactive housing structure as the interactive multimedia structure”.

Re claims 19-20, refer to the rejection of claim 1.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 11-13 are rejected under 35 U.S.C. 103 as being unpatentable over Person in view of US 2013/0278631 A1 to Border et al.

Re claim 11, although Person describes that the interactive media structure 100 may comprise “touch screen” input means, [0036], Person does not go into detail as to whether the touch screen is a capacitive type. Border is an analogous 3D virtual reality reference that teaches it was known for augmented reality input devices to include “resistive” or “capacitive touch technologies”, see [0623].

The Court, quoting In re Kahn, 441 F.3d 977, 988, 78 USPQ2d 1329, 1336 (Fed. Cir. 2006), stated that “‘[R]ejections on obviousness cannot be sustained by mere conclusory statements; instead, there must be some articulated reasoning with some rational underpinning to support the legal conclusion of obviousness.’” KSR, 550 U.S. at __, 82 USPQ2d at 1396. Exemplary rationales that may support a conclusion of obviousness include: (A) combining prior art elements according to known methods to yield predictable results; (B) simple substitution of one known element for another to obtain predictable results; (C) use of a known technique to improve similar devices (methods, or products) in the same way; (D) applying a known technique to a known device (method, or product) ready for improvement to yield predictable results.

It would have been obvious to one having ordinary skill in the art at the time the invention was made to have incorporated capacitive touch input devices as taught by Border into the augmented reality system of Person. Here, it would require only routine creativity to modify Person, as it has been held by the courts that the use of a known technique to improve similar devices or products in the same way is indicia of obviousness. The motivation for one of skill in the art to make Person using capacitive touch input devices is that they are known to be more precise in resolving touch than resistive types.
Re claims 12-13, [0045] describes that spectators may be provided with a user device 112, that is, the apparatus 10 of Fig. 8, that enables users to control the immersive environment or difficulty level or ambiance or environmental conditions for players or teams participating in a game. [0036] notes that input devices usable with user devices of the disclosure include a joystick, keyboard, touch screen, or motion sensing device. Because there is no particular structure attributed to the claimed intended use of “handheld”, this limitation does not breathe any life into the claim. Portable computer peripheral devices including keyboards, joysticks, or touch screens are inherently capable of being handheld.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to STEVEN J HYLINSKI, whose telephone number is (571) 270-1995. The examiner can normally be reached Mon-Fri, 10:00-5:30. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Dmitry Suhol, can be reached at (571) 272-4430. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/STEVEN J HYLINSKI/
Primary Examiner, Art Unit 3715

Prosecution Timeline

Nov 01, 2023
Application Filed
Oct 15, 2025
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594503
SUBMOVEMENT-BASED MOUSE INPUT CHEATING DETECTION
2y 5m to grant; granted Apr 07, 2026
Patent 12594494
A SUPPORT FRAME ASSEMBLY AND METHODS OF USE THEREOF
2y 5m to grant; granted Apr 07, 2026
Patent 12589300
Systems and Methods for Improved Corner Slicing in a Multiplayer Video Game
2y 5m to grant; granted Mar 31, 2026
Patent 12569760
MASKING A FUNCTION OF A VIRTUAL OBJECT USING A TRAP IN A VIRTUAL ENVIRONMENT
2y 5m to grant; granted Mar 10, 2026
Patent 12531143
METHODS AND APPARATUS FOR VIRTUAL COMPETITION
2y 5m to grant; granted Jan 20, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 75%
With Interview: 93% (+17.6%)
Median Time to Grant: 2y 11m
PTA Risk: Low
Based on 912 resolved cases by this examiner. Grant probability derived from career allow rate.
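The headline projections are consistent with the career figures shown earlier. A minimal sketch of the arithmetic, assuming a simple additive interview lift (the tool's actual model is not published):

```python
# Reproduces the dashboard's headline figures from the examiner's career
# data. Assumes grant probability = career allow rate, and that the
# interview lift adds directly in percentage points.
granted, resolved = 688, 912                 # examiner career totals
allow_rate = granted / resolved              # career allow rate
interview_lift = 0.176                       # reported +17.6% lift

grant_probability = round(allow_rate * 100)                   # -> 75
with_interview = round((allow_rate + interview_lift) * 100)   # -> 93

print(f"Grant probability: {grant_probability}%")
print(f"With interview:    {with_interview}%")
```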
