Prosecution Insights
Last updated: April 19, 2026
Application No. 17/563,564

USER-SPECIFIC INTERACTIVE OBJECT SYSTEMS AND METHODS

Final Rejection — §103
Filed: Dec 28, 2021
Examiner: GILES, EBONI N
Art Unit: 2622
Tech Center: 2600 — Communications
Assignee: Universal City Studios LLC
OA Round: 7 (Final)
Grant Probability: 63% (Moderate)
Predicted OA Rounds: 8-9
Predicted Time to Grant: 3y 7m
Grant Probability with Interview: 72%

Examiner Intelligence

Career allowance rate: 63% of resolved cases (440 granted / 697 resolved; +1.1% vs TC avg)
Interview lift: +8.6% among resolved cases with an interview (a moderate lift of roughly +9%)
Typical timeline: 3y 7m average prosecution; 33 applications currently pending
Career history: 730 total applications across all art units

Statute-Specific Performance

§101: 2.0% (-38.0% vs TC avg)
§103: 78.5% (+38.5% vs TC avg)
§102: 9.1% (-30.9% vs TC avg)
§112: 6.3% (-33.7% vs TC avg)

(Chart note: black line = Tech Center average estimate. Based on career data from 697 resolved cases.)
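As a quick sanity check on the dashboard figures above, the short sketch below recomputes the career allowance rate from the raw counts (440 granted of 697 resolved) and spells out how an interview lift is derived. The with/without-interview counts are placeholders: the report publishes only the aggregate +8.6% figure, not the underlying inputs.

```python
# Recompute the derived examiner metrics from the raw career counts above.
granted, resolved = 440, 697
allow_rate = granted / resolved
print(f"Career allowance rate: {allow_rate:.1%}")  # 63.1%, displayed as 63%


def interview_lift(with_interview: tuple[int, int],
                   without_interview: tuple[int, int]) -> float:
    """Allowance-rate lift attributable to examiner interviews.

    Each argument is a (granted, resolved) pair. The report states the
    resulting difference as +8.6% without publishing the inputs, so any
    values passed here are hypothetical.
    """
    g_w, r_w = with_interview
    g_wo, r_wo = without_interview
    return g_w / r_w - g_wo / r_wo
```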

Office Action

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

This office action is in response to the amendment filed 11/5/2025, in which Claims 1-5, 8-15, 17, 18, 21-25 are pending, Claims 6 and 16 are canceled, and Claims 24 and 25 are new.

Claim Objections

Claim 17 is objected to because of the following informalities: Claim 17 depends upon cancelled claim 16. Appropriate correction is required.

Response to Arguments

3. Applicant’s arguments with respect to claim(s) 1, 14, 18 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

8. Claim(s) 1-5, 9, 22-23 are rejected under 35 U.S.C. 103(a) as being unpatentable over U.S. Patent Publication 2004/0204240 to Barney et al (“Barney”) in view of U.S. Patent Publication 2019/0304216 to Mendelson et al (“Mendelson”) in further view of WIPO Patent Publication 2019/143512 to Yeh et al (“Yeh”) and in further view of U.S. Patent Publication 2022/0121290 to Nocon et al (“Nocon”).

As to Claim 1, Barney teaches a system, comprising: an interactive object, comprising: a special effect system disposed in or on a housing of the interactive object, wherein the interactive object is handheld (The wand [interactive object] or other actuation device allows play participants to electronically and "magically" interact with their surrounding play environment(s), thereby giving play participants the realistic illusion of practicing, performing and mastering "real" magic, see ¶ 0008; Use of the wand may be as simple as touching it to a particular surface or "magical" item within a suitably configured play environment or it may be as complex as shaking or twisting the wand a predetermined number of times in a particular manner and/or pointing it accurately at a certain target desired to be "magically" transformed or otherwise affected, see ¶ 0063; An internal cavity 116 is preferably provided to receive and safely house various circuitry for activating and operating the wand and various wand-controlled effects [special effect system] (described later). Batteries, optional lighting, laser or sound effects and/or the like may also be provided and housed within cavity 116, see ¶ 0066); a controller disposed in or on the housing of the interactive object that controls operation of the special effect system (An internal cavity 116 is preferably provided to receive and safely house various circuitry [controller] for activating and operating the wand and various wand-controlled effects [special effect system] (described later). Batteries, optional lighting, laser or sound effects and/or the like may also be provided and housed within cavity 116, see ¶ 0066; the wand activation circuit 115 in accordance with the above-described preferred embodiment is essentially only activated (and transponder 118 is only enabled) when a user actively moves the wand 100 in such particular way as to impart different transient acceleration forces on the distal and proximal ends of the wand 100 (or wherever the sensors are located if not at the distal and proximal ends), see ¶ 0074); communicate instructions to the controller of the interactive object to activate a special effect of the special effect system (one wand motion may trigger a first wand activation circuit (and a first wand effect) while a different wand motion may trigger a second wand activation circuit (and a second wand effect) [communicate instructions to controller to activate a special effect], see ¶ 0063), wherein the instructions are based on the user profile of the plurality of user profiles, and the characterized movement or action of the interactive object (alternative wand activation circuits can be designed and configured so as to respond to different desired wand activation motions. For example, this may be achieved by adding more sensors and/or by changing sensor positions and orientations. For example, one wand motion may trigger a first wand activation circuit (and a first wand effect) while a different wand motion may trigger a second wand activation circuit (and a second wand effect) [characterized movement of the interactive object], see ¶ 0063; wand levels can easily be set and changed, for example, by accessing the internal circuitry of each wand [central controller] and flipping various dip switches to change the address or coding of the internal RF/IR transmitter. 
Alternatively, within a play facility wand levels may be set and stored at the receiver/controller level by tracking each wand unique ID code (UPIN/UGIN) [user profiles] and using a computer and an indexed data-base to look up the corresponding wand level and any other relevant gaming information associated with each unique UPIN/UGIN [profile being associated with an identified user], see ¶ 0118). Barney does not expressly disclose a plurality of environmental sensors disposed throughout an area of an interactive environment and configured to generate sensor data associated with the area of the interactive environment; and a central controller configured to: receive a plurality of user profiles for a plurality of users; receive the sensor data associated with the area of the interactive environment; identify a subset of users of the plurality of users based on the sensor data; characterize a movement or action of the interactive object based on collected data from the environmental sensors; access a user profile of the user from the plurality of user profiles, and wherein the instructions are based on the user profile of the plurality of user profiles, and the characterized movement or action of the interactive object. Mendelson teaches a plurality of environmental sensors disposed throughout an area of an interactive environment and configured to generate sensor data associated with the area of the interactive environment (the tracking hardware associated with a user (e.g., a wristband) is configured to send a transmission and/or where such transmission may be sensed by multiple sensors or receivers of the system at the same or similar time (e.g., sensors positioned at different locations, but still capable of picking up a transmission of a user at a particular position), see ¶ 0073; a system that includes a plurality of sensors or receivers disposed at different positions around a destination or location, a particular user and/or object may be tracked or sensed by multiple sensors or receivers for a given time, even though the user and/or object is physically only located at one particular location, see ¶ 0090); and a central controller configured to: receive a plurality of user profiles for a plurality of users (Users may interface with the system (e.g., setup or modify a user profile or preferences, make purchases or modify reservations using the system, etc.) 
through a software application that runs on a mobile device, such as a smart phone and/or via software that runs upon one or more components of the tracking device, and/or by interfacing with kiosks or other hardware that is fixed or positioned at particular locations throughout the destination, see ¶ 0065; The system that is engaged with or interfaces with the software application having the UI screen 1700 with user profile UI element 1730 may be configured to allow a user to establish and/or modify a variety of profile personalization data, see ¶ 0135); receive the sensor data associated with the area of the interactive environment (a mobile device may execute software and/or cooperate with an external device (e.g., via one or more of its hardware ports) for providing track capabilities by transmitting and/or receiving signals configured to be received and/or sent by sensors positioned around the location, see ¶ 0107); identify a subset of users of the plurality of users based on the sensor data (trackable hardware (e.g., one or more of the plurality of wristbands (1005, 1010, 1015) may be associated with a user (e.g., worn by a user) so that the user's movement and/or activities can be sensed/tracked as the user participates in activities at a particular destination, see ¶ 0099); characterize a movement or action of the interactive object based on collected data from the environmental sensors (The hardware (e.g., a wristband or other wearable or other device with a tracking module or component) may be provided to a user as the user enters the destination and returned by the user as the user exits the destination…Sensors configured to detect and/or track the hardware may have different ranges (e.g., sensors may have a short-range, such as corresponding to a max of a few inches, while other sensors may have a long-range, such as corresponding to roughly 90 ft) and may be placed throughout the destination at locations to sense and track the movement of users and/or objects, see ¶ 0064; trackable hardware (e.g., one or more of the plurality of wristbands (1005, 1010, 1015) [interactive object] may be associated with a user (e.g., worn by a user) so that the user's movement and/or activities can be sensed/tracked as the user participates in activities at a particular destination, see ¶ 0099); access a user profile of the user from the plurality of user profiles (A user may position themselves or some trackable hardware associated with the user within a proximity (e.g., short-range, such as within a few inches) of a receiver 825 of the kiosk that senses the trackable hardware and starts software instructions using a processor 805, see ¶ 0095; The kiosk may also be connected (e.g., via wireless and/or through wired connections) with a server 830, such as a local server associated with the destination and/or a master or global server as discussed throughout.
This connection with the server 830 may allow for syncing of data, lookup of data, and/or other communication of data between the server and kiosk, for example, when a user enters a proximity of the kiosk to be sensed by the kiosk and begin use, the data on the local and/or master or global servers is available for display and/or interaction by the kiosk, see ¶ 0096; The processor 805 may also be in communication with a display 815 (e.g., a touchscreen display) associated with the kiosk, for example, in order to display information and/or receive input or data and/or provide capability for operation, manipulation, or other features by a user interacting with the kiosk, see ¶ 0097).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Barney with Mendelson to teach a plurality of environmental sensors disposed throughout an area of an interactive environment and configured to generate sensor data associated with the area of the interactive environment; and a central controller configured to: receive a plurality of user profiles for a plurality of users; receive the sensor data associated with the area of the interactive environment; identify a subset of users of the plurality of users based on the sensor data; characterize a movement or action of the interactive object based on collected data from the environmental sensors; access a user profile of the user from the plurality of user profiles, and wherein the instructions are based on the user profile of the plurality of user profiles, and the characterized movement or action of the interactive object. The suggestion/motivation would have been in order for trackable hardware to be carried and/or worn or otherwise associated with the users so that the users' movement and activities may be tracked at the geographic location or destination (see Abstract).

Barney and Mendelson do not expressly disclose associate a user of the subset of users with the interactive object based on the sensor data being indicative that the user is a closest user of the subset of users to the interactive object.

Yeh teaches associate a user of the subset of users with the interactive object based on the sensor data being indicative that the user is a closest user of the subset of users to the interactive object (the interactive game elements 63 may sense proximity of each user-associated device 20 or may wirelessly (e.g., via a transceiver) capture identification information from user-associated devices 20 in range of the interactive game element 63 [associate a user with the interactive object based on sensor data indicative that the user is the closest to the interactive object]. Sensed interactions with the interactive game element 63 that are time-stamped together with a captured identification code or information are associated together to link the interaction to the player 22 and the player’s user-associated device 20. For example, the characteristic of the interaction may be that the player 22 has moved to a correct location in the game environment. Other types of interactive game elements 63 may include optical sensors, pressure sensors, cameras, etc, see ¶ 0033).
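The proximity-based association the rejection draws from Yeh reduces, mechanically, to a nearest-neighbor pick over sensed positions. The minimal sketch below is an editor's illustration only; the class names, coordinates, and data shapes are invented, not taken from the claims or from Yeh.

```python
import math
from dataclasses import dataclass


@dataclass
class Tracked:
    """A position report for one sensed user or object (hypothetical shape)."""
    entity_id: str
    x: float
    y: float


def associate_closest_user(users: list[Tracked], obj: Tracked) -> Tracked:
    """Link the interactive object to the nearest user in the sensed subset,
    mirroring the claim's "closest user of the subset of users" limitation."""
    return min(users, key=lambda u: math.hypot(u.x - obj.x, u.y - obj.y))


# Example: a wand sensed at (2.0, 1.0) is linked to user "u2".
users = [Tracked("u1", 0.0, 0.0), Tracked("u2", 2.5, 1.2)]
print(associate_closest_user(users, Tracked("wand-17", 2.0, 1.0)).entity_id)
```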
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Barney and Mendelson with Yeh to teach associate a user of the subset of users with the interactive object based on the sensor data being indicative that the user is a closest user of the subset of users to the interactive object. The suggestion/motivation would have been in order for the interactive game elements 63 to pass the interaction information and the associated identification information of the player 22 that performed the interaction to the control system 12 to update the dynamic user profile 24 (see ¶ 0033).

Barney, Mendelson and Yeh do not expressly disclose communicate instructions to the controller of the interactive object to activate a special effect of the special effect system at a selected intensity level, wherein the instructions to select the intensity level of the special effect are based on one or both of the characterized movement or data of the user profile.

Nocon teaches communicate instructions to the controller of the interactive object to activate a special effect of the special effect system at a selected intensity level, wherein the instructions to select the intensity level of the special effect are based on one or both of the characterized movement or data of the user profile (The handle 1204 may be grasped by the hand of the user 99, while the shaft 1206 provides a form factor of a wand tip extending outward from the handle 1204, see ¶ 0115; The purchasing gesture may be detected by the GR wand and transmitted to the smart retail infrastructure directly or via the user's personal smart device. Upon receiving notification of the purchase, the smart retail infrastructure may be configured to trigger congratulatory messages (e.g., triggering a light show in the vicinity of the user, causing the wand to light up/buzz/vibrate, issue verbal confirmation of the purchase, etc.) [communicate instructions to the controller to activate a special effect] to confirm the purchase, see ¶ 0086; The user intent may be inferred based on a combination of type of grip technique on the handle 1204 and subsequent finger and/or thumb press performed by the user 99, and also the time sequence of assertions. An example of inferring the user intent may be based on how the user 99 holds the exemplary GR device 1200, such as a smart and interactive wand, based on the one or more factors and how the user 99 performs a gesture, for example press and hold the wand with the thumb once or move the wand around in the air, to cast a spell [select intensity level of special effect based on characterized movement], see ¶ 0157).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Barney, Mendelson and Yeh with Nocon to teach communicate instructions to the controller of the interactive object to activate a special effect of the special effect system at a selected intensity level, wherein the instructions to select the intensity level of the special effect are based on one or both of the characterized movement or data of the user profile. The suggestion/motivation would have been in order for the user intent to be inferred based on a combination of type of grip technique on the handle and subsequent finger and/or thumb press performed by the user (see ¶ 0157).
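To make the combined limitation over Barney, Mendelson, Yeh and Nocon concrete: the central controller ultimately maps a characterized movement and/or profile data onto an effect intensity. A hedged sketch follows, in which the gesture names, the "level" profile field, and the instruction format are all assumptions of this note rather than features of the references.

```python
def effect_instructions(profile: dict, movement: str) -> dict:
    """Pick a special-effect intensity from the characterized movement and/or
    user-profile data, per the claim's "one or both" language. The gesture
    names and the "level" field are illustrative placeholders."""
    base = {"grand_flourish": 3, "twist": 2, "tap": 1}.get(movement, 1)
    boost = 1 if profile.get("level", 0) >= 5 else 0
    return {"effect": "light_and_haptics", "intensity": min(base + boost, 3)}


# A high-level user performing a grand flourish gets the maximum intensity.
print(effect_instructions({"level": 7}, "grand_flourish"))  # intensity 3
```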
As to Claim 2, Barney, Mendelson, Yeh and Nocon depending on Claim 1, Yeh teaches wherein the plurality of environmental sensors comprise facial recognition sensors, 3D time of flight sensors, radio frequency sensors, optical sensors, or any combination thereof (Other types of interactive game elements 63 may include optical sensors, pressure sensors, cameras, etc, see ¶ 0033).

As to Claim 3, Barney, Mendelson, Yeh and Nocon depending on Claim 1, Yeh teaches wherein the sensor data comprises facial recognition data, optical data, radio frequency data, motion data, or any combination thereof (the interaction of the user-associated devices 20 with interactive game elements 63 of the game environments may generate the signals that are indicative of the interaction…Other types of interactive game elements 63 may include optical sensors, pressure sensors, cameras, etc, see ¶ 0033).

As to Claim 4, Barney, Mendelson, Yeh and Nocon depending on Claim 1, Barney teaches wherein the interactive object comprises one or more on-board sensors (the placement and orientation of the tilt sensors 122, 124 is preferably such that different accelerations or motions are required at the proximal and distal ends 112 and 114 in order to trigger both tilt sensors 122, 124 to their ON positions (or OFF positions, as the case may be), see ¶ 0072).

As to Claim 5, Barney, Mendelson, Yeh and Nocon depending on Claim 1, Barney teaches wherein the interactive object comprises a handheld object, wherein the handheld object is a sword, wand, token, book, ball, or figurine (The wand [interactive object] or other actuation device allows play participants to electronically and "magically" interact with their surrounding play environment(s), thereby giving play participants the realistic illusion of practicing, performing and mastering "real" magic, see ¶ 0008).

As to Claim 9, Barney, Mendelson, Yeh and Nocon depending on Claim 1, Barney teaches wherein the central controller is configured to characterize the movement or action by identifying a movement pattern of the interactive object (alternative wand activation circuits [central controllers] can be designed and configured so as to respond to different desired wand activation motions. For example, this may be achieved by adding more sensors and/or by changing sensor positions and orientations. For example, one wand motion may trigger a first wand activation circuit (and a first wand effect) while a different wand motion may trigger a second wand activation circuit (and a second wand effect) [characterize a movement of the interactive object], see ¶ 0063).

As to Claim 22, Barney, Mendelson, Yeh and Nocon depending on Claim 1, Nocon teaches wherein the user profile comprises one or more user preferences input by the user (a “personalized experience” means sensory output from the connected devices that is configured based on information defined by or for an individual user indicative of the user's preferences for the sensory output, see ¶ 0016).

As to Claim 23, Barney, Mendelson, Yeh and Nocon depending on Claim 1, Nocon teaches wherein the user profile comprises one or more user preferences input by the user (For social applications, a GR device 100 may be personalized to the user. For example, a GR device 100 may be configured to recognize the user's biometric/voice and retrieve personal information associated with user (e.g., name, birthday, affiliations, preferences, and so forth), see ¶ 0081).

Claim(s) 8, 13 are rejected under 35 U.S.C.
103(a) as being unpatentable over U.S. Patent Publication 2004/0204240 to Barney et al (“Barney”) in view of U.S. Patent Publication 2019/0304216 to Mendelson et al (“Mendelson”) in further view of WIPO Patent Publication 2019/143512 to Yeh et al (“Yeh”) in further view of U.S. Patent Publication 2022/0121290 to Nocon et al (“Nocon”) and in further view of U.S. Patent Publication 2019/0220635 to Yeh et al (“Yeh 2”). As to Claim 8, Barney, Mendelson, Yeh and Nocon depending on Claim 1, Barney, Mendelson, Yeh and Nocon do not expressly disclose wherein the special effects system further comprises one or more of a haptic feedback device, a light source, or a sound system, that are activated in response to the instructions. Yeh 2 teaches wherein the special effects system further comprises one or more of a haptic feedback device, a light source, or a sound system, that are activated in response to the instructions (The LEDs 54a, 54b and/or the sensors 56 receive control signals from the microcontroller 38, and may also receive power from the power circuitry 42. The LEDs 54a, 54b emit light in response to control signals from the microcontroller 38, and the emitted light may then be detected by the detector 20. The sensors 56 may include an accelerometer, a gyrometer, a pressure sensor, a sound sensor, or a light detector [environmental sensors], for example, see ¶ 0025). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Barney, Mendelson, Yeh and Nocon with Yeh 2 to teach wherein the special effects system further comprises one or more of a haptic feedback device, a light source, or a sound system, that are activated in response to the instructions. The suggestion/motivation would have been in order to detect an interaction between the wearable device and an element of an attraction based on the received identification information and the received signal (see ¶ 0006). As to Claim 13, Barney, Mendelson, Yeh and Nocon depending on Claim 1, Barney, Mendelson, Yeh and Nocon do not expressly disclose wherein the interactive object comprises an optical power harvester that powers the special effect system. Yeh 2 teaches wherein the interactive object comprises an optical power harvester that powers the special effect system (the wearable device has a sensor coupled to the power harvesting circuit and configured to utilize the power to monitor a condition of the wearable device, see Abstract). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Barney, Mendelson, Yeh and Nocon with Yeh 2 to teach wherein the interactive object comprises an optical power harvester that powers the special effect system. The suggestion/motivation would have been in order to utilize the power to monitor a condition of the wearable device (see Abstract). 10. Claim(s) 10-12 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Publication 2004/0204240 to Barney et al (“Barney”) in view of U.S. Patent Publication 2019/0304216 to Mendelson et al (“Mendelson”) in further view of WIPO Patent Publication 2019/143512 to Yeh et al (“Yeh”) in further view of U.S. Patent Publication 2022/0121290 to Nocon et al (“Nocon”) and in further view of U.S. Patent Publication 2021/0090334 to Evans et al (“Evans”). 
As to Claim 10, Barney, Mendelson, Yeh and Nocon depending on Claim 1, Barney, Mendelson, Yeh and Nocon do not explicitly disclose wherein the activated special effect is based on a quality metric of the characterized movement or action. Evans teaches wherein the activated special effect is based on a quality metric of the characterized movement or action (first effect 607a to the respective perspective and the respective location of each of observers 142a-142d to produce multiple mixed reality effects 112/312/612a (hereinafter “second effects 112/312/612a”) corresponding to first effect 607a (action 595), see ¶ 0073; the one of second effects 112/312/612a produced for observer 142a in action 595 corresponds to first effect 607a while being modified to account for the distance of observer 142a from the location at which the effect occurs, and the unique viewing angle of observer 142a based on a head position of observer 142a and/or a posture, e.g., standing or sitting, of observer 142a [quality metric of the characterized action]. Similarly, the one of second effects 112/312/612a produced for observer 142b in action 595 corresponds to first effect 607a while being modified to account for the distance of observer 142b from the location at which the effect occurs, and the unique viewing angle of observer 142b based on a head position of observer 142b and/or a posture of observer 142b, see ¶ 0074. Examiner construes the distance of the observer from the location at which the effect occurs as a quality metric for the special effect, i.e. mixed reality effect). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Barney, Mendelson, Yeh and Nocon with Evans to teach wherein the activated special effect is based on a quality metric of the characterized movement or action. The suggestion/motivation would have been in order to track a perspective and a location within mixed reality venue of each of several observers of the activity (see ¶ 0065). As to Claim 11, Barney, Mendelson, Yeh and Nocon depending on Claim 10, Evans teaches wherein a first special effect is activated when the quality metric is above a threshold and a second special effect is activated when the quality metric is below the threshold (lighting system 114/414 may include spotlights and/or floodlights configured to provide directional lighting that can be turned on or off, or be selectively dimmed and brightened to emphasize one or more objects or features within mixed reality venue 110/111/410, or to draw attention towards or away from actions of performer 148, see ¶ 0053. Examiner construes that lights are dimmed when the observer is further away from the object [above a threshold] and the lights are brightened when the observer is closer to the object [below a threshold]). As to Claim 12, Barney, Mendelson, Yeh and Nocon depending on Claim 9, Barney, Mendelson, Yeh and Nocon do not explicitly disclose wherein the activated special effect changes based on corresponding changes to the quality metric. 
Evans teaches wherein the activated special effect changes based on corresponding changes to the quality metric (lighting system 114/414 may include spotlights and/or floodlights configured to provide directional lighting that can be turned on or off, or be selectively dimmed and brightened to emphasize one or more objects or features within mixed reality venue 110/111/410, or to draw attention towards or away from actions of performer 148, see ¶ 0053; the one of second effects 112/312/612a produced for observer 142a in action 595 corresponds to first effect 607a while being modified to account for the distance of observer 142a from the location at which the effect occurs, and the unique viewing angle of observer 142a based on a head position of observer 142a and/or a posture, e.g., standing or sitting, of observer 142a [quality metric of the characterized action]. Similarly, the one of second effects 112/312/612a produced for observer 142b in action 595 corresponds to first effect 607a while being modified to account for the distance of observer 142b from the location at which the effect occurs, and the unique viewing angle of observer 142b based on a head position of observer 142b and/or a posture of observer 142b, see ¶ 0074. Examiner construes the distance of the observer from the location at which the effect occurs as a quality metric for the special effect, i.e. mixed reality effect, and the effect changes the closer or further away the observer gets from the location of the effect).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Barney, Mendelson, Yeh and Nocon with Evans to teach wherein the activated special effect changes based on corresponding changes to the quality metric. The suggestion/motivation would have been in order to track a perspective and a location within mixed reality venue of each of several observers of the activity (see ¶ 0065).

Claim(s) 14, 21 are rejected under 35 U.S.C. 103(a) as being unpatentable over U.S. Patent Publication 2019/0304216 to Mendelson et al (“Mendelson”) in view of WIPO Patent Publication 2019/143512 to Yeh et al (“Yeh”) in further view of U.S. Patent Publication 2021/0090334 to Evans et al (“Evans”).

As to Claim 14, Mendelson teaches a method, comprising: receiving sensor data from a plurality of sensors disposed throughout an area of an interactive environment, wherein the plurality of sensors are configured to generate sensor data associated with the area of the interactive environment; identifying a plurality of interactive objects and a plurality of users within the area of the interactive environment based on the sensor data (The hardware (e.g., a wristband or other wearable or other device with a tracking module or component) may be provided to a user as the user enters the destination and returned by the user as the user exits the destination…Sensors configured to detect and/or track the hardware may have different ranges (e.g., sensors may have a short-range, such as corresponding to a max of a few inches, while other sensors may have a long-range, such as corresponding to roughly 90 ft) and may be placed throughout the destination at locations to sense and track the movement of users and/or objects, see ¶ 0064; a system that includes a plurality of sensors or receivers disposed at different positions around a destination or location, a particular user and/or object may be tracked or sensed by multiple sensors or receivers for a given time, even though the user and/or object is physically only located at one particular location, see ¶ 0090). Mendelson teaches tracking movement of the identified interactive object using the sensor data (when a user is within a particular destination, location, or area that the system is configured to track user and/or object movement and/or activity. At step 710, tracking data for the user is obtained at a particular location within the destination [generating sensor data associated with the area]. For example, this tracking data may result from tracking or sensing the user by way of RFID and/or any of a variety of other possible detection methods as discussed throughout this application (e.g., tracking of hardware worn or carried by a user via one or more sensors or receivers, tracking of a user or crowd of users [identify a plurality of users based on the sensor data], such as via motion detection, visual recognition [sensors disposed throughout an area of an interactive environment], etc.), see ¶ 0090). Mendelson does not expressly disclose wherein the interactive objects are handheld; matching an identified user of the plurality of users with an identified interactive object of the plurality of interactive objects using rules-based matching, wherein the rules-based matching determines that the identified user is a most likely user of the plurality of users to be associated with the identified interactive object based on the sensor data. 
Yeh teaches wherein the interactive objects are handheld; matching an identified user of the plurality of users with an identified interactive object of the plurality of interactive objects using rules-based matching, wherein the rules-based matching determines that the identified user is a most likely user of the plurality of users to be associated with the identified interactive object based on the sensor data (the interactive game elements 63 may sense proximity of each user-associated device 20 or may wirelessly (e.g., via a transceiver) capture identification information from user-associated devices 20 in range of the interactive game element 63 [rules-based matching determines that the identified user is a most likely user of the plurality of users to be associated with the identified interactive object based on the sensor data]. Sensed interactions with the interactive game element 63 that are time-stamped together with a captured identification code or information are associated together to link the interaction to the player 22 and the player’s user-associated device 20. For example, the characteristic of the interaction may be that the player 22 has moved to a correct location in the game environment. Other types of interactive game elements 63 may include optical sensors, pressure sensors, cameras, etc, see ¶ 0033).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Mendelson with Yeh to teach wherein the interactive objects are handheld; matching an identified user of the plurality of users with an identified interactive object of the plurality of interactive objects using rules-based matching, wherein the rules-based matching determines that the identified user is a most likely user of the plurality of users to be associated with the identified interactive object based on the sensor data. The suggestion/motivation would have been in order for the interactive game elements 63 to pass the interaction information and the associated identification information of the player 22 that performed the interaction to the control system 12 to update the dynamic user profile 24 (see ¶ 0033).

Mendelson and Yeh do not expressly disclose determining a quality metric based on the tracked movement of the identified interactive object; and communicating instructions to the identified interactive object to activate a first on-board special effect of the identified interactive object based on the tracked movement and a user profile of the identified user responsive to the quality metric being above a threshold and a second on-board special effect responsive to the quality metric being below the threshold, wherein the first on-board special effect is different than the second on-board special effect.
Evans teaches determining a quality metric based on the tracked movement of the identified interactive object (lighting system 114/414 may include spotlights and/or floodlights configured to provide directional lighting that can be turned on or off, or be selectively dimmed and brightened to emphasize one or more objects or features within mixed reality venue 110/111/410, or to draw attention towards or away from actions of performer 148, see ¶ 0053; the one of second effects 112/312/612a produced for observer 142a in action 595 corresponds to first effect 607a while being modified to account for the distance of observer 142a from the location [quality metric] at which the effect occurs, and the unique viewing angle of observer 142a based on a head position of observer 142a and/or a posture, e.g., standing or sitting, of observer 142a. Similarly, the one of second effects 112/312/612a produced for observer 142b in action 595 corresponds to first effect 607a while being modified to account for the distance of observer 142b from the location at which the effect occurs [quality metric], and the unique viewing angle of observer 142b based on a head position of observer 142b and/or a posture of observer 142b, see ¶ 0074. Examiner construes the distance of the observer from the location at which the effect occurs as a quality metric for the special effect, i.e. mixed reality effect, and the effect changes the closer or further away the observer gets from the location of the effect); communicating instructions to the identified interactive object to activate a first on-board special effect of the identified interactive object based on the tracked movement and a user profile of the identified user responsive to the quality metric being above a threshold and a second on-board special effect responsive to the quality metric being below the threshold, wherein the first on-board special effect is different than the second on-board special effect (lighting system 114/414 may include spotlights and/or floodlights configured to provide directional lighting that can be turned on or off, or be selectively dimmed and brightened to emphasize one or more objects or features within mixed reality venue 110/111/410, or to draw attention towards or away from actions of performer 148, see ¶ 0053; the one of second effects 112/312/612a produced for observer 142a in action 595 corresponds to first effect 607a while being modified to account for the distance of observer 142a from the location [quality metric] at which the effect occurs, and the unique viewing angle of observer 142a based on a head position of observer 142a and/or a posture, e.g., standing or sitting, of observer 142a. Similarly, the one of second effects 112/312/612a produced for observer 142b in action 595 corresponds to first effect 607a while being modified to account for the distance of observer 142b from the location at which the effect occurs [quality metric], and the unique viewing angle of observer 142b based on a head position of observer 142b [user profile of the identified user] and/or a posture of observer 142b, see ¶ 0074. Examiner construes that lights are dimmed [first on-board special effect] when the observer is further away from the object [above a threshold] and the lights are brightened [second on-board special effect] when the observer is closer to the object [below a threshold]). 
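The threshold construal the examiner applies to Evans is, in implementation terms, a single comparison that selects between two distinct on-board effects. A minimal sketch, assuming an invented 0.8 threshold and invented effect names (claims 10-11 and 14 recite the structure, not these values):

```python
def select_effect(quality: float, threshold: float = 0.8) -> str:
    """Return a first on-board effect when the movement's quality metric is
    above the threshold and a different second effect when it is below.
    The effect names and the 0.8 threshold are illustrative placeholders."""
    return "bright_burst" if quality >= threshold else "soft_glow"


print(select_effect(0.92))  # bright_burst (first effect)
print(select_effect(0.41))  # soft_glow (second effect)
```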
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Mendelson and Yeh with Evans to teach determining a quality metric based on the tracked movement of the identified interactive object; and communicating instructions to the identified interactive object to activate a first on-board special effect of the identified interactive object based on the tracked movement responsive to the quality metric being above a threshold and a second on-board special effect responsive to the quality metric being below the threshold, wherein the first on-board special effect is different than the second on-board special effect. The suggestion/motivation would have been in order to track a perspective and a location within mixed reality venue of each of several observers of the activity (see ¶ 0065). As to Claim 21, Mendelson, Yeh and Evans depending on Claim 14, Evans teaches wherein the first on-board special effect comprises a different intensity, hue, or interval pattern of light activation relative to the second on-board special effect (lighting system 114/414 may include spotlights and/or floodlights configured to provide directional lighting that can be turned on or off, or be selectively dimmed and brightened to emphasize one or more objects or features within mixed reality venue 110/111/410, or to draw attention towards or away from actions of performer 148, see ¶ 0053. Examiner construes that lights are dimmed when the observer is further away from the object [above a threshold] and the lights are brightened when the observer is closer to the object [below a threshold]). Claim(s) 15, 17 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Publication 2019/0304216 to Mendelson et al (“Mendelson”) in view of WIPO Patent Publication 2019/143512 to Yeh et al (“Yeh”) in further view of U.S. Patent Publication 2021/0090334 to Evans et al (“Evans”) and in further view of U.S. Patent Publication 2015/0338548 to Cortelyou et al (“Cortelyou”). As to Claim 15, Mendelson, Yeh and Evans depending from Claim 14, Mendelson, Yeh and Evans fail to disclose comprising emitting electromagnetic radiation into the interactive environment and detecting reflection of the electromagnetic radiation by retroreflective markers of the plurality of interactive objects, wherein tracking the movement of the identified interactive object comprises tracking a retroreflective marker associated with the identified interactive object. Cortelyou teaches emitting electromagnetic radiation into the interactive environment and detecting reflection of the electromagnetic radiation by retroreflective markers of the plurality of interactive objects, wherein tracking the movement of the identified interactive object comprises tracking a retroreflective marker associated with the identified interactive object (tracking systems may use sensors disposed in an environment to actively generate outputs received by a main controller. The controller may then process the generated outputs to determine certain information used for tracking. 
One example of such tracking may include tracking the motion of an object to which a sensor is fixed…a system might also utilize one or more devices used to bathe an area in electromagnetic radiation, a magnetic field, or the like, where the electromagnetic radiation or magnetic field is used as a reference against which the sensor’s output is compared by the controller, see ¶ 0026; a sensing device configured to detect the electromagnetic radiation retro-reflected back from objects within the field of view, and a controller configured to perform various processing and analysis routines including interpreting signals from the sensing device and controlling automated equipment based on the detected locations of the objects or markers, see ¶ 0029; the retro-reflective marker 24 is positioned on an object 26, see ¶ 0036; the control unit 18 of the tracking system 10 may be able to identify an object in the detection area 30 of the tracking system 10 [identified interactive object]… the control unit 18 may receive data indicative of the electromagnetic radiation reflected back from the detection area 30, see ¶ 0111).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Mendelson, Yeh and Evans with Cortelyou to teach emitting electromagnetic radiation into the interactive environment and detecting reflection of the electromagnetic radiation by retroreflective markers of the plurality of interactive objects, wherein tracking the movement of the identified interactive object comprises tracking a retroreflective marker associated with the identified interactive object. The suggestion/motivation would have been in order to provide a tracking system to distinguish between participants and objects (see ¶ 0022).

As to Claim 17, Mendelson, Yeh, Evans and Cortelyou depending from Claim 16, Cortelyou teaches comprising associating the identified interactive object with the retroreflective marker by identifying a closest retroreflective marker based on the sensor data to an origin of a wireless signal associated with identification information of the identified interactive object (the control unit 18 may be configured to identify certain objects that are expected to cross the path of the electromagnetic radiation beam 28 within the detection area 30…the control unit 18 may receive data indicative of the electromagnetic radiation reflected back from the detection area 30, and the control unit 18 may compare a digital signature of the detected radiation to one or more possible data signatures stored in memory 22. That is, if the signature of electromagnetic radiation reflected back to the detector 16 matches closely enough to the signature of a person 70 or known object 32, then the control unit 18 may determine that the person 70 or object 32 is located in the detection area 30 [identifying a closest retroreflective marker based on the sensor data to an origin of a wireless signal], see ¶ 0072).

Claim(s) 18 is rejected under 35 U.S.C. 103(a) as being unpatentable over U.S. Patent Publication 2019/0143204 to Aman et al (“Aman”) in view of WIPO Patent Publication 2019/143512 to Yeh et al (“Yeh”) in further view of U.S. Patent Publication 2015/0338548 to Cortelyou et al (“Cortelyou”).

As to Claim 18, Aman teaches a handheld interactive object, comprising: a housing; communication circuitry on or in the housing and configured to: receive a second portion of the electromagnetic radiation from the environment; transmit interactive object identification information of the interactive object responsive to receiving the second portion of the electromagnetic radiation (The present invention additionally provides for the use of passive micro RFID devices embedded within destination products, where at the point-of-purchase the gamer ID information maintained within the gamer tracking system is either written to the embedded micro RFID and/or associated with the unique RFID code, such that destination products useable as game props are also identifiable by detecting the embedded RFID at a game access point, see ¶ 0019; RFID wristband 16-wb or RFID anklet 16, where the presence of electronic devices 2c, 16-wb and 16 are detectable by a reader outputting, receiving and analyzing exciter field 20-ef [reflecting a first portion of electromagnetic radiation and receiving a second portion of the electromagnetic radiation], see ¶ 0042; gamer 2s holding an article 12, wearing clothing 19 and having a secret message book 13, where each of article 12, clothing 19 and book 13 are mobile game devices 60 and comprise an electronic device such as micro-RFID 4-rfid [power harvester]... a gamer/device detection 30-det sub-component capable of detecting the encoded information imparted to the electronic device 4-rfid, where the detected encoded information is usable at least in part by the any game access point 30 for providing to gaming system 48 to uniquely identify gamer 2s, see ¶ 0045); and a controller on or in the housing and configured to receive the special effect instructions and to generate a special effect command (The device 22 optionally included means for detecting the presence and identity of a guest as well as tracking the gestures of an article 12 [activated in response to the instructions] such as a wizard's wand being moved by the guest, where the gestures where interpretable as commands...other local environment sensors 32 for sensing any of a number of conditions in the local environment as well as an environment control system 34 for actuating or controlling any number of devices for creating local environment effects, see ¶ 0063; where at the point-of-purchase the gamer ID information maintained within the gamer tracking system is either written to the embedded micro RFID and/or associated with the unique RFID code, such that destination products useable as game props are also identifiable by detecting the embedded RFID at a game access point, see ¶ 0019); and a special effect system configured to receive the special effect command and to activate a special effect based on the special effect command (gamer 2s views the changing secret messages A through eye glasses 14, and where game access point 30 [special effect system] sends signals to article 12 such as sword 62-swd controlling effects such as vibration and the emission of light [special effect], see ¶ 0048). Aman does not expressly disclose wherein the second portion of the electromagnetic radiation is indicative, at least in part, of matching a user from a plurality of users in the environment to the interactive object based on a relative proximity of the user to the interactive object. 
Yeh teaches wherein the second portion of the electromagnetic radiation is indicative, at least in part, of matching a user from a plurality of users in the environment to the interactive object based on a relative proximity of the user to the interactive object (the interactive game elements 63 may sense proximity of each user-associated device 20 or may wirelessly (e.g., via a transceiver) capture identification information from user-associated devices 20 in range of the interactive game element 63 [matching a user from a plurality of users in the environment to the interactive object based on a relative proximity of the user to the interactive object]. Sensed interactions with the interactive game element 63 that are time-stamped together with a captured identification code or information are associated together to link the interaction to the player 22 and the player’s user-associated device 20. For example, the characteristic of the interaction may be that the player 22 has moved to a correct location in the game environment. Other types of interactive game elements 63 may include optical sensors, pressure sensors, cameras, etc, see ¶ 0033).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Aman with Yeh to teach wherein the second portion of the electromagnetic radiation is indicative, at least in part, of matching a user from a plurality of users in the environment to the interactive object based on a relative proximity of the user to the interactive object. The suggestion/motivation would have been in order for the interactive game elements 63 to pass the interaction information and the associated identification information of the player 22 that performed the interaction to the control system 12 to update the dynamic user profile 24 (see ¶ 0033).

Aman and Yeh do not expressly disclose a retroreflective marker disposed on or in the housing and configured to reflect a first portion of electromagnetic radiation from an environment to generate sensor data indicative of motion of the interactive object; wherein the special effect instructions are received responsive to transmitting the interactive object identification information and generating the sensor data.

Cortelyou teaches a retroreflective marker disposed on or in the housing and configured to reflect a first portion of electromagnetic radiation from an environment to generate sensor data indicative of motion of the interactive object (tracking systems may use sensors disposed in an environment to actively generate outputs received by a main controller. The controller may then process the generated outputs to determine certain information used for tracking.
One example of such tracking may include tracking the motion of an object to which a sensor is fixed…a system might also utilize one or more devices used to bathe an area in electromagnetic radiation, a magnetic field, or the like, where the electromagnetic radiation or magnetic field is used as a reference against which the sensor’s output is compared by the controller, see ¶ 0026; a sensing device configured to detect the electromagnetic radiation retro-reflected back from objects within the field of view, and a controller configured to perform various processing and analysis routines including interpreting signals from the sensing device and controlling automated equipment based on the detected locations of the objects or markers, see ¶ 0029; the retro-reflective marker 24 is positioned on an object 26, see ¶ 0036); wherein the special effect instructions are received responsive to transmitting the interactive object identification information and generating the sensor data (the control unit 18 of the tracking system 10 may be able to identify an object in the detection area 30 of the tracking system 10 …the control unit 18 may receive data indicative of the electromagnetic radiation reflected back from the detection area 30, and the control unit 18 may compare the signature of the reflected radiation to one or more possible data signatures stored in memory 22…the control unit 18 may include a thermal signature stored in the memory 22, this thermal signature corresponding to the light from the flame effect 200 that is expected to reach the detector 16 when the flame effect 200 is operating properly [transmitting interactive object identification information], see ¶ 0111; The control unit 18 may trigger one or more pyrotechnic show effects based on a comparison made between the actual thermal signature detected via the detector 16 and the expected thermal signature [receive special effect instructions responsive to transmitted interactive object identification information], see ¶ 0112; the ride vehicle 204 may include one or more retro-reflective markers 24 disposed thereon for tracking the motion of the ride vehicle 204 via the same tracking system 10 [generating sensor data] that monitors the flame effect 200, as long as the frequency of light reflected by the retro-reflective marker 24 is distinguishable from the flame effect signature, see ¶ 0113; the tracking system 10 may be used to control a firework (or ordinance) show 240 performed in a pyrotechnic show area, for example to enable enhanced monitoring and control of firework timing. Indeed, the tracking system 10 may use aspects relating to surveying (e.g., distance measurement) as well as flame monitoring in controlling the firework show 240, see ¶ 0117).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Aman and Yeh with Cortelyou to teach a retroreflective marker disposed on or in the housing and configured to reflect a first portion of electromagnetic radiation from an environment to generate sensor data indicative of motion of the interactive object; wherein the special effect instructions are received responsive to transmitting the interactive object identification information and generating the sensor data. The suggestion/motivation would have been in order to enable detection of markers and/or objects within the field of view of the tracking system (see ¶ 0029).

Claim(s) 24 is rejected under 35 U.S.C. 103(a) as being unpatentable over U.S.
Claim(s) 24 is rejected under 35 U.S.C. 103(a) as being unpatentable over U.S. Patent Publication 2019/0304216 to Mendelson et al ("Mendelson") in view of WIPO Patent Publication 2019/143512 to Yeh et al ("Yeh") in further view of U.S. Patent Publication 2021/0090334 to Evans et al ("Evans") and in further view of U.S. Patent Publication 2019/0318539 to Weston.

As to Claim 24, which depends from Claim 14, Mendelson, Yeh and Evans do not expressly disclose wherein the rules-based matching comprises an image recognition feature, wherein the image recognition feature links to a user profile based on a user image.

Weston teaches wherein the rules-based matching comprises an image recognition feature, wherein the image recognition feature links to a user profile based on a user image (a camera may capture images of the participant's silhouette, or other attributes of the participant and attempt to identify the participant and distinguish from other participants based on those images. For example, the tracked information may include height, weight, size, hair color, clothing, shoes, etc. As another example, the scanning and participant information detection capabilities may be connected to outside processes or databases that further determine information regarding the participant and return that information to the controller for additional processing, see ¶ 0026).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Mendelson, Yeh and Evans with Weston to teach wherein the rules-based matching comprises an image recognition feature, wherein the image recognition feature links to a user profile based on a user image. The suggestion/motivation would have been in order to enable detection of markers and/or objects within the field of view of the tracking system (see ¶ 0029).

Claim(s) 25 is rejected under 35 U.S.C. 103(a) as being unpatentable over U.S. Patent Publication 2019/0304216 to Mendelson et al ("Mendelson") in view of WIPO Patent Publication 2019/143512 to Yeh et al ("Yeh") in further view of U.S. Patent Publication 2021/0090334 to Evans et al ("Evans") and in further view of U.S. Patent Publication 2021/0325580 to Yeh et al ("Yeh 2").

As to Claim 25, which depends from Claim 1, Mendelson, Yeh and Evans do not expressly disclose wherein the rules-based matching comprises a grip recognition feature, wherein the grip recognition feature links to a user profile based on grip data from the interactive object.

Yeh 2 teaches wherein the rules-based matching comprises a grip recognition feature, wherein the grip recognition feature links to a user profile based on grip data from the interactive object (The object controller 39, in an embodiment, operates to receive control signals to control operation of the special effect system 36 to selectively activate the light sources 114, 116 in a particular pattern or order. In one example, the sword 100 includes an array 124 of individual pressure or grip sensors 126 that provide pressure information (via internal communication leads 128) to the object controller 39, see ¶ 0048; The object controller 39, under passive power, can use the signals from the array 124 to calibrate based on sensor data indicative of a characteristic grip biometric for a particular user, see ¶ 0049).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify Mendelson, Yeh and Evans with Yeh 2 to teach wherein the rules-based matching comprises a grip recognition feature, wherein the grip recognition feature links to a user profile based on grip data from the interactive object. The suggestion/motivation would have been in order to detect an interaction between the wearable device and an element of an attraction based on the received identification information and the received signal (see ¶ 0006).
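Both new-claim rejections rest on rules-based matching of a live reading to the stored profile it most resembles: an image-derived attribute set for Weston, a grip-sensor vector for Yeh 2. The sketch below illustrates that nearest-profile matching using the grip case; the calibration values, distance metric, and acceptance threshold are illustrative assumptions rather than anything disclosed in either reference.

import math

# Hypothetical per-user calibration vectors, one mean pressure per sensor in array 124.
GRIP_PROFILES = {
    "user_a": [0.9, 0.7, 0.2, 0.1],
    "user_b": [0.2, 0.3, 0.8, 0.9],
}

def link_to_profile(grip_reading, max_distance=0.5):
    """Link a live grip vector to the nearest calibrated profile, if close enough."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    user, d = min(((u, dist(grip_reading, p)) for u, p in GRIP_PROFILES.items()),
                  key=lambda pair: pair[1])
    return user if d <= max_distance else None

print(link_to_profile([0.85, 0.65, 0.25, 0.15]))  # -> user_a
print(link_to_profile([0.5, 0.5, 0.5, 0.5]))      # -> None (no profile close enough)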
Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to EBONI N GILES, whose telephone number is (571) 270-7453. The examiner can normally be reached Monday - Friday, 9 am - 6 pm EST. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Patrick Edouard, can be reached at (571) 272-7603. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/EBONI N GILES/
Examiner, Art Unit 2622

/PATRICK N EDOUARD/
Supervisory Patent Examiner, Art Unit 2622
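The reply-period mechanics recited in the conclusion are mechanical enough to state as a computation: the shortened statutory period runs three months from mailing, a first reply within two months extends the period to the later of the three-month date and the advisory action mailing date, and nothing extends the period past six months. The sketch below works one illustrative set of dates; it simplifies the rule (no extension-fee handling) and the dates are hypothetical, not taken from this file record.

from datetime import date

def add_months(d, months):
    # Minimal month arithmetic; assumes the day of month exists in the target month.
    y, m = divmod(d.month - 1 + months, 12)
    return d.replace(year=d.year + y, month=m + 1)

def reply_period_expires(action_mailed, first_reply, advisory_mailed):
    """Simplified shortened-statutory-period rule from the paragraph above."""
    three_month = add_months(action_mailed, 3)
    six_month = add_months(action_mailed, 6)  # absolute statutory cap
    if first_reply <= add_months(action_mailed, 2) and advisory_mailed > three_month:
        return min(advisory_mailed, six_month)
    return three_month

# Hypothetical dates: action mailed Feb 24, first reply Apr 10, advisory mailed Jun 1.
print(reply_period_expires(date(2026, 2, 24), date(2026, 4, 10), date(2026, 6, 1)))
# -> 2026-06-01 (the advisory mailing date, since it fell after the three-month date)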

Prosecution Timeline

Dec 28, 2021
Application Filed
Nov 05, 2022
Non-Final Rejection — §103
Feb 16, 2023
Response Filed
Mar 08, 2023
Final Rejection — §103
Jun 15, 2023
Notice of Allowance
Oct 16, 2023
Response after Non-Final Action
Oct 23, 2023
Response after Non-Final Action
Feb 18, 2024
Non-Final Rejection — §103
May 16, 2024
Interview Requested
May 28, 2024
Applicant Interview (Telephonic)
Jun 05, 2024
Examiner Interview Summary
Jun 05, 2024
Response Filed
Jun 13, 2024
Final Rejection — §103
Jul 24, 2024
Interview Requested
Sep 18, 2024
Request for Continued Examination
Sep 21, 2024
Response after Non-Final Action
Sep 24, 2024
Non-Final Rejection — §103
Feb 27, 2025
Response Filed
Jun 28, 2025
Non-Final Rejection — §103
Sep 12, 2025
Interview Requested
Oct 03, 2025
Applicant Interview (Telephonic)
Oct 15, 2025
Examiner Interview Summary
Nov 05, 2025
Response Filed
Feb 24, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602962
CONTACTLESS OPTICAL INTERNET OF THINGS USER IDENTIFICATION DEVICE AND SYSTEM
2y 5m to grant • Granted Apr 14, 2026
Patent 12599835
WEARABLE CONTROLLER
2y 5m to grant • Granted Apr 14, 2026
Patent 12596895
LOW POWER BEACON SCHEDULING
2y 5m to grant • Granted Apr 07, 2026
Patent 12575179
DISPLAY DEVICE AND METHOD FOR DRIVING DISPLAY DEVICE
2y 5m to grant • Granted Mar 10, 2026
Patent 12573294
SYSTEMS AND METHODS FOR ALERTING PERSONS OF APPROACHING VEHICLES
2y 5m to grant • Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.

Prosecution Projections

8-9
Expected OA Rounds
63%
Grant Probability
72%
With Interview (+8.6%)
3y 7m
Median Time to Grant
High
PTA Risk
Based on 697 resolved cases by this examiner. Grant probability derived from career allow rate.
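
The caption's derivation is simple enough to check by hand. The snippet below reproduces the headline figures from the career counts shown above; treating the with-interview figure as base rate plus interview lift is our reading of the captions, not a documented formula.

granted, resolved = 440, 697            # career counts shown above
base = granted / resolved               # 0.631 -> the 63% grant probability
interview_lift = 0.086                  # the +8.6% interview lift
with_interview = base + interview_lift  # 0.717 -> the 72% with-interview figure
print(f"base {base:.0%}, with interview {with_interview:.0%}")  # base 63%, with interview 72%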
