Prosecution Insights
Last updated: April 19, 2026
Application No. 19/089,003

VIRTUAL REALITY DE-ESCALATION TOOL FOR DELIVERING ELECTRONIC IMPULSES TO TARGETS

Non-Final OA: §102, §103, §112, §DP
Filed: Mar 25, 2025
Examiner: YANG, KWANG-SU
Art Unit: 2623
Tech Center: 2600 (Communications)
Assignee: V-Armed Inc.
OA Round: 1 (Non-Final)
Grant Probability: 74% (Favorable)
OA Rounds: 1-2
To Grant: 2y 9m
With Interview: 92%

Examiner Intelligence

Career Allow Rate: 74%, above average (577 granted / 775 resolved; +12.5% vs TC avg)
Interview Lift: +18.0%, strong (among resolved cases with an interview)
Typical Timeline: 2y 9m average prosecution (14 applications currently pending)
Career History: 789 total applications (across all art units)
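The headline figures above are simple ratios, so they can be sanity-checked directly. The sketch below (Python; the rounding convention is an assumption) recomputes the career allow rate from the raw counts shown, along with the Tech Center baseline implied by the +12.5% delta:

```python
# Figures from the dashboard above: 577 granted out of 775 resolved,
# reported as +12.5 percentage points above the Tech Center average.
granted, resolved = 577, 775
delta_vs_tc = 12.5  # percentage points

allow_rate = 100 * granted / resolved       # career allow rate, in percent
implied_tc_avg = allow_rate - delta_vs_tc   # baseline the delta is measured from

print(f"Career allow rate: {allow_rate:.1f}%")    # 74.5%, displayed as 74%
print(f"Implied TC average: {implied_tc_avg:.1f}%")
```

The exact ratio is about 74.5%, which rounds to the displayed 74%; the implied Tech Center baseline of roughly 62% is consistent with the "above average" call-out.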

Statute-Specific Performance

§101: 0.9% (-39.1% vs TC avg)
§103: 54.7% (+14.7% vs TC avg)
§102: 26.9% (-13.1% vs TC avg)
§112: 13.1% (-26.9% vs TC avg)
Tech Center averages are estimates, based on career data from 775 resolved cases.
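Each per-statute rate above is paired with a delta against the Tech Center average, so the baseline can be recovered by subtraction. A small Python sketch (figures taken from the section above; treating the delta as a plain arithmetic difference is an assumption):

```python
# statute: (examiner's rate in %, delta vs Tech Center average in points)
rates = {
    "101": (0.9, -39.1),
    "103": (54.7, +14.7),
    "102": (26.9, -13.1),
    "112": (13.1, -26.9),
}

for statute, (rate, delta) in rates.items():
    tc_avg = rate - delta  # recover the baseline the delta was measured against
    print(f"Sec. {statute}: examiner {rate}%, TC average ~{tc_avg:.1f}%")
```

Notably, each of the four deltas happens to imply the same roughly 40% Tech Center baseline, suggesting a single average estimate underlies all four comparisons.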

Office Action

Rejections: §102, §103, §112, §DP
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Objections

2. Claims 1-14 are objected to because of the following informalities:

In line 18 of claim 1: “… a network switch; …” should be changed to --… a network switch; and …--;
In line 4 of claim 5: “… a simulation scenario …” should be changed to --… the simulation scenario …--; and
In line 8 of claim 10: “… an EMS training scenario …” should be changed to --… an emergency medical services (EMS) training scenario …--.

Appropriate correction is required.

Double Patenting

3. The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP §§ 706.02(l)(1) - 706.02(l)(3) for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.

4. Claims 1-14 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-14 of U.S. Patent No. US 12,287,911 B2. The following is an example comparing claim 1 of this application with claim 1 of U.S. Patent No. US 12,287,911 B2; each limitation of the application claim is paired with the corresponding limitation of the patent claim.

Application claim 1: A virtual reality system comprising:
Patent claim 1: A virtual reality system comprising:

Application: a physical environment defined at least partially by a physical coordinate system and comprising one or more physical objects;
Patent: a physical environment defined at least partially by a physical coordinate system and comprising one or more physical objects;

Application: one or more users located in the physical environment, wherein each of the one or more users are configured with wearable devices and a weapon, and wherein each of the wearable devices and the weapon comprise a position indicator configured to detect position data in the physical environment;
Patent: one or more users located in the physical environment, wherein each of the one or more users is configured with wearable devices and a weapon, and wherein each of the wearable devices and the weapon comprises a position indicator configured to detect position data in the physical environment;

Application: a computing device communicatively coupled to a server, the computing device comprising a simulation engine;
Patent: a computing device communicatively coupled to a server, the computing device comprising a simulation engine;

Application: a modular symptoms generator configured to interact with the simulation engine and the server; and
Patent: a modular symptoms generator configured to interact with the simulation engine and the server, wherein the modular symptoms generator is configured to establish a set of rules based on relative object position interrelationship, predetermined position data for each of the rules, and a set of at least two simulation modifications triggered by results from matching the predetermined position data to corresponding ones of the rules, wherein each of the rules includes a rule dependent upon positional information of the weapon, which one of the users is wielding the weapon, and dependent upon a facing direction of the weapon to a position of a target one of the users, the target user being affected by the facing direction per a simulation scenario for which the at least two simulation modifications apply; and

Application: the physical environment comprising: one or more cameras configured to: monitor a portion of the physical environment; capture the position data of each position indicator within the portion of the physical environment; and transmit the position data of each position indicator within the portion of the physical environment to a network switch; the network switch configured to transmit the position data of each position indicator within the portion of the physical environment to the computing device.
Patent: the physical environment comprising: one or more cameras configured to: monitor a portion of the physical environment; capture the position data of each position indicator within the portion of the physical environment; and transmit the position data of each position indicator within the portion of the physical environment to a network switch, wherein the position indicators are utilized to derive the predefined position data defined by the rules; the network switch configured to transmit the position data of each position indicator within the portion of the physical environment to the computing device.

Although the claims at issue are not identical, they are not patentably distinct from each other because this application claim is broader than patent claim 1 and is therefore an obvious variant thereof. Claims 2-14 are similarly rejected over claims 2-14 of U.S. Patent No. US 12,287,911 B2, respectively.

Claim Rejections - 35 USC § 112

5. The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C.
112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

6. Claims 7-8 and 14 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claim 7 recites the limitation "the wearable device" in line 1. Claim 8 recites the limitation "the physical object" in line 1. Claim 14 recites the limitation "the first new simulation scenario" in line 1 and the limitation "the second new simulation scenario" in line 2. There are insufficient antecedent bases for these limitations in the claims.

Claim Rejections - 35 USC § 102

7. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

8. Claims 1-4 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Kur (U.S. Pub. No. US 2020/0341542 A1).

As to claim 1, Kur (Figs. 1-16) teaches a virtual reality system (a virtual reality system; Fig. 1) comprising: a physical environment (users 200A-200D and a motion tracking system 130 with the venue such as the room, building, or outdoor area; [0040]) defined at least partially by a physical coordinate system (the motion tracking system triangulates the markers' positions in 6 degrees of freedom (forward and back, left and right, up and down) to determine the markers' location and relative orientation within the 3D space of the VR training environment; [0081], lines 20-24) and comprising one or more physical objects (users 200A-200D) (Figs. 1 and 14); one or more users (users 200A-200D) located in the physical environment (the users 200A-200D and a motion tracking system 130 with the venue such as the room, building, or outdoor area), wherein each of the one or more users (the users 200A-200D) are configured with wearable devices (e.g., a shock mount 210) and a weapon (a weapon simulator 202), and wherein each of the wearable devices (the shock mount 210) and the weapon (the weapon simulator 202) comprise a position indicator configured to detect position data in the physical environment (embodiments may further include a shock mount 210 disposed between the weapon simulator 202 and the inertial tracker 208; this allows the position of the weapon simulator to be tracked as the user moves it; [0041], lines 11-15) (Fig. 1); a computing device (a processor 140 in a scenario management system 102) communicatively coupled to a server (web servers; [0036], line 16), the computing device comprising a simulation engine (multiple cores to achieve a level of parallelism in execution of various tasks such as computations, rendering, and/or scenario generation; [0034], lines 1-7) (Fig. 1); a modular symptoms generator (an inverse kinematic (IK) solver 406 which can be used to model human movement to create virtual targets and/or model the motion of live users; [0044], lines 17-21) configured to interact with the simulation engine (the multiple cores in the processor 140) and the server (the web servers) (Figs. 1 and 3); and the physical environment (the users 200A-200D and the motion tracking system 130 with the venue such as the room, building, or outdoor area) comprising: one or more cameras (cameras 137) configured to: monitor a portion of the physical environment (the cameras 137 may be deployed in a venue such as a room, building, or outdoor area, such that the camera 137 can track the motion of one or more users (200A, 200B, 200C, and 200D); [0040], lines 1-4) (Fig. 1); capture the position data of each position indicator (the wearable sensors can be used to detect motion and position of a user; by having sensors on the limbs of a user, the position and/or orientation of a user can be more precisely determined; each user may further wear a helmet 205; the helmet 205 may include a sensor 207 that can be used to determine head location and/or head orientation of the wearer (user); [0040], lines 8-14) within the portion of the physical environment (the venue such as the room, building, or outdoor area) (Fig. 1); and transmit the position data of each position indicator (e.g., the position of a user with the wearable sensor 206) within the portion of the physical environment (the venue such as the room, building, or outdoor area) to a network switch (e.g., switches in the network 124; [0101], lines 6-9) (Fig. 1); the network switch (the switches in the network 124) configured to transmit the position data of each position indicator (e.g., the position of a user with the wearable sensor 206) within the portion of the physical environment (the venue such as the room, building, or outdoor area) to the computing device (the processor 140 in the scenario management system 102) (Fig. 1).

As to claim 2, Kur teaches the virtual reality system of claim 1, wherein the wearable devices are selected from the group consisting of: a virtual reality head-mounted display (virtual reality goggles or augmented reality goggles; [0040], lines 15-17; Fig. 1), a backpack, at least one ankle strap, and at least one wrist strap.

As to claim 3, Kur teaches the virtual reality system of claim 1, wherein the simulation engine is configured to control a scenario (a scenario which is generated by multiple cores) for the virtual reality system (a virtual reality system) (Fig. 1).

As to claim 4, Kur teaches the virtual reality system of claim 1, wherein the weapon is selected from the group consisting of: a taser, a pepper spray canister, a gun (a gun implied by a gunshot; [0046], line 1), and a flashlight.

Claim Rejections - 35 USC § 103

9. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

10. Claims 5-9 and 11-14 are rejected under 35 U.S.C. 103 as being unpatentable over Kur in view of Li (U.S. Patent No. US 10,976,809 B2).

As to claim 5, Kur (Figs. 1-16) teaches a method executed by a virtual reality system (a virtual reality system; Fig. 1) for providing a simulation scenario (a scenario), the method comprising: receiving, by a simulation engine (multiple cores in the processor 140) of a virtual reality system (the virtual reality system), position data (position) of an object (a user 200A, 200B, 200C, or 200D) during a simulation scenario (the scenario) (Fig. 1); transmitting, by the simulation engine (the multiple cores in the processor 140), the position data (the position of the weapon simulator) of the object to a modular symptoms generator (an inverse kinematic (IK) solver 406 which can be used to model human movement to create virtual targets and/or model the motion of live users; [0044], lines 17-21) of the virtual reality system (the virtual reality system) (Figs. 1 and 3); receiving, by the modular symptoms generator (the inverse kinematic (IK) solver 406), the position data (the position) of the object (the user 200A, 200B, 200C, or 200D) (Figs. 1 and 3); querying, by the modular symptoms generator (the inverse kinematic (IK) solver 406), a database (a database; [0098], line 6) to determine predefined position data for the object (the user 200A, 200B, 200C, or 200D) during the simulation scenario (the scenario) and based on a set of rules (instructions; [0105], line 7) (Figs. 1 and 3).
Kur does not expressly teach comparing, by the modular symptoms generator, the captured position data with the predefined position data for the object during the simulation scenario; in response to a determination that the position data for the object meets or exceeds the predefined position data, executing, by the modular symptoms generator, a first modification on a portion of the simulation scenario; and in response to a determination that the position data for the object fails to meet or exceed the predefined position data, executing, by the modular symptoms generator, a second modification on a portion of the simulation scenario.

Li (Figs. 1-8) teaches comparing, by the modular symptoms generator (the processing circuitry 142), the captured position data with the predefined position data for the object during the simulation scenario (compare the motion displacement with a threshold displacement; S606; Fig. 6); in response to a determination that the position data for the object meets or exceeds the predefined position data (shorter than the threshold displacement), executing, by the modular symptoms generator (the processing circuitry 142), a first modification (provide the first virtual reality scene and a second virtual reality scene according to the motion displacement; S608) on a portion of the simulation scenario (the scenario) (Figs. 1 and 6); and in response to a determination that the position data for the object fails to meet or exceed the predefined position data (exceeds the threshold displacement), executing, by the modular symptoms generator (the processing circuitry 142), a second modification (provide the second virtual reality scene; S610) on a portion of the simulation scenario (the scenario) (Figs. 1 and 6).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have made a comparison of motion displacement as taught by Li in a method executed by a virtual reality system of Kur because the comparison of motion displacement helps the user avoid confusion about whether the user is trying to do a normal movement or trying to switch the virtual reality scene.

As to claim 6, Kur teaches wherein the object is selected from the group consisting of: a user (users 200A-200D), a wearable device, a weapon, and a physical object.

As to claim 7, Kur teaches wherein the wearable device is selected from the group consisting of: a backpack, a head-mounted display (virtual reality goggles or augmented reality goggles; [0040], lines 15-17; Fig. 1), an ankle strap, and a wrist strap.

As to claim 8, Kur teaches wherein the physical object is selected from the group consisting of: a flashlight, a doorway, a wall, a ceiling, a floor, a doorknob, a steering wheel, a step, a surface, a freely movable object, a desk, a table, and a door (a door in the room; [0040], lines 2-3).

As to claim 9, Kur teaches wherein the weapon is selected from the group consisting of: a taser, a pepper spray canister, and a gun (a gun implied by a gunshot; [0046], line 1).

As to claim 11, Li teaches wherein the predefined position data comprises ideal position data (marked area MA; col. 8, lines 8-9) for the object (the user) during the simulation scenario. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have used a marked area as taught by Li in a method executed by a virtual reality system of Kur because the marked area is used to indicate the ideal area for determining a threshold.

As to claim 12, Li teaches wherein the first modification (the first virtual reality scene and a second virtual reality scene according to the motion displacement; S608) differs from the second modification (the second virtual reality scene; S610) (Fig. 8). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have made a comparison of motion displacement as taught by Li in a method executed by a virtual reality system of Kur because the comparison of motion displacement helps the user avoid confusion about whether the user is trying to do a normal movement or trying to switch the virtual reality scene.

As to claim 13, Li teaches wherein the first modification results in a first new simulation scenario (generation of the first virtual reality scene and a second virtual reality scene according to the motion displacement; S608), and wherein the second modification results in a second new simulation scenario (generation of the second virtual reality scene; S610) (Fig. 6). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have made a comparison of motion displacement as taught by Li in a method executed by a virtual reality system of Kur because the comparison of motion displacement helps the user avoid confusion about whether the user is trying to do a normal movement or trying to switch the virtual reality scene.

As to claim 14, Li teaches wherein the first new simulation scenario (generation of the first virtual reality scene and a second virtual reality scene according to the motion displacement; S608) is more favorable (accurate) as compared to the second new simulation scenario (generation of the second virtual reality scene; S610) (Fig. 6).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have made a comparison of motion displacement as taught by Li in a method executed by a virtual reality system of Kur because the comparison of motion displacement helps the user avoid confusion about whether the user is trying to do a normal movement or trying to switch the virtual reality scene.

11. Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Kur in view of Li as applied to claim 5 above, and further in view of Carriere (U.S. Pub. No. US 2015/0354922 A1).

As to claim 10, Kur and Li teach the method of claim 5. Kur and Li do not expressly teach wherein the simulation scenario is selected from the group consisting of: a video gaming simulation scenario, a situational awareness training simulation scenario, an entertainment simulation scenario, an active shooter simulation scenario, a military training simulation scenario, a traffic stop simulation scenario, a car crash simulation scenario, a lifesaving simulation scenario, a law enforcement training simulation scenario, a fire fighter training simulation scenario, a flight simulation scenario, a science education simulation scenario, a medical training simulation scenario, a medical response simulation scenario, an emergency response training scenario, an EMS training scenario, a triage training scenario, a paramedic training scenario, a mission rehearsal simulation scenario, and an architectural training simulation scenario.

Carriere (Figs. 1-11) teaches wherein the simulation scenario is selected from the group consisting of: a video gaming simulation scenario, a situational awareness training simulation scenario, an entertainment simulation scenario, an active shooter simulation scenario (active shooter simulation scenario; [0028], line 6), a military training simulation scenario, a traffic stop simulation scenario, a car crash simulation scenario, a lifesaving simulation scenario, a law enforcement training simulation scenario, a fire fighter training simulation scenario, a flight simulation scenario, a science education simulation scenario, a medical training simulation scenario, a medical response simulation scenario, an emergency response training scenario, an EMS training scenario, a triage training scenario, a paramedic training scenario, a mission rehearsal simulation scenario, and an architectural training simulation scenario. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have used active shooter simulation scenarios as taught by Carriere in a method executed by a virtual reality system of Kur as modified by Li because active shooter simulation scenarios provide realistic, high-intensity, hands-on training that builds muscle memory, improves decision-making under stress, and enhances preparedness by simulating the chaos of an actual attack.

Conclusion

12. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Vandonkelaar (U.S. Patent No. US 10,717,001 B2) is cited to teach a system and method for replaying the activity on request to individuals and to the group at large within a virtual reality (VR) arena for training and efficiency improvement purposes from within the VR system or outside. Rodniansky (U.S. Patent No. US 10,417,441 B2) is cited to teach effectively validating dynamic structured query language (SQL) database queries through database activity monitoring.

Inquiries

13. Any inquiry concerning this communication or earlier communications from the examiner should be directed to KWANG-SU YANG, whose telephone number is (571) 270-7307. The examiner can normally be reached Monday-Friday, 9:00 AM-6:00 PM EST.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Chanh Nguyen, can be reached at (571) 272-7772. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://portal.uspto.gov/external/portal. Should you have questions about access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

/KWANG-SU YANG/
Primary Examiner, Art Unit 2623

Prosecution Timeline

Mar 25, 2025
Application Filed
Jan 24, 2026
Non-Final Rejection — §102, §103, §112, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603047
DISPLAY SUBSTRATE COMPRISING PIXEL DRIVING CIRCUIT HAVING COMPENSATION SUB-CIRCUIT AND OPERATING METHOD THEREFOR, AND DISPLAY APPARATUS
2y 5m to grant Granted Apr 14, 2026
Patent 12603051
DISPLAY PANEL DRIVING METHOD FOR ADJUSTING SUB-PIXEL ROW SCANNING DURATION BASED ON IMAGE DATA, DRIVING CIRCUIT, AND DISPLAY APPARATUS
2y 5m to grant Granted Apr 14, 2026
Patent 12597373
BRIGHTNESS ROLL-OFF COMPENSATION FOR VIRTUAL REALITY DISPLAYS
2y 5m to grant Granted Apr 07, 2026
Patent 12597391
Display Pixel Circuitry with Shared Emission Transistors
2y 5m to grant Granted Apr 07, 2026
Patent 12592170
DISPLAY CONTROL METHOD AND TERMINAL DEVICE FOR VEHICLES
2y 5m to grant Granted Mar 31, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 74%
With Interview (+18.0%): 92%
Median Time to Grant: 2y 9m
PTA Risk: Low
Based on 775 resolved cases by this examiner. Grant probability derived from career allow rate.
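The note above says the projections are derived from the career allow rate. Under the simplest reading, the "with interview" figure is the base probability plus the examiner's observed interview lift, capped at 100%. A minimal sketch in Python (the additive model is an assumption, not the vendor's published methodology):

```python
base_grant_probability = 74.0   # percent; career allow rate from the dashboard
interview_lift = 18.0           # percentage points; examiner's observed lift

# Assumed additive model: the lift stacks on the base, capped at 100%.
with_interview = min(base_grant_probability + interview_lift, 100.0)
print(f"Grant probability with interview: {with_interview:.0f}%")  # 92%
```

The result matches the displayed 92%, which supports the additive reading, though the tool may apply a more sophisticated adjustment internally.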
