Prosecution Insights
Last updated: April 19, 2026
Application No. 16/875,688

AUGMENTED AND VIRTUAL REALITY SIMULATOR FOR PROFESSIONAL AND EDUCATIONAL TRAINING

Current Office Action: Non-Final (§101, §103)
Filed: May 15, 2020
Examiner: LANE, DANIEL E
Art Unit: 3715
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Simx Inc.
OA Round: 5 (Non-Final)

Grant Probability: 4% (At Risk)
Projected OA Rounds: 5-6
Projected Time to Grant: 3y 5m
Grant Probability With Interview: 13%

Examiner Intelligence

This examiner grants only 4% of cases.

Career Allow Rate: 4% (12 granted / 290 resolved; -65.9% vs TC avg)
Interview Lift: +8.7% (moderate; resolved cases with vs. without an interview)
Avg Prosecution: 3y 5m (typical timeline)
Total Applications: 332 across all art units; 42 currently pending
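The headline figures above can be cross-checked with a few lines of arithmetic. This is an illustrative sketch: the Tech Center average below is back-solved from the stated "-65.9% vs TC avg" delta, not independently sourced.

```python
# Cross-check of the examiner statistics shown above.
# Only the granted/resolved counts are primary data; the TC average
# is back-solved from the reported delta and is illustrative only.

granted, resolved = 12, 290

allow_rate = granted / resolved * 100           # career allow rate, percent
print(f"Career allow rate: {allow_rate:.1f}%")  # ~4.1%, displayed as 4%

# "-65.9% vs TC avg" implies the Tech Center average allow rate:
tc_avg = allow_rate + 65.9
print(f"Implied TC average: {tc_avg:.1f}%")     # ~70.0%

# "+8.7% interview lift" is the difference in allow rate between
# resolved cases with and without an examiner interview.
interview_lift = 8.7
```

Note how far this examiner sits below the implied Tech Center norm; the dashboard's "At Risk" label follows directly from these numbers.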

Statute-Specific Performance

§101: 29.0% (-11.0% vs TC avg)
§103: 19.2% (-20.8% vs TC avg)
§102: 17.8% (-22.2% vs TC avg)
§112: 29.7% (-10.3% vs TC avg)

Deltas are measured against a Tech Center average estimate. Based on career data from 290 resolved cases.
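A quick consistency check on the table above: back-solving each row's "vs TC avg" delta shows every statute implies the same Tech Center average estimate. This sketch only rearranges the numbers already shown; it adds no new data.

```python
# Each row reports (rate, delta_vs_tc_avg); since delta = rate - tc_avg,
# the implied Tech Center average is rate - delta for every statute.

rows = {
    "101": (29.0, -11.0),
    "103": (19.2, -20.8),
    "102": (17.8, -22.2),
    "112": (29.7, -10.3),
}

for statute, (rate, delta) in rows.items():
    implied_tc_avg = rate - delta
    print(f"§{statute}: implied TC average = {implied_tc_avg:.1f}%")
```

All four rows resolve to a 40.0% Tech Center average, consistent with a single estimate being used across statutes.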

Office Action

Grounds of rejection: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114 was filed in this application after a decision by the Patent Trial and Appeal Board, but before the filing of a Notice of Appeal to the Court of Appeals for the Federal Circuit or the commencement of a civil action. Since this application is eligible for continued examination under 37 CFR 1.114 and the fee set forth in 37 CFR 1.17(e) has been timely paid, the appeal has been withdrawn pursuant to 37 CFR 1.114 and prosecution in this application has been reopened pursuant to 37 CFR 1.114. Applicant’s submission filed on 30 September 2025 has been entered.

This is a response to Applicant’s amendments filed on 30 September 2025, wherein:
- Claims 1, 10, and 19 are amended.
- Claims 6 and 15 are original.
- Claims 2, 3, 5, 8, 9, 11, 12, 14, 17, 18, 20, 22, and 23 are previously presented.
- Claims 4, 7, 13, 16, and 21 are canceled.
- Claims 1-3, 5, 6, 8-12, 14, 15, 17-20, 22, and 23 are pending.

Claim Objections

Claims 1-3, 5, 6, 8-12, 14, 15, 17-20, 22, and 23 are objected to because of the following informalities: The claims recite both “said” and “the”; uniformity is recommended to increase clarity. Each of claims 2, 3, 5, 6, 8, 9, 11, 12, 14, 15, 17, 18, 20, 22, and 23 recites either “[t]he method/media/system of claim” or “[t]he method/media according to claim”; uniformity is recommended to increase clarity.
Dependent claims 2, 3, 5, 6, 8, 9, 11, 12, 14, 15, 17, 18, 20, 22, and 23 inherit the deficiencies of their respective parent claims, and are thus objected to under the same rationale. Appropriate correction is required.

Claim Rejections - 35 USC § 101

The text of those sections of Title 35, U.S. Code 101 not included in this action can be found in a prior Office action. Claims 1-3, 5, 6, 8-12, 14, 15, 17-20, 22, and 23 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without including additional elements that are sufficient to amount to significantly more than the judicial exception itself.

Step 1

The claims are directed to a method and products which fall under the four statutory categories (STEP 1: YES).

Step 2A, Prong 1

Independent claim 1 recites: A method comprising: receiving, wirelessly by at least one augmented or virtual reality device during simulation of a medical procedure, code broadcast from a marker embedded in a physical object in a real space; responsive to receiving the code broadcast from the marker embedded in the physical object, matching the marker to a virtual medical tool associated with an instruction set for simulating characteristics of the medical tool in the simulation of the medical procedure; projecting, by the at least one augmented or virtual reality device during simulation of the medical procedure based in part on the instruction set associated with the virtual medical tool, a virtual avatar and the virtual medical tool into the real space or a virtual space within an augmented or virtual reality environment; detecting, using at least one sensor of the at least one augmented or virtual reality device based on data broadcast by the marker, an interaction between the virtual medical tool and the virtual avatar that are projected into the real space or the virtual space within the augmented or virtual reality
environment, wherein the interaction is detected by correlating a virtual simulated spatial position of the virtual avatar with a physical location of the physical object to which the virtual medical tool is registered; generating, based on the interaction between the medical tool and the virtual avatar detected using the at least one sensor of the at least one augmented or virtual reality device, a record of the interaction; comparing the record of the interaction to a performance metric that identifies a pre-determined set of goal actions for a simulation of a medical operation; and providing, by the at least one augmented or virtual reality device based at least in part on said comparing and executing the instruction set matched to the marker, real-time feedback to at least one participant in the augmented or virtual reality environment during the simulation of the medical operation. Independent claim 10 recites: One or more non-transitory computer-readable media storing instructions which, when executed by one or more hardware processors, cause operations comprising: receiving, wirelessly by at least one augmented or virtual reality device during simulation of a medical procedure, code broadcast from a marker embedded in a physical object in a real space; responsive to receiving the code broadcast from the marker embedded in the physical object, matching the marker to a virtual medical tool associated with an instruction set for simulating characteristics of the medical tool in the simulation of the medical procedure; projecting, by the at least one augmented or virtual reality device during simulation of the medical procedure based in part on the instruction set associated with the virtual medical tool, a virtual avatar and the virtual medical tool into the real space or a virtual space within an augmented or virtual reality environment; detecting, using at least one sensor of the at least one augmented or virtual reality device based on data broadcast by the 
marker, an interaction between the virtual medical tool and the virtual avatar that are projected into the real space or the virtual space within the augmented or virtual reality environment, wherein the interaction is detected by correlating a virtual simulated spatial position of the virtual avatar with a physical location of the physical object to which the virtual medical tool is registered; generating, based on the interaction between the medical tool and the virtual avatar detected using the at least one sensor of the at least one augmented or virtual reality device, a record of the interaction; comparing the record of the interaction to a performance metric that identifies a pre-determined set of goal actions for a simulation of a medical operation; and providing, by the at least one augmented or virtual reality device based at least in part on said comparing and executing the instruction set matched to the marker, real-time feedback to at least one participant in the augmented or virtual reality environment, during the simulation of the medical operation. 
Independent claim 19: A system comprising: one or more hardware processors; one or more non-transitory computer-readable media storing instructions which, when executed by the one or more hardware processors, cause: receiving, wirelessly by at least one augmented or virtual reality device during simulation of a medical procedure, code broadcast from a marker embedded in a physical object in a real space; responsive to receiving the code broadcast from the marker embedded in the physical object, matching the marker to a virtual medical tool associated with an instruction set for simulating characteristics of the medical tool in the simulation of the medical procedure; projecting, by the at least one augmented or virtual reality device during simulation of the medical procedure based in part on the instruction set associated with the virtual medical tool, a virtual avatar and the virtual medical tool into the real space or a virtual space within an augmented or virtual reality environment; detecting, using at least one sensor of the at least one augmented or virtual reality device based on data broadcast by the marker, an interaction between the virtual medical tool and the virtual avatar that are projected into the real space or the virtual space within the augmented or virtual reality environment, wherein the interaction is detected by correlating a virtual simulated spatial position of the virtual avatar with a physical location of the physical object to which the virtual medical tool is registered; generating, based on the interaction between the medical tool and the virtual avatar detected using the at least one sensor of the at least one augmented or virtual reality device, a record of the interaction; comparing the record of the interaction to a performance metric that identifies a pre-determined set of goal actions for a simulation of a medical operation; and providing, by the at least one augmented or virtual reality device based at least in part on said 
comparing and executing the instruction set matched to the marker, real-time feedback to at least one participant in the augmented or virtual reality environment, during the simulation of the medical operation.

All of the foregoing underlined elements amount to the abstract idea grouping of a certain method of organizing human activity because they involve managing personal behavior or interactions between people (including social activities, teaching, and following rules or instructions) by collecting information (e.g., the detecting and generating-a-record steps), analyzing it (e.g., the comparing, generating-a-performance-evaluation, and determining steps), and outputting the results of the collection and analysis (e.g., generating a performance evaluation). This collection, analysis, and outputting of results also amounts to the abstract idea grouping of mental processes, as the claims, under their broadest reasonable interpretation, cover performance of the limitations in the mind with the aid of pen and paper (including observation, evaluation, judgment, and opinion). Therefore, the claims recite a judicial exception. (STEP 2A, PRONG 1: YES).

Step 2A, Prong 2

This judicial exception is not integrated into a practical application because the claims do not include additional elements that are sufficient to integrate the exception into a practical application under the considerations set forth in MPEP 2106.04(d). The elements of the claims above that are not underlined constitute additional elements.
The following additional elements merely generally link the judicial exception to a particular technological environment or field of use:
- receiving, wirelessly by at least one augmented or virtual reality device during simulation of a medical procedure, code broadcast from a marker embedded in a physical object in a real space (claims 1, 10, and 19);
- responsive to receiving the code broadcast from the marker embedded in the physical object, matching the marker to a virtual medical tool associated with an instruction set for simulating characteristics of the medical tool in the simulation of the medical procedure (claims 1, 10, and 19);
- projecting, by the at least one augmented or virtual reality device during simulation of the medical procedure based in part on the instruction set associated with the virtual medical tool, a virtual avatar and the virtual medical tool into the real space or a virtual space within an augmented or virtual reality environment (claims 1, 10, and 19);
- at least one augmented or virtual reality device (claims 1, 10, and 19);
- at least one sensor (claims 1, 10, and 19);
- an augmented or virtual reality environment (claims 1, 10, and 19);
- at least one computer device (claims 2, 11, and 20);
- participant apparatuses (claims 5 and 14);
- rebroadcasting the data to a plurality of augmented or virtual reality devices response to receiving the data by that least one sensor from the marker wherein the plurality of augmented or virtual reality devices simultaneously or near simultaneously change the augmented or virtual reality environment responsive to receive the data that is rebroadcast (claim 22); and
- responsive to detecting the interaction, projecting, by the at least one augmented or virtual reality device during simulation of the medical procedure based in part on the instruction set associated with the virtual medical tool, a simulated behavior of the virtual avatar (claim 23).
Although the claims recite the elements identified above, these elements are recited at a high level of generality in a conventional arrangement for performing their basic computer functions (i.e., receiving, processing, outputting data) associated with augmented and virtual reality. This is evidenced by at least Figs. 1, 2, 4-6, and 9, which illustrate the components as nondescript black boxes or stock images. Further evidence is provided by the specification; see, for example, at least para. 26-31, 45-51, 58-74, and 84-96. Additionally, one of ordinary skill in the art would immediately recognize that all of the newly added limitations with respect to a marker (i.e., receiving code broadcast, matching the marker, projecting the virtual medical tool, projecting a simulated behavior of the virtual avatar) merely describe conventional operations of nearly all, if not all, forms of extended reality (XR) systems that use a marker. Thus, the judicial exception is not implemented with, or used in, a particular machine or manufacture. Additionally, the claims do not recite any limitations that improve the functionality of the computer system, because the claimed recording (i.e., generating a record), comparing, generating, and determining merely perform the steps of processing data, in addition to the mere claiming of acquiring data (the detecting step) and broadcasting/rebroadcasting data (i.e., merely transmitting data), but are not tied to improving any functionality of the computer system. Furthermore, the at least one augmented or virtual reality device is merely recited to perform its insignificant extrasolution activity of projecting a virtual avatar into a real or virtual space within an augmented or virtual reality environment, and the at least one sensor is merely recited to perform its insignificant pre-solution activities of data gathering.
The mere inclusion of a generically recited at least one augmented or virtual reality device that is conventionally configured to perform its conventional functions merely indicates a field of use or technological environment in which to apply a judicial exception. Applicant’s mere use of a generically recited at least one augmented or virtual reality device with a sensor and a marker to implement an augmented or virtual reality environment in which to perform the method closely relates to example (vi) provided in MPEP 2106.05(h) - limiting the abstract idea of collecting information, analyzing it, and displaying certain results of the collection and analysis to data related to the electric power grid - because limiting application of the abstract idea to power-grid monitoring is simply an attempt to limit the abstract idea to a particular technological environment. Thus, the at least one augmented or virtual reality device and the at least one sensor are also not tied to improving any functionality of the computer system. The claims do not apply or use a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition; for instance, they are silent regarding any specific treatment or prophylaxis for any specific disease or medical condition. Additionally, the additional elements do not apply or use a judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment. In particular, it is merely the use of conventional or generic technology in a nascent but well-known environment (in the instant case, applying existing XR technology to the environment of medical training, not any improvement in XR technology itself), without any assertion that the invention reflects an inventive solution to any problem presented in the technology itself.
Accordingly, based on all of the considered factors, these additional elements do not integrate the abstract idea into a practical application. Therefore, the claims are directed to the judicial exception. (STEP 2A, PRONG 2: NO).

Step 2B

The independent and dependent claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception under the considerations set forth in MPEP 2106.05. As identified in Step 2A, Prong 2, above, the claimed system and the process it performs do not require the use of a particular machine, nor do they result in the transformation of an article. The claims do not involve an improvement in a computer or other technology. Although the claims recite components (identified in Step 2A, Prong 2) for performing at least some of the recited functions, these elements are recited at a high level of generality in a conventional arrangement for performing their basic computer functions (i.e., receiving, processing, outputting data). This is at least evidenced by the manner of disclosure, which indicates that the additional elements are sufficiently well known that the specification does not need to describe their particulars to satisfy 35 USC 112(a), as identified in Step 2A, Prong 2, above. Additionally, one of ordinary skill in the art would immediately recognize that all of the limitations with respect to a marker merely describe conventional operations of nearly all, if not all, forms of systems that use a marker, including an electromagnetic marker. Thus, the judicial exception is not implemented with, or used in, a particular machine or manufacture. Furthermore, this also evidences that the components are merely an attempt to link the abstract idea to a particular technological environment, but do not result in an improvement to the technology or computer functions employed.
In particular, the at least one augmented or virtual reality device and the at least one sensor, being recited and organized in a generic fashion to perform their generic functions of augmented or virtual reality projection and data gathering, respectively, merely add insignificant extrasolution activity to the judicial exception (e.g., mere data gathering in conjunction with a law of nature or abstract idea). Further, generating a record, as claimed, is well-understood, routine, and conventional electronic recordkeeping as identified in MPEP 2106.05(d), which cites Alice Corp. Pty. Ltd. v. CLS Bank Int'l, 573 U.S. 208, 225, 110 USPQ2d 1984 (2014) (creating and maintaining "shadow accounts"), and Ultramercial, 772 F.3d at 716, 112 USPQ2d at 1755 (updating an activity log). Additionally, as identified in Step 2A, Prong 2, the mere inclusion of a generically recited at least one augmented or virtual reality device with a sensor and a marker that are conventionally configured to perform their conventional functions merely indicates a field of use or technological environment in which to apply a judicial exception. This further evidences that the claims do not recite any specific rules with specific characteristics that improve the functionality of the computer system. Viewed as a whole, these additional claim elements do not provide meaningful limitations to transform the abstract idea into a patent-eligible application of the abstract idea such that the claims amount to significantly more than the abstract idea itself (STEP 2B: NO). Therefore, the claims are rejected under 35 USC 101 as being directed to non-statutory subject matter.

Claim Rejections - 35 USC § 103

The text of those sections of Title 35, U.S. Code 103 not included in this action can be found in a prior Office action. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1, 2, 5, 6, 8-11, 14, 15, 17-20, 22, and 23 are rejected under 35 U.S.C. 103 as being unpatentable over Thaler et al. (US 2012/0282583, hereinafter referred to as Thaler).

Regarding claims 1, 10, and 19, Thaler teaches a method (claim 1), one or more non-transitory computer-readable media storing instructions which, when executed by one or more hardware processors, cause operations (claim 10), and a system (claim 19) comprising: receiving, wirelessly by at least one augmented or virtual reality device during simulation of a medical procedure, code broadcast from a marker embedded in a physical object in a real space (Thaler, para. 11, “Input units 105 may comprise means for receiving input from physical medical tools that may be simulated, e.g., as described herein.
For example, physical objects or tools such as handles, activation buttons and the like, as well as real medical tools that may be configured to produce output signals, e.g., signals related to a movement, location, resistance, orientation or force applied, may be connected to one or more input units 105 to enable manipulation of a digital simulation of such physical objects or tools. Input units 105 may include a wired or wireless network interface card (NIC). Specifically, input units 105 may receive input from stationary transmitter/receiver unit 181 and mobile transmitter/receiver unit 160.”; responsive to receiving the code broadcast from the marker embedded in the physical object, simulating characteristics of the medical tool as a virtual medical tool in the simulation of the medical procedure (Thaler, para. 11, “Input units 105 may comprise means for receiving input from physical medical tools that may be simulated, e.g., as described herein. For example, physical objects or tools such as handles, activation buttons and the like, as well as real medical tools that may be configured to produce output signals, e.g., signals related to a movement, location, resistance, orientation or force applied, may be connected to one or more input units 105 to enable manipulation of a digital simulation of such physical objects or tools.” A digital simulation of such physical objects or tools are the virtual medical tools. Thaler also refers to these as “3D models” of the physical medical tools. Para. 16, “Reflected IR light may be received by the emitting or other device and, based on properties of reflected light, the location, orientation or other aspects of a medical tool or any other object may be determined It will be understood that embodiments of the invention are not limited by the system or method used for determining a location, position or orientation of objects in a space near mannequin 170.” Para. 
20, “Any information, data or parameters required by device 101 in order to perform, or participate in a simulation of an invasive procedure may be stored in data repository 140. For example, management unit 135 may interact, e.g., over a network and possibly according to and/or by implementing a predefined protocol, with any external data repository and may be thus received any relevant information, e.g., provided by a manufacturer of mannequin 170 or a manufacturer or provider of simulation medical tools and may stored received in data repository 140.” Para. 26, “products available from Polhemus® and/or NDI® may be used to track a medical tool, finger or element used in performing a simulated procedure. Using tracking information provided by a tracking system and a location, position, orientation or other spatial parameters of a dummy or doll (e.g., mannequin 170), digital models of one or more of a tool, finger or element and a digital model of the dummy or doll may be manipulated (and displayed), in real-time, such that the digital models adequately and closely represent one or more of the tool, finger, element and doll.”); projecting, by the at least one augmented or virtual reality device during simulation of the medical procedure based in part on the instruction set associated with the virtual medical tool (Thaler, Fig. 2, Performing, by a user using the physical medical tool, a simulation of a medical procedure 230), a virtual avatar and the virtual medical tool into the real space or virtual space within an augmented or virtual reality environment (Thaler, Fig. 2, Providing a physical model of an anatomical structure and providing a digital 3D model of the anatomical structure 210, Providing a physical medical tool and providing a digital 3D model of the physical medical tool 215. 
The digital 3D model of the anatomical structure is a virtual avatar and the digital 3D model of the physical medical tool is a virtual medical tool projected into a real or virtual space within an augmented or virtual reality environment.); detecting, using at least one sensor of the at least one augmented or virtual reality device based on data broadcast by the marker, an interaction between the virtual medical tool and the virtual avatar that are projected into the real space or the virtual space within the augmented or virtual reality environment, wherein the interaction is detected by correlating a virtual simulated spatial position of the virtual avatar with a physical location of the physical object to which the virtual medical tool is registered (Thaler, Fig. 2, Determining a location and/or an orientation of the medical tool based on a signal transmitted by the transmitter and received by the receiver wherein the location and/or orientation are with respect to a location of the physical model 225, Performing, by a user using the physical medical tool, a simulation of a medical procedure 230, Manipulating the digital 3D models of the anatomical structure and the medical tool according to the location of the physical medical tool 235; para. 32, “simulation of a medical procedure may comprise an image or graphical representation of an anatomical organ, e.g., a model as described herein, that may be rotated or otherwise positioned, or may be made to imitate a real anatomical system, e.g., change or evolve with time, change shape in response to an operation of, or an interaction with a medical tool or substance, bleed, or otherwise present or display real anatomical organ's behavior and related tools, medicine or other aspects.” Para. 40, “For example, a 3D model of a medical tool may be moved, rotated or made to change its shape based on a location, position or orientation of a related physical tool. 
A 3D model of a mannequin may be manipulated based on a location, position or orientation of a medical tool. For example, a modeled tissue or organ included in a 3D digital model of a mannequin may be made to bend, stretch or otherwise change shape, location or orientation based on a position, location or orientation of a medical tool, for example, in order to simulate an interaction of a medical tool with a mannequin. Accordingly, the 3D models of a medical tool and mannequin may be manipulated such that they closely duplicate, imitate, replicate, repeat, copy or reproduce any movement or other aspect of the physical medical tool and mannequin. The 3D model of the mannequin may be manipulated such that it imitates or reproduces the response or interaction of a real subject, patient or physical mannequin with the medical tool.”); generating, based on the interaction between the medical tool and the virtual avatar detected using the at least one sensor of the at least one augmented or virtual reality device, a record of the interaction (Thaler, para. 23, “Stationary transmitter/receiver unit 181 may be firmly secured to a table or support tray such that it can not be moved. The location and/or orientation of stationary transmitter/receiver unit 181 may be known and/or recorded. For example, the distance of stationary transmitter/receiver unit 181 from mannequin 170 ( or a specific part of mannequin 170) may be known. Location position and/or orientation of stationary transmitter/receiver unit 181 may be recorded, e.g., stored in data repository 140 and/or loaded into memory 130. Generally, any information or parameter related to a location, position or orientation of stationary transmitter/receiver unit 181 and of mannequin 170 may be known and recorded. 
Accordingly, the location, position and/or orientation of stationary transmitter/receiver unit 181 with respect or relevant to a location and/or orientation of mannequin 170 (which, as described herein, may be stationary and its location and/or orientation may be known and/or recorded) may be known and/or recorded.”); comparing the record of the interaction to a performance metric that identifies a pre-determined set of goal actions for a simulation of a medical operation (Thaler, para. 49, “a performance of a procedure by an expert may be recorded and any aspect of a performance of the procedure by a trainee may be compared or otherwise related to the recorded procedure as performed by the expert.”); and providing, by the at least one augmented or virtual reality device based at least in part on said comparing and executing the instruction set matched to the marker, real-time feedback to at least one participant in the augmented or virtual reality environment, during the simulation of the medical operation (Thaler, Fig. 2, Providing a user with feedback related to a performance of the medical procedure based on a location and/or orientation of at least one of: the finger, physical medical tool and the element 255, Determining a score related to a performance of the medical procedure and recording the score 260; para. 47, “providing of feedback may be performed simultaneously or concurrently with performance of a related simulation of a procedure, or it may be otherwise at the same time. In some embodiments, providing feedback may be synchronized or otherwise coordinated with a progress, state, mode, context or any relevant aspect of a simulated procedure. Feedback may be provided together with the simulated procedure, e.g., while the simulated procedure is in progress. 
For example, a single display may be used to present feedback and one or more 3D models, e.g., 3D models of a tool, finger and/or element.” It is noted that the physical medical tool is matched to a virtual medical tool which behaves in the augmented/virtual space based on manipulation of the physical medical tool as identified earlier and throughout Thaler.). Thaler does not explicitly teach matching the marker to a virtual medical tool associated with an instruction set. However, Thaler does teach matching the marker to a virtual representation associated with an instruction set (Thaler, para. 42, “3D digital model may be automatically synchronized with a set of physical parts modeling an aspect of a patient, e.g., the set of removable or replaceable elements included or installed in mannequin 170. For example, mobile transmitter/receiver unit 160 may receive a signal or information (e.g., an identification code) from a passive or active component installed in a replaceable physical parts installed in mannequin 170, transmit the identification code to controller 131 (e.g., to management unit 135) that may instruct model generation 110 to generate a model according to specific parameters selected based on an identification code.”). Thaler also teaches that “the location, orientation or other aspects of a medical tool or any other object may be determined. It will be understood that embodiments of the invention are not limited by the system or method used for determining a location, position or orientation of objects in a space near mannequin 170 (see Thaler at para. 16)”, “[a]ny information, data or parameters required by device 101 in order to perform, or participate in a simulation of an invasive procedure may be stored in data repository 140. 
For example, management unit 135 may interact, e.g., over a network and possibly according to and/or by implementing a predefined protocol, with any external data repository and may be thus received any relevant information, e.g., provided by a manufacturer of mannequin 170 or a manufacturer or provider of simulation medical tools and may stored received in data repository 140 (see Thaler at para. 20)”, and “products available from Polhemus® and/or NDI® may be used to track a medical tool, finger or element used in performing a simulated procedure. Using tracking information provided by a tracking system and a location, position, orientation or other spatial parameters of a dummy or doll (e.g., mannequin 170), digital models of one or more of a tool, finger or element and a digital model of the dummy or doll may be manipulated (and displayed), in real-time, such that the digital models adequately and closely represent one or more of the tool, finger, element and doll (see Thaler at para. 26).” Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention for Thaler to include matching the marker to a virtual medical tool associated with an instruction set because one of ordinary skill in the art before the effective filing date of the claimed invention would readily know how to apply an identification code to a virtual medical tool in the same manner as the identification code for automatically synchronizing a 3D digital model with a set of physical parts modeling an aspect of a patient as explicitly taught in Thaler. Thus, it is merely using a known technique in the same way. 
Regarding claims 2, 11, and 20, Thaler teaches the method according to claim 1, the media according to claim 10, and the system of claim 19, wherein the record further includes data associated with a second detected interaction comprising at least one of a body movement, direct input of data into the at least one computer device, at least one vocalization, or an expression of specific ideas via body movement or vocalization (Thaler, Fig. 2, Attaching a transmitter to a finger of a user, providing a digital 3D model of the finger and receiving a location and/or an orientation parameter related to a location and/or an orientation of the finger, wherein the location and/or orientation are with respect to a location of the physical model 240, manipulating the digital 3D models of the anatomical structure and the finger according to the location and/or orientation of the finger 245).

Regarding claims 5 and 14, Thaler teaches the method according to claim 1 and the media according to claim 10, further comprising: generating a file for the augmented or virtual reality environment session (Thaler, para. 23, “Stationary transmitter/receiver unit 181 may be firmly secured to a table or support tray such that it can not be moved. The location and/or orientation of stationary transmitter/receiver unit 181 may be known and/or recorded. For example, the distance of stationary transmitter/receiver unit 181 from mannequin 170 (or a specific part of mannequin 170) may be known. Location position and/or orientation of stationary transmitter/receiver unit 181 may be recorded, e.g., stored in data repository 140 and/or loaded into memory 130. Generally, any information or parameter related to a location, position or orientation of stationary transmitter/receiver unit 181 and of mannequin 170 may be known and recorded. 
Accordingly, the location, position and/or orientation of stationary transmitter/receiver unit 181 with respect or relevant to a location and/or orientation of mannequin 170 (which, as described herein, may be stationary and its location and/or orientation may be known and/or recorded) may be known and/or recorded.”); wherein the file for the augmented or virtual reality environment session is comprised of a record of an execution of a file containing an instruction set controlling the augmented or virtual reality environment (Thaler, para. 20, “Any information, data or parameters required by device 101 in order to perform, or participate in a simulation of an invasive procedure may be stored in data repository 140.”), timing data regarding when instructions were executed and events occurred (Thaler, para. 48, “A time for completing a predefined part of an operation may be preconfigured and the time a user takes to complete such part may be recorded (e.g., controller 131 may start a timer when an element is located in a first location and stop the timer when the element reaches a second location).”), a plurality of events occurring during execution of the instruction set (Thaler, para. 48, “preferred location or placement of an element may be calculated or it may be indicated or dictated by an expert and the scoring may take also into account information about adverse events such as vessel perforation, hitting nerves and amount of blood lost during the operation” Para. 49, “a performance of a procedure by an expert may be recorded and any aspect of a performance of the procedure by a trainee may be compared or otherwise related to the recorded procedure as performed by the expert.”), a plurality of data collected from a plurality of participant apparatuses, and video representing the augmented or virtual reality environment from a perspective of at least one participant or a virtual third party perspective (Thaler, para. 
23, “Stationary transmitter/receiver unit 181 may be firmly secured to a table or support tray such that it can not be moved. The location and/or orientation of stationary transmitter/receiver unit 181 may be known and/or recorded. For example, the distance of stationary transmitter/receiver unit 181 from mannequin 170 (or a specific part of mannequin 170) may be known. Location position and/or orientation of stationary transmitter/receiver unit 181 may be recorded, e.g., stored in data repository 140 and/or loaded into memory 130. Generally, any information or parameter related to a location, position or orientation of stationary transmitter/receiver unit 181 and of mannequin 170 may be known and recorded. Accordingly, the location, position and/or orientation of stationary transmitter/receiver unit 181 with respect or relevant to a location and/or orientation of mannequin 170 (which, as described herein, may be stationary and its location and/or orientation may be known and/or recorded) may be known and/or recorded.”).

Regarding claims 6 and 15, Thaler teaches the method according to claim 5 and the media according to claim 14, wherein an event in the plurality of events corresponds to at least one change to the augmented or virtual reality environment (Thaler, Fig. 2, Manipulating the digital 3D models of the anatomical structure and the medical tool according to the location of the physical medical tool 235).

Regarding claims 8 and 17, Thaler teaches the method according to claim 1 and the media according to claim 10, wherein the predetermined set of goal actions are defined by an author of an instruction set for executing the simulation of the medical operation within the virtual or augmented reality environment (Thaler, para. 49, “a performance of a procedure by an expert may be recorded and any aspect of a performance of the procedure by a trainee may be compared or otherwise related to the recorded procedure as performed by the expert.”). 
Regarding claims 9 and 18, Thaler teaches the method according to claim 1 and the media according to claim 10, further comprising: determining, based on data identifying the orientation broadcast by the marker, a position of the virtual medical tool in the virtual space relative to the virtual avatar projected in the virtual or augmented reality environment (Thaler, Fig. 2, Determining a location and/or an orientation of the medical tool based on a signal transmitted by the transmitter and received by the receiver wherein the location and/or orientation are with respect to a location of the physical model 225).

Regarding claim 22, Thaler teaches the method of claim 1, further comprising: rebroadcasting the data to a plurality of augmented or virtual reality devices (Thaler, para. 17, “According to embodiments of the invention, device 101 may be implemented on a single computational device or alternatively, in a distributed configuration, on two or more different computational devices.”) responsive to receiving the data by the at least one sensor from the marker (Thaler, para. 25, “mobile transmitter/receiver unit 160 may transmit a first electromagnetic signal in a first direction, e.g., along the X axis in a given coordinate system, and another signal in a second direction, e.g., along a Y axis in the same coordinate system. Accordingly, by analyzing a signal received from mobile transmitter/receiver unit 160 by stationary transmitter/receiver unit 181, the location and orientation of mobile transmitter/receiver unit 160 may be determined, monitored and/or tracked.”); wherein the plurality of augmented or virtual reality devices simultaneously or near simultaneously change the augmented or virtual reality environment responsive to receiving the data that is rebroadcast (Thaler, Fig. 2, Manipulating the digital 3D models of the anatomical structure and the medical tool according to the location of the physical medical tool 235). 
Regarding claim 23, Thaler teaches the method of claim 1, further comprising: responsive to detecting the interaction, projecting, by the at least one augmented or virtual reality device during simulation of the medical procedure based in part on the instruction set associated with the virtual medical tool, a simulated behavior of the virtual avatar (Thaler, para. 32, “simulation of a medical procedure may comprise an image or graphical representation of an anatomical organ, e.g., a model as described herein, that may be rotated or otherwise positioned, or may be made to imitate a real anatomical system, e.g., change or evolve with time, change shape in response to an operation of, or an interaction with a medical tool or substance, bleed, or otherwise present or display real anatomical organ's behavior and related tools, medicine or other aspects.” Para. 40, “For example, a 3D model of a medical tool may be moved, rotated or made to change its shape based on a location, position or orientation of a related physical tool. A 3D model of a mannequin may be manipulated based on a location, position or orientation of a medical tool. For example, a modeled tissue or organ included in a 3D digital model of a mannequin may be made to bend, stretch or otherwise change shape, location or orientation based on a position, location or orientation of a medical tool, for example, in order to simulate an interaction of a medical tool with a mannequin. Accordingly, the 3D models of a medical tool and mannequin may be manipulated such that they closely duplicate, imitate, replicate, repeat, copy or reproduce any movement or other aspect of the physical medical tool and mannequin. The 3D model of the mannequin may be manipulated such that it imitates or reproduces the response or interaction of a real subject, patient or physical mannequin with the medical tool.” Para. 
44, “the 3D model of the mannequin may be manipulated such that it imitates or reproduces the response or interaction of a real subject, patient or physical mannequin with a physician's finger or hand.”).

Claims 3 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Thaler et al. (US 2012/0282583, hereinafter referred to as Thaler) as applied to claims 1 and 10 above, in view of von Lubitz et al.1 (“Simulation-based medical training: the Medical Readiness Trainer concept and the preparation for civilian and military medical field operations”, hereinafter referred to as von Lubitz).

Regarding claims 3 and 12, Thaler teaches the method according to claim 1 and the media according to claim 10. Thaler does not explicitly teach wherein the record further includes data for a vocalization diagnosing a medical condition of the virtual avatar projected into the augmented or virtual reality environment. However, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention for the record in Thaler to include data for a vocalization diagnosing a medical condition of the virtual avatar projected into the augmented or virtual reality environment because the presentation unit in Thaler “may control, coordinate or manage a display or presentation of video, audio or other aspects [of] a simulated procedure” (see Thaler at para. 19) and one of ordinary skill in the art would recognize that a vocalization diagnosing a medical condition of a patient, in this case a virtual patient, is a routine audio aspect of a medical procedure, simulated or real. See, for example, von Lubitz at pg. 
2, section 3, Virtual Reality in Medical Training, which recites that it “is then obvious that, in order to train for medical events involving complex interaction of medical skills (manual and diagnostic), behaviors, critical time limits, and stress, a platform incorporating all of these elements into a mirror image of ‘real life’ is need[ed].”

Response to Arguments

Applicant's arguments with respect to the rejection of the claims under 35 USC 101 have been fully considered but they are not persuasive. Applicant asserts that the amended claims recite limitations directed to providing real-time feedback by an augment[ed] or virtual reality device in a manner other than what is routine, conventional, or well understood. Examiner respectfully disagrees. This is merely a conclusory statement made without evidentiary support, and is not persuasive. In contrast, one can merely look at the prior art to see that these are conventional marker-based operations and broadcasting operations in XR systems. These newly added and amended limitations, which identify the conventional use of a marker in an augmented or virtual reality environment, only further the assessment that the mere use of a generic augmented or virtual reality device does not integrate the judicial exception into a practical application nor does it add significantly more. In particular, the mere inclusion of a generically recited at least one augmented or virtual reality device that is conventionally configured with a marker and a sensor to perform their conventional functions merely indicates a field of use or technological environment in which to apply a judicial exception. 
Applicant’s mere use of a generically recited at least one augmented or virtual reality device with a sensor and a marker to implement an augmented or virtual reality environment in which to perform the method closely relates to example (vi) provided in MPEP 2106.05(h) - limiting the abstract idea of collecting information, analyzing it, and displaying certain results of the collection and analysis to data related to the electric power grid, because limiting application of the abstract idea to power-grid monitoring is simply an attempt to limit the abstract idea to a particular technological environment. The rejection stands.

Applicant's arguments with respect to the rejections of the claims under 35 USC 102 and 103 have been fully considered but they are not persuasive. Applicant asserts that these rejections were reversed per the Board Decision and that the pending claims remain patentable. Examiner is not persuaded. Applicant is directed to the rejections of the claims under 35 USC 103, above, which have been updated to address the amendments to the claims and rectify the issue raised in the Board Decision that “Thaler fails to disclose, explicitly or implicitly, that these signals [providing information regarding the physical tool] are used to match the marker to a virtual medical tool and instructions for simulating the tool's characteristics when representing the tool as a virtual tool in the extended reality simulation.” See PTAB Decision at pg. 14. The rejections stand.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANIEL LANE whose telephone number is (303)297-4311. The examiner can normally be reached Monday - Friday 8:00 - 4:30 MT. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. 
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Xuan Thai, can be reached at (571) 272-7147. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DANIEL LANE/Examiner, Art Unit 3715

1 von Lubitz DKJE, Beier K, Freer J, et al. Simulation-based medical training: the medical readiness trainer concept and the preparation for civilian and military medical field operations. VRIC, Virtual Reality International Conference, Laval Virtual 2001.

Prosecution Timeline

May 15, 2020
Application Filed
Sep 29, 2021
Non-Final Rejection — §101, §103
Jan 05, 2022
Response Filed
Apr 12, 2022
Final Rejection — §101, §103
Jul 14, 2022
Examiner Interview Summary
Jul 14, 2022
Applicant Interview (Telephonic)
Jul 15, 2022
Request for Continued Examination
Jul 19, 2022
Response after Non-Final Action
Sep 13, 2022
Non-Final Rejection — §101, §103
Dec 20, 2022
Response Filed
Apr 06, 2023
Final Rejection — §101, §103
Jul 17, 2023
Notice of Allowance
Jul 17, 2023
Response after Non-Final Action
Sep 18, 2023
Response after Non-Final Action
Sep 26, 2023
Response after Non-Final Action
Oct 30, 2023
Response after Non-Final Action
Jan 03, 2024
Response after Non-Final Action
Jan 04, 2024
Response after Non-Final Action
Jan 05, 2024
Response after Non-Final Action
Jan 05, 2024
Response after Non-Final Action
Jul 30, 2025
Response after Non-Final Action
Sep 30, 2025
Request for Continued Examination
Oct 02, 2025
Response after Non-Final Action
Oct 12, 2025
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 11810474
SYSTEMS AND METHODS FOR NEURAL PATHWAYS CREATION/REINFORCEMENT BY NEURAL DETECTION WITH VIRTUAL FEEDBACK
2y 5m to grant Granted Nov 07, 2023
Patent 11398160
SYSTEM, APPARATUS, AND METHOD FOR EDUCATING AND REDUCING STRESS FOR PATIENTS WITH ILLNESS OR TRAUMA USING AN INTERACTIVE LOCATION-AWARE TOY AND A DISTRIBUTED SENSOR NETWORK
2y 5m to grant Granted Jul 26, 2022
Patent 11250723
VISUOSPATIAL DISORDERS DETECTION IN DEMENTIA USING A COMPUTER-GENERATED ENVIRONMENT BASED ON VOTING APPROACH OF MACHINE LEARNING ALGORITHMS
2y 5m to grant Granted Feb 15, 2022
Patent 11210961
SYSTEMS AND METHODS FOR NEURAL PATHWAYS CREATION/REINFORCEMENT BY NEURAL DETECTION WITH VIRTUAL FEEDBACK
2y 5m to grant Granted Dec 28, 2021
Patent 11004551
SLEEP IMPROVEMENT SYSTEM, AND SLEEP IMPROVEMENT METHOD USING SAID SYSTEM
2y 5m to grant Granted May 11, 2021
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
4%
Grant Probability
13%
With Interview (+8.7%)
3y 5m
Median Time to Grant
High
PTA Risk
Based on 290 resolved cases by this examiner. Grant probability derived from career allow rate.
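The projection figures above are arithmetically consistent with a simple derivation, and the page itself says the grant probability comes from the career allow rate. Here is a minimal sketch of that calculation; it is an assumption about how the tool derives its numbers, not a confirmed methodology, and the variable names are illustrative only:

```python
# Hypothetical reconstruction of the dashboard's derived figures (assumed):
# allow rate = granted / resolved cases, and the with-interview probability
# = allow rate + the reported interview lift.
granted = 12            # cases granted by this examiner
resolved = 290          # total resolved cases
interview_lift = 0.087  # reported +8.7% lift with an interview

allow_rate = granted / resolved               # ~0.0414
with_interview = allow_rate + interview_lift  # ~0.1284

print(f"Grant probability: {allow_rate:.0%}")      # Grant probability: 4%
print(f"With interview: {with_interview:.0%}")     # With interview: 13%
```

Under this reading, 12/290 rounds to the displayed 4%, and adding the 8.7% interview lift rounds to the displayed 13%.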
