Prosecution Insights
Last updated: April 19, 2026
Application No. 18/655,551

AUTHORING SYSTEM FOR INTERACTIVE VIRTUAL REALITY ENVIRONMENTS

Non-Final OA: §DP (Double Patenting)
Filed: May 06, 2024
Examiner: HAILU, TADESSE
Art Unit: 2174
Tech Center: 2100 (Computer Architecture & Software)
Assignee: Human Mode LLC
OA Round: 1 (Non-Final)
Grant Probability: 78% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 4m
With Interview: 82%

Examiner Intelligence

Grants 78% of resolved cases, above average.

Career Allow Rate: 78% (747 granted / 960 resolved; +22.8% vs TC avg)
Interview Lift: +4.5% (minimal), among resolved cases with interview
Typical Timeline: 3y 4m average prosecution; 29 applications currently pending
Career History: 989 total applications across all art units
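
The headline figures above are simple ratios of the career counts. A minimal Python sketch that reproduces them (the helper names are ours, not this product's):

# Minimal sketch; the counts and the +4.5% lift come from the cards above.
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a fraction of resolved cases."""
    return granted / resolved

def with_interview(base: float, lift: float) -> float:
    """Apply an additive interview lift, capped at 100%."""
    return min(base + lift, 1.0)

base = allow_rate(747, 960)            # 0.778 -> reported as 78%
boosted = with_interview(base, 0.045)  # +4.5 points -> ~82%
print(f"Career allow rate: {base:.1%}")                   # 77.8%
print(f"Grant probability with interview: {boosted:.1%}") # 82.3%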

Statute-Specific Performance

§101: 5.8% (-34.2% vs TC avg)
§103: 38.1% (-1.9% vs TC avg)
§102: 41.1% (+1.1% vs TC avg)
§112: 9.0% (-31.0% vs TC avg)
Tech Center averages are estimates. Based on career data from 960 resolved cases.
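
All four deltas are consistent with a single Tech Center baseline: each equals the examiner's rate minus 40.0%. A short sketch recovering them (the 40.0% baseline is inferred from the figures above, not stated by the source):

# Examiner's per-statute rates, as shown above.
examiner_rate = {"101": 5.8, "103": 38.1, "102": 41.1, "112": 9.0}
TC_AVG = 40.0  # Tech Center average estimate implied by the reported deltas

for statute, rate in examiner_rate.items():
    print(f"§{statute}: {rate:.1f}% ({rate - TC_AVG:+.1f}% vs TC avg)")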

Office Action

§DP
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

2. This Office Action is in response to the Preliminary Amendment filed on 08/21/2024.

3. The IDS filed on 08/21/2024 is considered and entered into the application file.

4. Claims 1-20 are pending in this application; all claims are examined and rejected under the NONSTATUTORY DOUBLE PATENTING rejection herein below.

Double Patenting

5. Claims 1-17 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-17 of U.S. Patent No. US 11,977,725 B2. Although the claims at issue are not identical, they are not patentably distinct from each other because, as furnished in the side-by-side comparison below in which the claims of the patent are compared to the current application claims, they recite only differences which would have been obvious to one of ordinary skill in the art at the time of invention, such as simply omitting or adding steps or elements along with their functions. For example, each independent claim of the current application replaces "action script" with "module of the computer executable code having one or more function," and/or removes "action script" from all the claims. The present claims represent either attempts to secure broader coverage than the parent patent or more narrowly worded claims which nonetheless contain limitations which the claims of the parent encompass. Thus, it would have been obvious to modify (add or remove) such claim elements, because the removal or addition of limitations does not affect the methodology and/or structure of the remaining claims.

6. Claims 18-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over independent claims 1, 7 and 12 of U.S. Patent No. US 11,977,725 B2 in view of U.S. Patent No. 5,946,674. The newly added claims 18-20, which depend on independent claims 1, 7 and 12 respectively, introduce Turing-complete computer programmable code comprising one or more scriptable functions. Even though Turing-complete computer programmable code is well known in the art of computer scripting languages, the claims of U.S. Patent No. 11,977,725 B2 fall short of describing the claimed Turing-complete computer programmable code as required in each of dependent claims 18, 19 and 20. On the other hand, U.S. Patent No. 5,946,674 discloses a Turing Complete Computer Implemented Machine Learning Method and System (see title of U.S. Patent No. 5,946,674 and col. 31, lines 59-62). It would have been obvious to a person of ordinary skill in the art to combine the teaching of U.S. Patent No. 5,946,674 with US 11,977,725 B2, because a system utilizing Turing-complete code provides additional advantages including, among others, the use of several machine registers, dynamic allocation of memory, variable length of programs, multiple input parameters to functions, and unlimited memory through indexed memory (see U.S. Patent No. 5,946,674, col. 31, line 59 through col. 32, line 10). Therefore, it would have been obvious to combine the above teaching of U.S. Patent No. 5,946,674 with US 11,977,725 B2 to obtain the invention as specified in claims 18, 19 and 20.

Claim comparison: US Application 18/655,551 vs. US Patent 11,977,725

US Application 18/655,551:
1. (Currently Amended) An electronic device comprising: a processor; a headset having a visual feedback screen; one or more user interface devices; and a non-transitory computer readable medium storing computer executable code that when executed by the processor cause the processor to: project a virtual reality environment to the headset, the virtual reality environment having an object, the object being within the virtual reality environment, created by a container having properties that define how the object interacts within the virtual reality environment, and having a three-dimensional mesh with coordinates defining a boundary of the object; in response to a first input from the one or more user interface devices, display an action menu within the virtual reality environment on the visual feedback screen, the action menu having a list of pre-defined modules; and in response to a second input from the one or more user interface devices, update the container for the object to include a selected module.

2. (Original) The electronic device of claim 1, wherein the object is identified before the first input.

3. (Original) The electronic device of claim 1, wherein the object is identified after the first input.

4. (Currently Amended) The electronic device of claim 1, wherein the non-transitory computer readable medium storing computer executable code that when executed by the processor cause the processor to, in response to a triggering event, update the container of the object to include the selected module.

5. (Original) The electronic device of claim 4, wherein the triggering event is raised when a user interacts with a trigger object thereby meeting a pre-determined trigger condition.

6. (Currently Amended) The electronic device of claim 1, wherein the non-transitory computer readable medium storing computer executable code that when executed by the processor cause the processor to: in response to a third input from at least one user interface device selecting a specific trigger object from a trigger menu, display a trigger condition menu having a list of one or more trigger conditions for the trigger object; and in response to a selection of a trigger condition for the trigger object, allows the user to place the trigger object into the virtual reality environment, wherein the trigger condition, when met, causes the processor to update [[the]] object computer code of the object to include the selected module.

7. (Currently Amended) An electronic device comprising: a processor; a headset having a visual feedback screen; one or more user interface devices; and a non-transitory computer readable medium storing computer executable code that when executed by the processor cause the processor to: project a virtual reality environment to the headset, the virtual reality environment having one or more object, each of the one or more object being within the virtual reality environment, created by a container having properties that define how the one or more object interact within the virtual reality environment, and having a first three-dimensional mesh with first coordinates defining a first boundary of the object; and in response to an input from at least one user interface device, connect a trigger object, having a second three-dimensional mesh with second coordinates defining a second boundary of the trigger object, selected from a trigger menu to the container of a particular object of the one or more object within the virtual reality environment, thereby forming a complex object within the container by linking the particular object and the trigger object using coordinate links between the first coordinates and the second coordinates, the complex object including one or more module of computer executable code having one or more function, the one or more function configured to affect one or more property of the container within the virtual reality environment.

8. (Original) The electronic device of claim 7, wherein the trigger object is identified before the input.

9. (Original) The electronic device of claim 7, wherein the trigger object is identified after the input.

10. (Currently Amended) The electronic device of claim 7, wherein the non-transitory computer readable medium storing computer executable code that when executed by the processor cause the processor to, in response to a trigger event, update the complex object to activate the one or more module.

11. (Original) The electronic device of claim 10, wherein the trigger event is raised when the user interacts with the trigger object thereby meeting a pre-determined trigger condition.

12. (Currently Amended) A non-transitory computer readable medium storing computer executable code that when executed by a processor cause the processor to: project a virtual reality environment to a headset, the virtual reality environment having one or more object, each of the one or more object being within the virtual reality environment, and created by a container having properties that define how the one or more object interact within the virtual reality environment, the one or more object having a three-dimensional mesh with coordinates defining a boundary of the one or more object; in response to a first input from at least one user interface device selecting a first object of the one or more object within the virtual reality environment, display an action menu on a visual feedback screen having a list of pre-defined modules; and in response to a second input from the at least one user interface device, update the container for the first object to include a selected module.

13. (Original) The non-transitory computer readable medium of claim 12, wherein the first object is identified before the first input.

14. (Original) The non-transitory computer readable medium of claim 12, wherein the first object is identified after the first input.

15. (Currently Amended) The non-transitory computer readable medium of claim 12, wherein the non-transitory computer readable medium storing computer executable code that when executed by the processor cause the processor to, in response to a triggering event, update the container of the first object to include the selected module.

16. (Original) The non-transitory computer readable medium of claim 15, wherein the triggering event is raised when a user interacts with a trigger object thereby meeting a pre-determined trigger condition.

17. (Currently Amended) The non-transitory computer readable medium of claim 12, wherein the non-transitory computer readable medium storing computer executable code that when executed by the processor cause the processor to, in response to a second input from at least one user interface device selecting a specific trigger object from a trigger menu, display a trigger condition menu having a list of one or more trigger conditions for the trigger object; and in response to a selection of a first trigger condition for the trigger object, allows a user to place the trigger object into the virtual reality environment, the trigger condition, when met, causes the processor to update the container of the first object to include the selected module.

18. (New) The electronic device of claim 1 wherein the modules of computer executable code are modules of Turing-complete computer programmable code comprising one or more scriptable functions.

19. (New) The electronic device of claim 7 wherein the modules of computer executable code are modules of Turing-complete computer programmable code comprising one or more scriptable functions.

20. (New) The non-transitory computer readable medium of claim 12 wherein the modules of computer executable code are modules of Turing-complete computer programmable code comprising one or more scriptable functions.

US Patent 11,977,725:

1. An electronic device comprising: a processor; a headset having a visual feedback screen; one or more user interface devices; and a non-transitory computer readable medium storing computer executable code that when executed by the processor cause the processor to: project a virtual reality environment to the headset, the virtual reality environment having an object, the object being within the virtual reality environment, created by a container having properties that define how the object interacts within the virtual reality environment, and having a three-dimensional mesh with coordinates defining a boundary of the object; in response to a first input from the at least one user interface device, display an action menu within the virtual reality environment on the visual feedback screen, the action menu having a list of pre-defined action scripts, the action scripts including a module of computer executable code having one or more function, the one or more function configured to affect the one or more property of the container within the virtual reality environment; and in response to a second input from the at least one user interface device, update the container for the object to include a selected action script, wherein the second input includes dragging, within the virtual reality environment on the visual feedback screen, the selected action script from the action menu onto the object.

2. The electronic device of claim 1, wherein the object is identified before the first input.

3. The electronic device of claim 1, wherein the object is identified after the first input.

4. The electronic device of claim 1, wherein the non-transitory computer readable medium storing computer executable code that when executed by the processor cause the processor to, in response to a triggering event, update the container of the object to include a selected action script.

5. The electronic device of claim 4, wherein the triggering event is raised when a user interacts with a trigger object thereby meeting a pre-determined trigger condition.

6. The electronic device of claim 1, wherein the non-transitory computer readable medium storing computer executable code that when executed by the processor cause the processor to: in response to a third input from at least one user interface device selecting a specific trigger object from a trigger menu, display a trigger condition menu having a list of one or more trigger conditions for the trigger object; and in response to a selection of a trigger condition for the trigger object, allows the user to place the trigger object into the virtual reality environment, wherein the trigger condition, when met, causes the processor to update the object computer code of the object to include the selected action script.

7. An electronic device comprising: a processor; a headset having a visual feedback screen; one or more user interface devices; and a non-transitory computer readable medium storing computer executable code that when executed by the processor cause the processor to: project a virtual reality environment to the headset, the virtual reality environment having one or more object, each of the one or more object being within the virtual reality environment, created by a container having properties that define how the one or more object interact within the virtual reality environment, and having a first three-dimensional mesh with first coordinates defining a first boundary of the object; and in response to an input from at least one user interface device, connect a trigger object, having a second three-dimensional mesh with second coordinates defining a second boundary of the trigger object, selected from a trigger menu to the container of a particular object of the one or more object within the virtual reality environment, thereby forming a complex object within the container by linking the particular object and the trigger object using coordinate links between the first coordinates and the second coordinates, the complex object including one or more action script; display, in the virtual reality environment on the visual feedback screen, a scriptable menu having trigger scriptable action for the trigger object; and in response to a selection of a trigger scriptable action for the trigger object, allow a user to associate the trigger scriptable action to the one or more action script of the complex object.

8. The electronic device of claim 7, wherein the trigger object is identified before the input.

9. The electronic device of claim 7, wherein the trigger object is identified after the input.

10. The electronic device of claim 7, wherein the non-transitory computer readable medium storing computer executable code that when executed by the processor cause the processor to, in response to a trigger event, update the complex object to activate the at least one action script.

11. The electronic device of claim 10, wherein the trigger event is raised when the user interacts with the trigger object thereby meeting a pre-determined trigger condition.

12. A non-transitory computer readable medium storing computer executable code that when executed by a processor cause the processor to: project a virtual reality environment to a headset, the virtual reality environment having one or more object, each of the one or more object being within the virtual reality environment, and created by a container having properties that define how the one or more object interact within the virtual reality environment, the one or more object having a three-dimensional mesh with coordinates defining a boundary of the one or more object; in response to a first input from at least one user interface device selecting a first object of the one or more object within the virtual reality environment, display an action menu on a visual feedback screen having a list of pre-defined action scripts, the action scripts including a module of computer executable code having one or more function, the one or more function configured to affect one or more property of the first object of the one or more object within the virtual reality environment; and in response to a second input from the at least one user interface device, update the container for the first object to include a selected action script, wherein the second input includes dragging, within the virtual reality environment on the visual feedback screen, the selected action script from the action menu onto the first object.

13. The non-transitory computer readable medium of claim 12, wherein the first object is identified before the first input.

14. The non-transitory computer readable medium of claim 12, wherein the first object is identified after the first input.

15. The non-transitory computer readable medium of claim 12, wherein the non-transitory computer readable medium storing computer executable code that when executed by the processor cause the processor to, in response to a triggering event, update the container of the first object to include a selected action script.

16. The non-transitory computer readable medium of claim 15, wherein the triggering event is raised when a user interacts with a trigger object thereby meeting a pre-determined trigger condition.

17. The non-transitory computer readable medium of claim 12, wherein the non-transitory computer readable medium storing computer executable code that when executed by the processor cause the processor to, in response to a second input from at least one user interface device selecting a specific trigger object from a trigger menu, display a trigger condition menu having a list of one or more trigger conditions for the trigger object; and in response to a selection of a first trigger condition for the trigger object, allows a user to place the trigger object into the virtual reality environment, the trigger condition, when met, causes the processor to update the container of the first object to include a selected action script.

Examiner's comments:
- The current application replaces "action script" with "module", or removes "action script" from all the claims.
- The current application deleted "action script" and adds modules with one or more functions.
- The current application replaces "a selected action script" with "the selected module".
- Claims 18-20 are newly added and recite Turing-complete computer programmable code.

Conclusion
7. Any inquiry concerning this communication or earlier communications from the examiner should be directed to TADESSE HAILU, whose telephone number is (571) 272-4051 and whose email address is Tadesse.hailu@USPTO.GOV. The examiner can normally be reached Monday through Friday, 9:30-5:30 (Eastern time).

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Bashore, William L., can be reached at (571) 272-4088. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/TADESSE HAILU/
Primary Examiner, Art Unit 2174
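
For orientation only, the following is a hypothetical Python sketch of the data model the claims recite: a container with properties and a 3D mesh boundary, attachable modules of executable code, and trigger-driven updates. Every identifier is ours; neither the application nor the patent discloses an implementation in this form, and the coordinate-link mechanics of claims 7-11 are omitted for brevity.

# Hypothetical illustration only; all names are ours, not the application's.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Module:
    """A module of computer executable code having one or more function (claims 1, 12)."""
    name: str
    functions: list[Callable[["Container"], None]] = field(default_factory=list)

@dataclass
class Container:
    """Container whose properties define how an object interacts in the VR environment."""
    properties: dict[str, object]
    mesh: list[tuple[float, float, float]]  # 3D mesh coordinates defining the boundary
    modules: list[Module] = field(default_factory=list)

def select_module(container: Container, module: Module) -> None:
    """Second-input step: update the container for the object to include a selected module."""
    container.modules.append(module)

def on_triggering_event(container: Container) -> None:
    """Triggering-event step (claims 4, 10, 15): run each attached module's functions."""
    for module in container.modules:
        for fn in module.functions:
            fn(container)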

Prosecution Timeline

May 06, 2024
Application Filed
Jan 16, 2026
Non-Final Rejection — §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596435
CONTACT OR CONTACTLESS INTERFACE WITH TEMPERATURE HAPTIC FEEDBACK
2y 5m to grant; granted Apr 07, 2026

Patent 12578976
SYSTEMS AND METHODS FOR AFFINITY-DRIVEN INTERFACE GENERATION
2y 5m to grant; granted Mar 17, 2026

Patent 12578849
METHOD, APPARATUS, ELECTRONIC DEVICE AND READABLE STORAGE MEDIUM FOR PAGE PROCESSING
2y 5m to grant; granted Mar 17, 2026

Patent 12572198
USER INTERFACES FOR GAZE TRACKING ENROLLMENT
2y 5m to grant; granted Mar 10, 2026

Patent 12566621
CUSTOMIZATION AND ENRICHMENT OF USER INTERFACES USING LARGE LANGUAGE MODELS
2y 5m to grant; granted Mar 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 78%
With Interview: 82% (+4.5%)
Median Time to Grant: 3y 4m
PTA Risk: Low

Based on 960 resolved cases by this examiner. Grant probability derived from career allow rate.
