Prosecution Insights
Last updated: April 19, 2026
Application No. 18/531,667

INTERACTIVE PROCEDURAL GUIDANCE

Status: Final Rejection (§103)
Filed: Dec 06, 2023
Examiner: PATEL, JITESH
Art Unit: 2612
Tech Center: 2600 — Communications
Assignee: LABLIGHT AR INC.
OA Round: 2 (Final)
Grant Probability: 78% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 2m
With Interview: 91%

Examiner Intelligence

Career Allow Rate: 78% (312 granted / 398 resolved), above average at +16.4% vs TC avg
Interview Lift: +12.4% (moderate), measured across resolved cases with interview
Typical Timeline: 2y 2m average prosecution; 14 applications currently pending
Career History: 412 total applications across all art units
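The headline figures above are internally consistent, and the arithmetic can be checked in a few lines. A minimal sketch, assuming the dashboard simply divides grants by resolved cases and subtracts the Tech Center average (the variable names are illustrative, not the vendor's actual model):

# Sanity check on the Examiner Intelligence figures above.
# Variable names are illustrative assumptions, not a documented schema.
granted = 312        # granted applications among resolved cases
resolved = 398       # total resolved cases
tc_avg_allow = 0.62  # implied Tech Center average allow rate (78.4% - 16.4%)

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")                  # 78.4%, displayed as 78%
print(f"Delta vs TC avg:   {allow_rate - tc_avg_allow:+.1%}")  # +16.4%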

Statute-Specific Performance

Statute   Rate     vs TC Avg
§101       6.2%    -33.8%
§103      61.3%    +21.3%
§102       3.8%    -36.2%
§112      16.6%    -23.4%

Tech Center averages shown are estimates. Based on career data from 398 resolved cases.
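The per-statute deltas all point at the same baseline: subtracting each delta from its rate yields an identical implied Tech Center average of 40.0%, presumably the single average estimate the note above refers to. A small sketch, assuming the dashboard computes each delta as examiner rate minus TC average (the dict layout is an illustrative assumption):

# Recover the implied Tech Center average from each statute's rate and delta.
# Values are copied from the table above.
statutes = {
    "§101": (6.2, -33.8),
    "§103": (61.3, 21.3),
    "§102": (3.8, -36.2),
    "§112": (16.6, -23.4),
}

for statute, (rate, delta) in statutes.items():
    tc_avg = rate - delta  # assumes delta = examiner rate - TC average
    print(f"{statute}: examiner {rate:.1f}% vs implied TC avg {tc_avg:.1f}%")
# Every row implies the same 40.0% baseline, matching the single
# "Tech Center average estimate" noted under the table.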

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

This is in response to applicant's amendment/response filed on 01/15/2026, which has been entered and made of record. Claims 1-5, 7-12 and 14-19 have been amended. Claim 20 has been deleted. New claim 21 has been added.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 3-5, 7-8, 10-12, 14-15, 17-19 and 21 are rejected under 35 U.S.C. 103 as being unpatentable over Bridge et al. (US 20190304188 A1) in view of Kjallstrom et al. (US 20200160607 A1).

Regarding claim 1, Bridge discloses a computer-implemented method (Bridge [0037], “a method”), comprising:

receiving, by an interactive procedural system, first sensor data that reflects first characteristics of an environment where a user wearing an augmented, mixed, or extended reality (AR/MR/XR) headset is located (Bridge [0028], “a system … series of tasks associated with a procedure”; [0053], “The trainee user wearing the AR device 410 may directly interact with physical objects in his/her environment.”; [0054], “A trainer user may wear a VR headset 110 fully immersed inside a VR environment which may replicate the AR environment as seen by the trainee. The trainer user may connect to the trainee's session via the backend server 150 (FIG. 1) through for example, the computing device 190 (receiving, by an interactive procedural system (comprising server 150 and device 190), first sensor data that reflects first characteristics of an environment where a user wearing an augmented, mixed, or extended reality (AR/MR/XR) headset is located) … The trainee user's view of the real-world environment may be captured via a video camera module on the AR device 410 (first sensor data that reflects first characteristics of an environment where a user wearing an augmented, mixed, or extended reality (AR/MR/XR) headset is located)”);

based on the first sensor data, determining, by the interactive procedural system, a location of an object within a field of view of the AR/MR/XR headset (Bridge [0057], “The AR device 410 may include a physical object identification module 445 which may be configured to detect physical objects proximate the device. The location of physical objects may be coordinated with virtual objects in generating the virtual scene.”);

based on the first sensor data, identifying, by the interactive procedural system, a type of the object (Bridge [0053], “The AR device 410 may employ an object identification algorithm to keep track of the position and orientation of one or more physical objects in the environment … The vehicle 430 is a real-life object.”);

based on the type of the object and the location of the object within the field of view of the AR/MR/XR headset, determining, by the interactive procedural system, a skin that is configured to be displayed to the user through the AR/MR/XR headset over the object (Bridge [0054], “The trainer user within the training session may show the trainee user how to perform a particular task by sharing information in the form of digital notes associated with the digital replica of a physical object … The trainee user may see digital notes coupled to and/or superimposed on top of the physical counterpart (overlay an exemplary skin in the form of digital notes, that is configured to be displayed to the user through the AR/MR/XR headset over the object) (for example, in the form of a pop-up window 450 (fig. 9)), and may directly interact with the physical counterpart following the digital note provided by the trainer user as guidance for the trainee.”);

receiving, by the interactive procedural system, second sensor data that reflects second characteristics of the environment where the user wearing the AR/MR/XR headset is located (Bridge [0030], “trainee actions may determine how parts of the simulated environment may change as a training procedure is carried out.”; [0056], “A user input module 135 may provide triggered selections by the user for the processor 105 to use in generating an action. Some user input may be transmitted through a network to for example the server 150 (FIG. 1).”; [0059], “in the role of a host server 150, the computing device 500 may implement for example the functions of storing electronic files with connected users and their respective VR/AR devices … as a host server 150, the computing device 500 may receive and store copies of software modules and coordinate the transmission of VR environments and changes to those environments based on user actions within those environments. (information about changes to an environment by a user, based on user tracking, is interpreted as reading on receiving second sensor data that reflects second characteristics of the environment where the user wearing the AR/MR/XR headset is located)”).

Bridge does not disclose: based on the second data, determining, by the interactive procedural system, that the user has virtually contacted the skin; and based on the user virtually contacting the skin, updating, by the interactive procedural system, the skin.

However, Kjallstrom discloses based on the second data, determining, by the interactive procedural system, that the user has virtually contacted the skin (Kjallstrom fig. 7; [0035], “When the user selects (virtually contacts) the digital note 150 (the user has virtually contacted the note 150/the skin)”); and based on the user virtually contacting the skin, updating, by the interactive procedural system, the skin (Kjallstrom fig. 7; [0035], “When the user selects the digital note 150 (based on the user virtually contacting the skin), the modified information may be retrieved and becomes visible (displaying a modified digital note reads on updating the skin)”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Bridge with Kjallstrom to provide users with a feature to select and update virtual overlay objects. This would have been done to enable users to track progress and make important updates in real time.

Regarding claim 3, Bridge in view of Kjallstrom discloses the method of claim 1, wherein determining the location of the object within the field of view of the AR/MR/XR headset comprises: determining a distance between the object and the AR/MR/XR headset (Bridge [0053], “The AR device 410 may employ an object identification algorithm … By doing so, information in the form of digital notes may be seen inside the AR device 410 as coupled to and/or superimposed on top of a physical object inside the physical environment.”; [0057], “The AR device 410 may include a physical object identification module 445 which may be configured to detect physical objects proximate the device. The location of physical objects may be coordinated with virtual objects in generating the virtual scene. (interpreted as reading on determining a distance between the object and the user so that virtual objects are properly overlaid for display to the user; see [0053] above)”); and determining a pose of the object, and determining the skin that is configured to be displayed to the user through the AR/MR/XR headset over the object is based on the distance between the object and the AR/MR/XR headset and based on the pose of the object (Bridge [0053], “The AR device 410 may establish real-time 6 Degrees-of-Freedom (6-DoF) tracking of the position and orientation of itself relative to the physical world. The AR device 410 may employ an object identification algorithm to keep track of the position and orientation of one or more physical objects in the environment. By doing so, information in the form of digital notes may be seen inside the AR device 410 as coupled to and/or superimposed on top of a physical object inside the physical environment.”).

Regarding claim 4, Bridge in view of Kjallstrom discloses the method of claim 1, comprising: based on the first sensor data, determining, by the interactive procedural system, an additional location of an additional object within a field of view of the AR/MR/XR headset (Bridge [0053], “The AR device 410 may employ an object identification algorithm to keep track of the position and orientation of one or more physical objects (objects comprising an additional object within a field of view of the AR/MR/XR headset) in the environment. By doing so, information in the form of digital notes may be seen inside the AR device 410 as coupled to and/or superimposed on top of a physical object inside the physical environment.”); and based on the first sensor data, identifying, by the interactive procedural system, a type of the additional object, wherein determining the skin that is configured to be displayed to the user through the AR/MR/XR headset over the object is further based on the type of the additional object and the additional location of the object within the field of view of the AR/MR/XR headset (Bridge fig. 9; [0054], “The trainee user may see digital notes coupled to and/or superimposed on top of the physical counterpart (for example, in the form of a pop-up window 450), and may directly interact with the physical counterpart following the digital note provided by the trainer user as guidance for the trainee. Steps in the digital notes may be visible to the trainee as numbered labels (for example, “1”, “2”, “3”) displayed in the trainee's AR view as pointing to each object that needs to be interacted with in sequence. (objects marked 2 and 3 represent additional objects, and their corresponding digital note overlays represent additional skins configured to be displayed to the user through the AR/MR/XR headset over the object, further based on the type of the additional object and the additional location of the object within the field of view of the AR/MR/XR headset)”).

Regarding claim 5, Bridge in view of Kjallstrom discloses the method of claim 1, wherein updating the skin is further based on: determining a change in the location of the object (Kjallstrom [0058], “If the target object moves (determining a change in the location of the object), the digital note may also move (updating the skin), closely following the target object”); determining a change in an additional location of an additional object; determining a change in a pose of the object or the additional object; determining a change in an additional pose of the additional object; determining a change in a first distance between the object and the AR/MR/XR headset; determining a change in a second distance between the additional object and the AR/MR/XR headset; determining a change in a third distance between the object and the additional object; or determining an interaction between the skin of the object and the additional object, or an additional skin of the additional object.

Regarding claim 7, Bridge in view of Kjallstrom discloses the method of claim 1, comprising: accessing, by the interactive procedural system, a set of rules or a procedure related to an activity of the user (Kjallstrom fig. 6; [0035], “in FIG. 6, by enabling an exploded view, the manager clearly sees the disassembly of the digital twin of the engine parts 270 inside a fully immersive VR environment 205 (accessing a procedure related to an activity to fix an issue)”), wherein updating the skin is further based on the set of rules or the procedure (Kjallstrom fig. 6; [0035], “The manager user may modify the copy 250 of the digital note that is attached to the digital twin (wherein updating the digital note/skin is further based on the procedure).”).

Claim 8 recites a system which corresponds to the function performed by the method of claim 1. As such, the mapping and rejection of claim 1 above is considered applicable to the system of claim 8. Additionally, Bridge discloses a system (Bridge [0028], “a system”), comprising: one or more processors; and memory including a plurality of computer-executable components that are executable by the one or more processors to perform a plurality of actions (Bridge [0060], “one or more processors or processing units 510, a system memory 520, data storage 530, a computer program product 540 having a set of program modules 545 including files and executable instructions”).

Claim 10 recites a system which corresponds to the function performed by the method of claim 3. As such, the mapping and rejection of claim 3 above is considered applicable to the system of claim 10. Claim 11 recites a system which corresponds to the function performed by the method of claim 4. As such, the mapping and rejection of claim 4 above is considered applicable to the system of claim 11. Claim 12 recites a system which corresponds to the function performed by the method of claim 5. As such, the mapping and rejection of claim 5 above is considered applicable to the system of claim 12. Claim 14 recites a system which corresponds to the function performed by the method of claim 7. As such, the mapping and rejection of claim 7 above is considered applicable to the system of claim 14.

Claim 15 recites one or more non-transitory computer-readable media which corresponds to the function performed by the method of claim 1. As such, the mapping and rejection of claim 1 above is considered applicable to the one or more non-transitory computer-readable media of claim 15. Additionally, Bridge discloses one or more non-transitory computer-readable media storing computer-executable instructions that upon execution cause one or more computers to perform acts (Bridge [0060], “one or more processors or processing units 510, a system memory 520, data storage 530, a computer program product 540 having a set of program modules 545 including files and executable instructions”).

Claim 17 recites one or more non-transitory computer-readable media which corresponds to the function performed by the method of claim 3. As such, the mapping and rejection of claim 3 above is considered applicable to the one or more non-transitory computer-readable media of claim 17. Claim 18 recites one or more non-transitory computer-readable media which corresponds to the function performed by the method of claim 4. As such, the mapping and rejection of claim 4 above is considered applicable to the one or more non-transitory computer-readable media of claim 18. Claim 19 recites one or more non-transitory computer-readable media which corresponds to the function performed by the method of claim 5. As such, the mapping and rejection of claim 5 above is considered applicable to the one or more non-transitory computer-readable media of claim 19.

Regarding claim 21, Bridge in view of Kjallstrom discloses the method of claim 1, wherein determining that the user has virtually contacted the skin comprises determining that the user has virtually contacted the skin without virtually or physically contacting the object (Kjallstrom [0035], “The manager user may modify the copy 250 of the digital note that is attached to the digital twin (modifying the digital note without contacting the underlying object virtually or physically).”).

Claims 2, 6, 9, 13 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Bridge in view of Kjallstrom and further in view of Puig et al. (US 20230296405 A1).

Regarding claim 2, Bridge in view of Kjallstrom discloses the method of claim 1, but does not disclose wherein the first sensor data and the second sensor data are generated by sensors that (i) comprise a camera, a time of flight sensor, a structured illumination sensor, an infrared sensor, and a light detection and ranging scanner and (ii) are integrated with or separate from the AR/MR/XR headset.
However, Puig discloses the first sensor data and the second sensor data are generated by sensors that (i) comprise a camera, a time of flight sensor, a structured illumination sensor, an infrared sensor, and a light detection and ranging scanner, (ii) are integrated with the AR/MR/XR headset, and (iii) are separate from the AR/MR/XR headset (Puig [0013], “There are different types of distance sensors (first sensor data and the second sensor data are generated by sensors) which may be used to perform the method according to the present disclosure”; [0014], “ToF (Time of Flight) based sensors (a time of flight sensor) … infrared sensors (an infrared sensor) … LIDAR (a light detection and ranging scanner)”; [0015], “stereoscopic camera (camera)”; [0017], “Structured light scanners” (a structured illumination sensor); [0109], “the visual aid device (AR/MR/XR headset) may comprise a display”; [0114], “the signaling being such as colors marking detected zones or obstacles, arrows pointing towards detected zones or obstacles, sound signaling, etc. either for obstacles which are within the range view of … the display at that moment (AR/MR/XR media displayed to a user)”; [0120], “Such device may comprise any of the previously described different types of distance sensors (sensors are integrated with the AR/MR/XR headset)”; [0145], “the IMU sensor is comprised within the stereo camera 12, but it could be a separate IMU sensor (sensor separate from the AR/MR/XR headset)”).

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Bridge with Puig to utilize a variety of sensors to aid users in determining objects in an environment for improved navigation. This would have been done to generate accurate information about user environments and thereby provide accurate and customized information to users. See, for example, Puig [0114], “the signaling may comprise many different types of media, depending on, for example, the users preferences and disabilities, and the type of environment the user is walking through”.

Regarding claim 6, Bridge in view of Kjallstrom discloses the method of claim 1, but does not disclose wherein the skin causes the object to appear larger to the user when the object is within the field of view of the AR/MR/XR headset.

However, Puig discloses the skin causes the object to appear larger to the user when the object is within the field of view of the AR/MR/XR headset (Puig fig. 4D; [0114], “the signaling being such as colors marking detected zones or obstacles, arrows pointing towards detected zones or obstacles, sound signaling, etc. either for obstacles which are within the range view of … the display at that moment (AR/MR/XR media displayed to a user)”; [0187], “FIG. 4D shows an example of a 2D projection (skin) … wherein several obstacles are selected to be relevant (skin causes the object to appear larger to the user when the object is within the field of view of the AR/MR/XR headset) to be shown to the user, taking into account the point of view of the user within the 3D virtual map.”).

It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Bridge with Puig to display relevant virtual objects in an enlarged manner. This would have been done to highlight relevant objects for users so that the objects can be clearly viewed in a distinguishable manner.
Claim 9 recites a system which corresponds to the function performed by the method of claim 2. As such, the mapping and rejection of claim 2 above is considered applicable to the system of claim 9. Claim 13 recites a system which corresponds to the function performed by the method of claim 6. As such, the mapping and rejection of claim 6 above is considered applicable to the system of claim 13. Claim 16 recites one or more non-transitory computer-readable media which corresponds to the function performed by the method of claim 2. As such, the mapping and rejection of claim 2 above is considered applicable to the one or more non-transitory computer-readable media of claim 16.

Response to Arguments

Applicant's arguments filed 01/15/2026 have been fully considered, but they are moot in view of the amendments made to the claims, which required further consideration and search and a new ground of rejection.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

See the notice of references cited (PTO-892) for prior art made of record, including art that is not relied upon but considered pertinent to applicant's disclosure.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JITESH PATEL whose telephone number is (571) 270-3313. The examiner can normally be reached 8am - 5pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Said A. Broome, can be reached at (571) 272-2931. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JITESH PATEL/
Primary Examiner, Art Unit 2612
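The conclusion's reply deadlines reduce to simple date arithmetic: the shortened statutory period runs three months from the Jan 24, 2026 mailing date, and the absolute statutory cutoff is six months from that date. A minimal sketch of that math (an illustration only, not docketing advice; add_months is our own helper, not a standard-library function):

from datetime import date

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping the day to the target month's end."""
    y, m = divmod(d.month - 1 + months, 12)
    year, month = d.year + y, m + 1
    # Days per month, honoring the full leap-year rule for February.
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    last_day = [31, 29 if leap else 28, 31, 30, 31, 30,
                31, 31, 30, 31, 30, 31][month - 1]
    return date(year, month, min(d.day, last_day))

mailed = date(2026, 1, 24)  # mailing date of this final action
print("Shortened statutory period ends:", add_months(mailed, 3))  # 2026-04-24
print("Absolute six-month cutoff:", add_months(mailed, 6))        # 2026-07-24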

Prosecution Timeline

Dec 06, 2023: Application Filed
Jul 12, 2025: Non-Final Rejection — §103
Jan 15, 2026: Response Filed
Jan 24, 2026: Final Rejection — §103 (current)

Precedent Cases

Applications with similar technology granted by the same examiner

Patent 12602866
DIGITAL TWIN AUTHORING AND EDITING ENVIRONMENT FOR CREATION OF AR/VR AND VIDEO INSTRUCTIONS FROM A SINGLE DEMONSTRATION
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12597245
INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND STORAGE MEDIUM
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12586313
DAMAGE DETECTION FROM MULTI-VIEW VISUAL DATA
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12579739
2D CONTROL OVER 3D VIRTUAL ENVIRONMENTS
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12579765
DEFINING AND MODIFYING CONTEXT AWARE POLICIES WITH AN EDITING TOOL IN EXTENDED REALITY SYSTEMS
Granted Mar 17, 2026 (2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 78%
With Interview: 91% (+12.4%)
Median Time to Grant: 2y 2m
PTA Risk: Moderate
Based on 398 resolved cases by this examiner. Grant probability derived from career allow rate.
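The with-interview figure follows from the examiner stats above if the interview lift is simply added to the career allow rate. A minimal sketch of that derivation; treating the lift as additive is an assumption about this dashboard's model, not a documented formula:

# Reproduce the projection figures from the examiner stats above.
base = 312 / 398        # career allow rate, ~78.4% (displayed as 78%)
interview_lift = 0.124  # +12.4% lift from resolved cases with interview

with_interview = base + interview_lift
print(f"Grant probability: {base:.0%}")            # 78%
print(f"With interview:    {with_interview:.0%}")  # 91%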
