Prosecution Insights
Last updated: April 19, 2026
Application No. 19/030,397

VISUAL TRACKING OF PERIPHERAL DEVICES

Non-Final OA: §103 • §DP (Double Patenting)
Filed: Jan 17, 2025
Examiner: KHAN, IBRAHIM A
Art Unit: 2628
Tech Center: 2600 — Communications
Assignee: Magic Leap Inc.
OA Round: 1 (Non-Final)

Grant Probability: 82% (Favorable)
Expected OA Rounds: 1-2
Expected Time to Grant: 2y 2m
Grant Probability with Interview: 94%

Examiner Intelligence

Career Allow Rate: 82% (447 granted / 546 resolved), above average (+19.9% vs TC avg)
Interview Lift: +12.0% (moderate) among resolved cases with an interview
Avg Prosecution: 2y 2m (typical timeline)
Currently Pending: 17 applications
Total Applications: 563 (across all art units)
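The headline examiner figures above are simple ratios, and they check out. A quick arithmetic sanity check (assuming, as the numbers suggest, that Total Applications = resolved + currently pending):

```python
# Sanity check of the examiner statistics shown above.
# All inputs come from the dashboard; nothing here is new data.

granted, resolved = 447, 546
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # ~81.9%, displayed as 82%

# The dashboard reports +19.9% vs the Tech Center average,
# which implies a TC 2600 average allow rate of roughly:
tc_avg = allow_rate - 0.199
print(f"Implied TC average: {tc_avg:.1%}")     # ~62.0%

# Assumption: total career applications = resolved + currently pending.
total = resolved + 17
print(f"Total applications: {total}")          # 563, matching the panel
```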

Statute-Specific Performance

§101: 2.7% (-37.3% vs TC avg)
§103: 66.5% (+26.5% vs TC avg)
§102: 10.7% (-29.3% vs TC avg)
§112: 11.1% (-28.9% vs TC avg)

Deltas are measured against a Tech Center average estimate; based on career data from 546 resolved cases.
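Each statute row pairs the examiner's rate with a delta against the Tech Center average. Back-solving the implied TC average from each row is a useful consistency check on the dashboard data:

```python
# Back-solve the implied Tech Center average from each statute row
# above: implied TC avg = examiner rate - reported delta.

rows = {
    "101": (2.7, -37.3),
    "103": (66.5, +26.5),
    "102": (10.7, -29.3),
    "112": (11.1, -28.9),
}
for statute, (rate, delta) in rows.items():
    tc_avg = rate - delta
    print(f"§{statute}: implied TC avg = {tc_avg:.1f}%")
# Every row back-solves to 40.0%, i.e. all four deltas are taken
# against the same ~40% Tech Center baseline estimate.
```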

Office Action

§103 • §DP
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

In response to this Office action, the Examiner respectfully requests that support be shown for language added to any original claims on amendment and any new claims. That is, indicate support for newly added claim language by specifically pointing to page(s) and line number(s) in the specification and/or drawing figure(s). This will assist the Examiner in prosecuting this application.

INFORMATION DISCLOSURE STATEMENT

The information disclosure statements filed 11/13/2025 and 05/08/2025 have been acknowledged and considered by the examiner. Initialed copies of the PTO-1449 forms are included in this correspondence.

DOUBLE PATENTING

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission.
For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 2-3, 5-6, and 9-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 3, 5-12, and 14-20 of U.S. Patent No. 10860090 (hereinafter “A”), claims 9-20 of U.S. Patent No. 11181974 (hereinafter “B”), and claims 1-16 of U.S. Patent No. 11989339 (hereinafter “C”). Although the claims at issue are not identical, they are not patentably distinct from each other because it is clear that all the elements of application claims 2-3, 5-6, and 9-20 are to be found in claims 1, 3, 5-12, and 14-20 of A, claims 9-20 of B, and claims 1-16 of C. The difference between the application claims and the reference claims lies in the fact that the reference claims include many more elements and are thus much more specific. Thus the invention of claims 1, 3, 5-12, and 14-20 of A, claims 9-20 of B, and claims 1-16 of C is in effect a “species” of the “generic” invention of application claims 2-3, 5-6, and 9-20. It has been held that the generic invention is “anticipated” by the “species.” See In re Goodman, 29 USPQ2d 2010 (Fed. Cir. 1993). Since application claims 2-3, 5-6, and 9-20 are anticipated by the reference claims, they are not patentably distinct from those claims (refer to comparison table below).

Claims 2-3, 5-6, and 9-20 are also rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-17 of U.S. Patent No. 11625090. Although the claims at issue are not identical, they are not patentably distinct from each other because it is clear that all the elements of application claims 2-3, 5-6, and 9-20 are to be found in patent claims 1-17. The difference between the application claims and patent claims 1-17 lies in the fact that the patent claims include many more elements and are thus much more specific. Thus the invention of claims 1-17 of the patent is in effect a “species” of the “generic” invention of application claims 2-3, 5-6, and 9-20. It has been held that the generic invention is “anticipated” by the “species.” See In re Goodman, 29 USPQ2d 2010 (Fed. Cir. 1993). Since application claims 2-3, 5-6, and 9-20 are anticipated by claims 1-17 of the patent, they are not patentably distinct from claims 1-17 of the patent (refer to comparison table below).

Instant Application | U.S. Patent No. 10860090 B2

2. An augmented reality (AR) system comprising: a wearable device comprising: a display disposed inside the wearable device and operable to display virtual content; and wearable fiducials affixed to the wearable device; a peripheral device comprising an imaging device mounted to the peripheral device; and a computing apparatus configured to perform localization of the peripheral device with respect to the wearable device.

3. The AR system of claim 2, wherein the imaging device is operable to capture a fiducial image containing a number of the wearable fiducials.

5. The AR system of claim 2, wherein the peripheral device further comprises a peripheral sensor operable to capture peripheral data.

6. The AR system of claim 5, wherein the peripheral sensor comprises an inertial measurement unit (IMU).

9. The AR system of claim 2, wherein the imaging device is oriented in a direction toward the wearable device.

10.
An augmented reality (AR) system comprising: a wearable device comprising: a display disposed inside the wearable device and operable to display virtual content; and a wearable imaging device mounted to the wearable device; a peripheral device comprising: a peripheral imaging device mounted to the peripheral device; and a number of peripheral fiducials affixed to the peripheral device; and a computing apparatus configured to perform localization of the peripheral device with respect to the wearable device.

11. The AR system of claim 10, wherein the wearable imaging device is operable to capture a fiducial image containing at least one peripheral fiducial.

12. The AR system of claim 11, wherein performing localization of the peripheral device with respect to the wearable device is based at least in part on the fiducial image.

13. The AR system of claim 10, wherein the number of peripheral fiducials is equal to one.

14. The AR system of claim 10, wherein the number of peripheral fiducials is greater than one.

15. The AR system of claim 10, wherein the peripheral imaging device is operable to capture a world image containing one or more surrounding features.

16. The AR system of claim 15, wherein performing localization of the peripheral device with respect to the wearable device is based at least in part on the world image.

17. A method of performing localization of a peripheral device with respect to a wearable device, the method comprising: obtaining, by a wearable imaging device mounted on the wearable device, fiducial data indicative of movement of the peripheral device; obtaining, by a peripheral imaging device mounted on the peripheral device, a world image containing one or more surrounding features; and updating a position and an orientation of the peripheral device using the fiducial data and the world image.

18.
The method of claim 17, wherein obtaining the fiducial data includes capturing, by the wearable imaging device, a fiducial image containing a number of fiducials affixed to the peripheral device.

19. The method of claim 17, further comprising obtaining, by a sensor on the peripheral device, peripheral data; and updating the position and the orientation of the peripheral device based on the peripheral data.

20. The method of claim 17, further comprising: capturing, by a sensor on the wearable device, wearable device data indicative of movement of the wearable device; and updating the position and the orientation of the handheld peripheral device using the wearable device data.

2. An augmented reality (AR) system comprising: a wearable device comprising: a display disposed inside the wearable device and operable to display virtual content; and wearable fiducials affixed to the wearable device; a peripheral device comprising an imaging device mounted to the peripheral device; and a computing apparatus configured to perform localization of the peripheral device with respect to the wearable device.

3. The AR system of claim 2, wherein the imaging device is operable to capture a fiducial image containing a number of the wearable fiducials.

4. The AR system of claim 3, wherein the wearable fiducials comprise a number of light-emitting fiducials affixed to the wearable device, wherein the number of light-emitting fiducials is sufficient to determine a position and an orientation of the peripheral device in full six degrees of freedom based solely on the fiducial image.

5. The AR system of claim 2, wherein the peripheral device further comprises a peripheral sensor operable to capture peripheral data.

6. The AR system of claim 5, wherein the peripheral sensor comprises an inertial measurement unit (IMU).

7. The AR system of claim 2, wherein the computing apparatus is disposed in the wearable device or in the peripheral device.

8.
The AR system of claim 2, wherein the wearable device further comprises a wearable sensor mounted on the wearable device, wherein the wearable sensor is operable to capture wearable device data indicative of movement of the wearable device.

9. The AR system of claim 2, wherein the imaging device is oriented in a direction toward the wearable device.

17. A method of performing localization of a peripheral device with respect to a wearable device, the method comprising: obtaining, by a wearable imaging device mounted on the wearable device, fiducial data indicative of movement of the peripheral device; obtaining, by a peripheral imaging device mounted on the peripheral device, a world image containing one or more surrounding features; and updating a position and an orientation of the peripheral device using the fiducial data and the world image.

18. The method of claim 17, wherein obtaining the fiducial data includes capturing, by the wearable imaging device, a fiducial image containing a number of fiducials affixed to the peripheral device.

19. The method of claim 17, further comprising obtaining, by a sensor on the peripheral device, peripheral data; and updating the position and the orientation of the peripheral device based on the peripheral data.

20. The method of claim 17, further comprising: capturing, by a sensor on the wearable device, wearable device data indicative of movement of the wearable device; and updating the position and the orientation of the handheld peripheral device using the wearable device data.

21. The method of claim 17, further comprising: obtaining, by the peripheral imaging device, a second world image; and updating the position and the orientation of the peripheral device based at least in part on a visual odometry comparison between the world image and the second world image.

1. A method of performing localization of a handheld device with respect to a wearable device, the method comprising: obtaining, by at least one sensor mounted to the handheld device, handheld data indicative of movement of the handheld device, wherein obtaining the handheld data includes: detecting, by an inertial measurement unit (IMU) mounted to the handheld device, linear accelerations and rotational velocities of the handheld device; capturing, by a handheld camera mounted to the handheld device, a world image containing one or more features surrounding the handheld device; obtaining, by a wearable camera mounted to the wearable device, fiducial data indicative of movement of the handheld device, wherein obtaining the fiducial data includes: capturing, by the wearable camera, a fiducial image containing a number of fiducials of a plurality of fiducials affixed to the handheld device; and determining the number of fiducials in the fiducial image; in response to determining that the number of fiducials is equal to or greater than three, updating at least one of a position and an orientation of the handheld device based solely on the fiducial data in accordance with a first operating state; in response to determining that the number of fiducials is equal to one or two, updating at least one of the position and the orientation of the handheld device based on the fiducial data and the handheld data in accordance with a second operating state; and in response to determining that the number of fiducials is equal to zero, updating at least one of the position and the orientation of the handheld device based solely on the handheld data in accordance with a third operating state.

3.
A method of performing localization of a handheld device with respect to a wearable device, the method comprising: obtaining, by an inertial measurement unit (IMU) mounted to the handheld device, handheld data indicative of movement of the handheld device; obtaining, by an imaging device mounted to a first device, fiducial data indicative of movement of the handheld device, wherein the first device is either the handheld device or the wearable device, and wherein obtaining the fiducial data includes: capturing, by the imaging device, a fiducial image containing a number of fiducials affixed to a second device different than the first device, wherein the second device is either the handheld device or the wearable device; determining the number of fiducials contained in the fiducial image; and based on the number of fiducials contained in the fiducial image, updating a position and an orientation of the handheld device based on the fiducial data and the handheld data in accordance with a first operating state or a second operating state.

5. The method of claim 3, wherein the imaging device is mounted to the handheld device.

6. The method of claim 3, wherein the imaging device is mounted to the wearable device and a plurality of fiducials including the number of fiducials are affixed to the handheld device.

7. The method of claim 3, wherein the imaging device is mounted to the handheld device and one or more fiducials including the number of fiducials are affixed to the wearable device, and wherein obtaining the handheld data includes: capturing, by a second handheld imaging device mounted to the handheld device, a world image containing one or more features surrounding the handheld device.

8. The method of claim 3, wherein the imaging device is mounted to the wearable device and a plurality of fiducials including the number of fiducials are affixed to the handheld device, and wherein obtaining the handheld data includes: capturing, by a second handheld imaging device mounted to the handheld device, a world image containing one or more features surrounding the handheld device.

9. The method of claim 3, further comprising: in response to determining that the number of fiducials is equal to or greater than three, updating at least one of the position and the orientation of the handheld device based on the fiducial data in accordance with the first operating state; and in response to determining that the number of fiducials is equal to one or two, updating at least one of the position and the orientation of the handheld device based on the fiducial data and the handheld data in accordance with the second operating state.

10. The method of claim 9, further comprising: in response to determining that the number of fiducials is equal to zero, updating at least one of the position and the orientation of the handheld device based on the handheld data in accordance with a third operating state.

11. The method of claim 10, wherein: at least one of the position and the orientation of the handheld device is updated based solely on the fiducial data in accordance with the first operating state; and at least one of the position and the orientation of the handheld device is updated based solely on the handheld data in accordance with the third operating state.

12.
A system for performing localization of a handheld device with respect to a wearable device, the system comprising: the wearable device; the handheld device; and one or more processors communicatively coupled to the wearable device and the handheld device, wherein the one or more processors are configured to perform operations including: obtaining, by an inertial measurement unit (IMU) mounted to the handheld device, handheld data indicative of movement of the handheld device; obtaining, by an imaging device mounted to a first device, fiducial data indicative of movement of the handheld device, wherein the first device is either the handheld device or the wearable device, and wherein obtaining the fiducial data includes: capturing, by the imaging device, a fiducial image containing a number of fiducials affixed to a second device different than the first device, wherein the second device is either the handheld device or the wearable device; determining the number of fiducials contained in the fiducial image; and based on the number of fiducials contained in the fiducial image, updating at least one of a position and an orientation of the handheld device based on the fiducial data and the handheld data in accordance with a first operating state or a second operating state.

14. The system of claim 12, wherein the imaging device is mounted to the handheld device and the number of fiducials comprise a plurality of fiducials.

15. The system of claim 12, wherein the imaging device is mounted to the wearable device and the number of fiducials comprise a plurality of fiducials.

16. The system of claim 12, wherein the imaging device is mounted to the handheld device and a plurality of fiducials including the number of fiducials are affixed to the wearable device, and wherein obtaining the handheld data includes: capturing, by a second handheld imaging device mounted to the handheld device, a world image containing one or more features surrounding the handheld device.

17. The system of claim 12, wherein the imaging device is mounted to the wearable device and one or more fiducials including the number of fiducials is affixed to the handheld device, and wherein obtaining the handheld data includes: capturing, by a second handheld imaging device mounted to the handheld device, a world image containing one or more features surrounding the handheld device.

18. The system of claim 12, wherein the operations further comprise: in response to determining that the number of fiducials is equal to or greater than three, updating at least one of the position and the orientation of the handheld device based on the fiducial data in accordance with the first operating state; and in response to determining that the number of fiducials is equal to one or two, updating at least one of the position and the orientation of the handheld device based on the fiducial data and the handheld data in accordance with the second operating state.

19. The system of claim 18, wherein the operations further comprise: in response to determining that the number of fiducials is equal to zero, updating at least one of the position and the orientation of the handheld device based on the handheld data in accordance with a third operating state.

20. The system of claim 19, wherein: at least one of the position and the orientation of the handheld device is updated based solely on the fiducial data in accordance with the first operating state; and at least one of the position and the orientation of the handheld device is updated based solely on the handheld data in accordance with the third operating state.

U.S. Patent No. 11181974

9.
A method of performing localization of a handheld device with respect to a wearable device, the method comprising: obtaining handheld data from a sensor mounted on the handheld device; obtaining fiducial data from an imaging device mounted on either the wearable device or the handheld device, wherein the fiducial data includes a fiducial image containing a number of fiducials affixed to either the wearable device or the handheld device; determining the number of fiducials contained in the fiducial image; and determining whether the number of fiducials contained in the fiducial image is equal to one of a first set of values or to one of a second set of values different from the first set of values; in response to determining that the number of fiducials contained in the fiducial image is equal to one of the first set of values, updating a position and an orientation of the handheld device based on the fiducial data or the fiducial data and the handheld data in accordance with a first operating state; and in response to determining that the number of fiducials contained in the fiducial image is equal to one of the second set of values, updating the position and the orientation of the handheld device based on the fiducial data and the handheld data or the handheld data in accordance with a second operating state different from the first operating state.

10. The method of claim 9 wherein the first set of values includes integers equal to or greater than three.

11. The method of claim 10 wherein the position and orientation is updated based solely on the fiducial data.

12. The method of claim 9 wherein the second set of values consists of integers equal to one or two.

13. The method of claim 9 wherein the second set of values is equal to zero, the method further comprising in response to determining that the number of fiducials contained in the fiducial image is equal to zero, updating the position and the orientation of the handheld device based solely on the handheld data.

14. The method of claim 9 wherein the number of fiducials comprises a number of light-emitting diodes (LEDs).

15. The method of claim 9 wherein the imaging device is mounted on the handheld device and a plurality of fiducials including the number of fiducials are affixed to the wearable device.

16. The method of claim 15 wherein obtaining the handheld data includes capturing, by a second handheld imaging device mounted to the handheld device, a world image containing one or more features surrounding the handheld device.

17. The method of claim 9 wherein the imaging device is mounted on the wearable device and a plurality of fiducials including the number of fiducials are affixed to the handheld device.

18. The method of claim 17 wherein the sensor comprises an inertial measurement unit (IMU).

19. The method of claim 17 wherein obtaining the handheld data includes capturing, by a second handheld imaging device mounted to the handheld device, a world image containing one or more features surrounding the handheld device.

20. The method of claim 9 wherein the sensor comprises an inertial measurement unit (IMU).

US Patent No. 11989339

1.
A method of performing localization of a handheld device with respect to a wearable device, the method comprising: capturing, by a first imaging device mounted to the handheld device, a fiducial image containing a number of fiducials affixed to the wearable device; capturing, by a second imaging device mounted to the handheld device, a world image containing one or more features surrounding the handheld device; obtaining, by a sensor mounted to the handheld device, handheld data indicative of movement of the handheld device; determining the number of fiducials contained in the fiducial image; and updating a position and an orientation of the handheld device using at least one of the fiducial image or the world image and the handheld data.
2. The method of claim 1 wherein updating the position and the orientation of the handheld device is based at least in part on the number of fiducials contained in the fiducial image.
3. The method of claim 1 wherein the sensor comprises an inertial measurement unit (IMU).
4. The method of claim 1 wherein updating the position and orientation of the handheld device is in accordance with a first operating state or a second operating state.
5. The method of claim 4, wherein updating the position and the orientation of the handheld device is based at least in part on the number of fiducials contained in the fiducial image, the method further comprising: in response to determining that the number of fiducials is equal to or greater than three, updating at least one of the position and the orientation of the handheld device based on the fiducial image in accordance with the first operating state; and in response to determining that the number of fiducials is equal to one or two, updating at least one of the position and the orientation of the handheld device based on the fiducial image, the world image, and the handheld data in accordance with the second operating state.
6.
The method of claim 5, further comprising in response to determining that the number of fiducials is equal to zero, updating at least one of the position and the orientation of the handheld device based on the world image and the handheld data in accordance with a third operating state.
7. The method of claim 6 wherein: at least one of the position and the orientation of the handheld device is updated based solely on the fiducial image in accordance with the first operating state; and at least one of the position and the orientation of the handheld device is updated based solely on the handheld data in accordance with the third operating state.
8. The method of claim 1 wherein the fiducials comprise light-emitting diodes (LEDs).
9. A system for performing localization of a handheld device with respect to a wearable device, the system comprising: the wearable device; the handheld device; and one or more processors communicatively coupled to the wearable device and the handheld device, wherein the one or more processors are configured to perform operations including: capturing, by a first imaging device mounted to the handheld device, a fiducial image containing a number of fiducials affixed to the wearable device; capturing, by a second imaging device mounted to the handheld device, a world image containing one or more features surrounding the handheld device; obtaining, by a sensor mounted to the handheld device, handheld data indicative of movement of the handheld device; determining the number of fiducials contained in the fiducial image; and updating a position and an orientation of the handheld device using at least one of the fiducial image or the world image and the handheld data.
10. The system of claim 9, wherein updating the position and the orientation of the handheld device is based at least in part on the number of fiducials contained in the fiducial image.
11. The system of claim 9, wherein the sensor comprises an inertial measurement unit (IMU).
12.
The system of claim 9, wherein updating the position and orientation of the handheld device is in accordance with a first operating state or a second operating state.
13. The system of claim 12, wherein updating the position and the orientation of the handheld device is based at least in part on the number of fiducials contained in the fiducial image, and wherein the operations further comprise: in response to determining that the number of fiducials is equal to or greater than three, updating at least one of the position and the orientation of the handheld device based on the fiducial image in accordance with the first operating state; and in response to determining that the number of fiducials is equal to one or two, updating at least one of the position and the orientation of the handheld device based on the fiducial image, the world image, and the handheld data in accordance with the second operating state.
14. The system of claim 12, further comprising in response to determining that the number of fiducials is equal to zero, updating at least one of the position and the orientation of the handheld device based on the world image and the handheld data in accordance with a third operating state.
15. The system of claim 14, wherein at least one of the position and the orientation of the handheld device is updated based solely on the fiducial image in accordance with the first operating state; and at least one of the position and the orientation of the handheld device is updated based solely on the handheld data in accordance with the third operating state.
16. The system of claim 9, wherein the fiducials comprise light-emitting diodes (LEDs).
U.S. Patent No. 11625090
1.
A method of performing localization of a handheld device with respect to a wearable device, the method comprising: obtaining, by a wearable imaging device mounted on the wearable device, fiducial data indicative of movement of the handheld device, wherein obtaining the fiducial data includes capturing, by the wearable imaging device, a fiducial image containing a number of light-emitting fiducials affixed to the handheld device, wherein the number of light-emitting fiducials is sufficient to determine a position and an orientation of the handheld device in full six degrees of freedom based solely on the fiducial data; capturing, by a sensor on the handheld device, handheld data; capturing, by a sensor on the wearable device, wearable device data indicative of movement of the wearable device; and updating the position and the orientation of the handheld device using the wearable device data, the fiducial data, and the handheld data.
2. The method of claim 1 wherein the sensor on the handheld device comprises an inertial measurement unit (IMU).
3. The method of claim 1 wherein updating the position and the orientation of the handheld device is in accordance with a first operating state or a second operating state.
4. The method of claim 3, further comprising: determining the number of light-emitting fiducials contained in the fiducial image; in response to determining that the number of light-emitting fiducials is equal to or greater than three, updating at least one of the position and the orientation of the handheld device based on the fiducial data in accordance with the first operating state; and in response to determining that the number of light-emitting fiducials is equal to one or two, updating at least one of the position and the orientation of the handheld device based on the fiducial data and the handheld data in accordance with the second operating state.
5.
The method of claim 4, further comprising in response to determining that the number of light-emitting fiducials is equal to zero, updating at least one of the position and the orientation of the handheld device based on the handheld data in accordance with a third operating state.
6. The method of claim 5 wherein: at least one of the position and the orientation of the handheld device is updated based solely on the fiducial data in accordance with the first operating state; and at least one of the position and the orientation of the handheld device is updated based solely on the handheld data in accordance with the third operating state.
7. The method of claim 1 wherein the number of light-emitting fiducials comprises a number of light-emitting diodes (LEDs).
8. The method of claim 1 wherein: the sensor on the handheld device comprises a second imaging device; and obtaining the handheld data includes capturing, by the second imaging device, a world image containing one or more features surrounding the handheld device.
9.
An augmented reality (AR) system, the system comprising: a wearable device comprising an imaging device and a wearable sensor mounted on the wearable device, wherein the wearable sensor is operable to capture wearable device data indicative of movement of the wearable device; a handheld device comprising a handheld sensor and handheld light-emitting fiducials affixed to the handheld device, wherein: the imaging device is operable to capture a fiducial image containing a number of the handheld light-emitting fiducials, wherein the number of light-emitting fiducials is sufficient to determine a position and an orientation of the handheld device in full six degrees of freedom based solely on the fiducial data, wherein the handheld sensor is operable to capture handheld data; and a computing apparatus configured to perform localization of the handheld device with respect to the wearable device using the wearable device data, the fiducial image, and the handheld data.
10. The AR system of claim 9 wherein the wearable sensor comprises an inertial measurement unit (IMU).
11. The AR system of claim 9 wherein the handheld sensor comprises an inertial measurement unit (IMU).
12. The AR system of claim 9 wherein the handheld sensor comprises a second imaging device.
13. The AR system of claim 12, wherein the second imaging device is operable to capture a world image containing one or more surrounding features.
14. The AR system of claim 9, wherein the handheld device further comprises a second handheld sensor, wherein the handheld sensor comprises an inertial measurement unit (IMU) and the second handheld sensor comprises a second imaging device.
15. The AR system of claim 9, further comprising a belt pack, wherein the computing apparatus is disposed in the belt pack.
16. The AR system of claim 9, wherein the computing apparatus is disposed in the wearable device or the handheld device.
17.
The AR system of claim 9 wherein the handheld light-emitting fiducials comprise a number of light-emitting diodes (LEDs).
CLAIM REJECTIONS - 35 USC § 103
The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
1. Claims 2-3 and 5-9 are rejected under 35 U.S.C. 103 as being unpatentable over Mikhailov et al. (US 2015/0261291) in view of Bucknor et al. (US 2017/0307891).
Consider claim 2. Mikhailov discloses an augmented reality (AR) system ([0064]-[0065]: the HMD provides augmented reality), comprising: a wearable device (HMD, Fig. 1B, 102) comprising: a display disposed inside the wearable device and operable to display virtual content ([0035]: the HMD displays a virtual scene); and wearable fiducials affixed to the wearable device (Fig. 1B, [0060]: HMD 102 includes a plurality of lights 200A-H, J, and K; [0061]: the lights are identified and tracked); a peripheral device (see Figs. 3D-2 and 1: handheld controller 104); and configured to perform localization of the peripheral device and wearable device (Fig. 3A, [0069]: the HMD 102 and controller are tracked via markers (LED lights)). Mikhailov does not disclose an imaging device mounted to the peripheral device, or a computing apparatus configured to perform localization of the peripheral device with respect to the wearable device.
Bucknor, however, discloses an imaging device mounted to the peripheral device (Fig. 8 or 16A: handheld device 606, camera 124; [0093], [0129]) and a computing apparatus (local processing and data module 70, [0061]-[0062]; see also Figs. 2A-2D, local processing and data module 70) configured to perform localization of the peripheral device with respect to the wearable device (Figs. 8, 16A, 19, [0140]: a camera captures images of the fiducials, and when the orientation of the fiducials on the handheld device changes, the camera on the HMD can register changes in pose of the handheld device; [0093] (58, 70, 606): markers and cameras also may be utilized to provide further information regarding relative and absolute position and orientation).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the AR system of Mikhailov to include an imaging device mounted to the peripheral device and a computing apparatus configured to perform localization of the peripheral device with respect to the wearable device, as taught by Bucknor, to provide further information regarding relative and absolute position and orientation of the devices ([0093]) and to provide power-efficient processing to the AR system.
Consider claim 3. Mikhailov as modified by Bucknor discloses the AR system of claim 2, wherein the imaging device is operable to capture a fiducial image containing a number of the wearable fiducials (Mikhailov, Fig. 1B, [0060]: HMD 102 includes a plurality of lights 200A-H, J, and K; [0061]: the lights are identified and tracked; see also Bucknor, Fig. 8, 130, [0093]).
Consider claim 5. Mikhailov as modified by Bucknor discloses the AR system of claim 2, wherein the peripheral device further comprises a peripheral sensor operable to capture peripheral data (Bucknor, Figs. 8, 16A: IMU 102). The motivation to combine is similar to the motivation in claim 2.
Consider claim 6.
Mikhailov as modified by Bucknor discloses the AR system of claim 5, wherein the peripheral sensor comprises an inertial measurement unit (IMU) (Bucknor, Figs. 8, 16A: IMU 102). The motivation to combine is similar to the motivation in claim 2.
Consider claim 7. Mikhailov as modified by Bucknor discloses the AR system of claim 2, wherein the computing apparatus is disposed in the wearable device or in the peripheral device (Bucknor, [0082]: the computing apparatus may reside at the belt pack 70; in other embodiments, the computing apparatus may reside at the headset itself, or even the hand-held controller 606). The motivation to combine is similar to the motivation in claim 2.
Consider claim 8. Mikhailov as modified by Bucknor discloses the AR system of claim 2, wherein the wearable device further comprises a wearable sensor mounted on the wearable device, wherein the wearable sensor is operable to capture wearable device data indicative of movement of the wearable device (Mikhailov, Fig. 5A: accelerometer, gyroscope; Bucknor, Figs. 8, 16A: IMU 102 in device 58). The motivation to combine is similar to the motivation in claim 2.
Consider claim 9. Mikhailov as modified by Bucknor discloses the AR system of claim 2, wherein the imaging device is oriented in a direction toward the wearable device (Mikhailov, Fig. 5A: camera 108; Bucknor, Figs. 8, 16A, [0093] (58, 70, 606): markers and cameras also may be utilized to provide further information regarding relative and absolute position and orientation). The motivation to combine is similar to the motivation in claim 2.
2. Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Mikhailov et al. (US 2015/0261291) in view of Bucknor et al. (US 2017/0307891), and further in view of Chen (US 2017/0011553).
Consider claim 4. Mikhailov as modified by Bucknor discloses the AR system of claim 3, wherein the wearable fiducials comprise a number of light-emitting fiducials affixed to the wearable device (Mikhailov, Fig. 1B, [0060]: HMD 102 includes a plurality of lights 200A-H, J, and K). Mikhailov as modified by Bucknor does not disclose wherein the number of light-emitting fiducials is sufficient to determine a position and an orientation of the peripheral device in full six degrees of freedom based solely on the fiducial image. Chen, however, discloses this limitation: Chen teaches wearable fiducials comprising a number of light-emitting (IR) fiducials (115A-115C) affixed to the wearable device (100) (see Figs. 1, 3D, 3E; [0023], [0027], [0028]), wherein the number of light-emitting fiducials is sufficient to determine a position and an orientation of the peripheral device in full six degrees of freedom based solely on the fiducial image (see Figs. 1, 3D, 3E; [0031], [0038]). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the AR system of Mikhailov as modified by Bucknor to include wherein the number of light-emitting fiducials is sufficient to determine a position and an orientation of the peripheral device in full six degrees of freedom based solely on the fiducial image, as taught by Chen, to avoid inertial measurement unit drift, delivering stable, absolute orientation and position directly, and to improve virtual content alignment with the real world.
3. Claims 10-21 are rejected under 35 U.S.C. 103 as being unpatentable over Lohse et al. (US 10,152,141) in view of Bucknor (US 2017/0307891).
Consider claim 10. Lohse discloses an augmented reality (AR) system (col. 3, lines 19-21: augmented reality), comprising: a wearable device (see col. 32, lines 44-47: head-worn computer) comprising: a display inside the wearable device and operable to display virtual content (col. 3, lines 23-29, and col. 13, line 3: augmented reality and assisted reality display); and a wearable imaging device mounted to the wearable device (see col. 32, lines 44-47: using a camera on the head-worn computer); a peripheral device (see Fig. 33 and col. 33, lines 40-45: handheld controller) comprising: a number of peripheral fiducials affixed to the peripheral device (see col. 32, lines 44-47: using a camera on the head-worn computer to monitor one or more light emitters mounted at known positions on the hand-held controllers); and a computing apparatus configured to perform localization of the peripheral device with respect to the wearable device (see col. 32, lines 44-47: tracking the position of the hand-held controller using a camera on the head-worn computer). Lohse does not explicitly disclose a peripheral imaging device mounted to the peripheral device, or a computing apparatus.
Bucknor, however, discloses a peripheral imaging device mounted to the peripheral device (Fig. 8 or 16A: handheld device 606, camera 124; [0093], [0129]) and a computing apparatus (local processing and data module 70, [0061]-[0062]; see also Figs. 2A-2D, local processing and data module 70). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the AR system of Lohse to include a peripheral imaging device mounted to the peripheral device and a computing apparatus, as taught by Bucknor, to provide further information regarding relative and absolute position and orientation of the devices ([0093]) and to provide power-efficient processing to the AR system.
Consider claim 11. Lohse as modified by Bucknor discloses the AR system of claim 10, wherein the wearable imaging device is operable to capture a fiducial image containing at least one peripheral fiducial (Lohse, Figs. 33-35, col. 32, lines 44-47: using a camera on the head-worn computer to monitor one or more light emitters mounted at known positions on the hand-held controllers).
Consider claim 12. Lohse as modified by Bucknor discloses the AR system of claim 11, wherein performing localization of the peripheral device with respect to the wearable device is based at least in part on the fiducial image (see Lohse, col. 32, lines 44-47: tracking the position of the hand-held controller using a camera on the head-worn computer).
Consider claim 13. Lohse as modified by Bucknor discloses the AR system of claim 10, wherein the number of peripheral fiducials is equal to one (Lohse, Figs. 33-35, col. 32, lines 44-47: using a camera on the head-worn computer to monitor one or more light emitters mounted at known positions on the hand-held controllers).
Consider claim 14. Lohse as modified by Bucknor discloses the AR system of claim 10, wherein the number of peripheral fiducials is greater than one (Lohse, Figs. 33-35, col. 32, lines 44-47: using a camera on the head-worn computer to monitor one or more light emitters mounted at known positions on the hand-held controllers).
Consider claim 15. Lohse as modified by Bucknor discloses the AR system of claim 10, wherein the peripheral imaging device is operable to capture a world image containing one or more surrounding features (Bucknor, Figs. 8 and 16A: handheld device 606 with world-capturing camera 124; [0129], [0093]: the various camera components (124) may be utilized to capture data which may be utilized in simultaneous localization and mapping protocols, or "SLAM", to determine where the component is and how it is oriented relative to other components; note: although the example is given with respect to 58, this applies to cameras 124 located on computing device 70 and controller 606). The motivation to combine is similar to the motivation in claim 10.
Consider claim 16.
Lohse as modified by Bucknor discloses the AR system of claim 15, wherein performing localization of the peripheral device with respect to the wearable device is based at least in part on the world image (Bucknor, Figs. 8 and 16A: handheld device 606 with world-capturing camera 124; [0129], [0093]: the various camera components (124) may be utilized to capture data which may be utilized in simultaneous localization and mapping protocols, or "SLAM", to determine where the component is and how it is oriented relative to other components; note: although the example is given with respect to 58, this applies to cameras 124 located on computing device 70 and controller 606). The motivation to combine is similar to the motivation in claim 10.
Claim 17 is rejected for similar reasons, mutatis mutandis, to claims 10 and 15. Claim 18 is rejected for similar reasons, mutatis mutandis, to claim 11.
Consider claim 19. Lohse as modified by Bucknor discloses the method of claim 17, further comprising obtaining, by a sensor on the peripheral device, peripheral data; and updating the position and the orientation of the peripheral device based on the peripheral data (see Lohse, col. 33, lines 40-45: the hand-held controller may include an IMU to monitor a first form of movement, e.g., rotational angular movements; col. 32, lines 27-30: external user interface 104, Fig. 33 (hand-held controller), may include an IMU or other movement detection system to monitor movements of the external user interface).
Consider claim 20. Lohse as modified by Bucknor discloses the method of claim 17, further comprising: capturing, by a sensor on the wearable device, wearable device data indicative of movement of the wearable device; and updating the position and the orientation of the peripheral device using the wearable device data (Bucknor, Fig. 6, [0082], [0085]: IMU and coils 608 on the headset device).
Consider claim 21.
Lohse as modified by Bucknor discloses the method of claim 17, further comprising: obtaining, by the peripheral imaging device, a second world image; and updating the position and the orientation of the peripheral device based at least in part on a visual odometry comparison between the world image and the second world image (Bucknor, Figs. 8 and 16A: handheld device 606 with world-capturing camera 124; [0129], [0093]: the various camera components (124) may be utilized to capture data which may be utilized in simultaneous localization and mapping protocols, or "SLAM", to determine where the component is and how it is oriented relative to other components; note: although the example is given with respect to 58, this applies to cameras 124 located on computing device 70 and controller 606).
CONCLUSION
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Miller et al. (US 2019/0187779) discloses an AR environment with an HMD and handheld controller, each with their respective cameras, that are tracked in space.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to IBRAHIM A KHAN, whose telephone number is (571) 270-7998. The examiner can normally be reached 10am-6pm. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, LunYi Lao, can be reached at (571) 272-7671. The fax number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov.
Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
IBRAHIM A. KHAN
Primary Examiner, Art Unit 2621
/IBRAHIM A KHAN/
01/09/2026
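The reference claims above recite the same three-way operating-state logic throughout: with three or more fiducials visible, pose is updated from the fiducial image alone; with one or two, fiducial data is fused with world-image and handheld (IMU) data; with zero, the device falls back to handheld data only. A minimal sketch of that selection logic, with hypothetical names not drawn from the record, might look like:

```python
# Hypothetical sketch of the claimed operating-state selection.
# State names and function names are illustrative, not from the patents.
from enum import Enum

class OperatingState(Enum):
    FIDUCIAL_ONLY = 1    # >= 3 fiducials: pose solved solely from the fiducial image
    FUSED = 2            # 1-2 fiducials: fuse fiducial image, world image, and IMU data
    DEAD_RECKONING = 3   # 0 fiducials: handheld (IMU/world) data only

def select_operating_state(num_fiducials: int) -> OperatingState:
    """Map the number of fiducials detected in one frame to a tracking state."""
    if num_fiducials < 0:
        raise ValueError("fiducial count cannot be negative")
    if num_fiducials >= 3:
        return OperatingState.FIDUCIAL_ONLY
    if num_fiducials >= 1:
        return OperatingState.FUSED
    return OperatingState.DEAD_RECKONING
```

Three non-collinear fiducials are the minimum for a unique six-degree-of-freedom pose from a single image, which is why the first set of values in the claims begins at three.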

Prosecution Timeline

Jan 17, 2025
Application Filed
May 07, 2025
Response after Non-Final Action
Jan 09, 2026
Non-Final Rejection — §103, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602017
WRISTWATCH AND WRISTWATCH-TYPE DISPLAY DEVICE
2y 5m to grant Granted Apr 14, 2026
Patent 12603067
Displaying Image Data based on Ambient Light
2y 5m to grant Granted Apr 14, 2026
Patent 12573152
OVERLAY TECHNOLOGY FOR ENHANCING CONNECTIVITY AND REALISM IN INTERACTING SIMULATIONS
2y 5m to grant Granted Mar 10, 2026
Patent 12572211
VIRTUAL REALITY INTERACTION
2y 5m to grant Granted Mar 10, 2026
Patent 12557706
PIXEL PACKAGE AND MANUFACTURING METHOD THEREOF
2y 5m to grant Granted Feb 17, 2026
Based on the 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
82%
Grant Probability
94%
With Interview (+12.0%)
2y 2m
Median Time to Grant
Low
PTA Risk
Based on 546 resolved cases by this examiner. Grant probability derived from career allow rate.
