Prosecution Insights
Last updated: April 19, 2026
Application No. 18/787,935

VIRTUAL RETICLE FOR AUGMENTED REALITY SYSTEMS

Non-Final OA: §102, §103, Double Patenting
Filed: Jul 29, 2024
Examiner: WANG, YUEHAN
Art Unit: 2617
Tech Center: 2600 (Communications)
Assignee: Magic Leap Inc.
OA Round: 1 (Non-Final)
Grant Probability: 83% (Favorable)
OA Rounds: 1-2
To Grant: 2y 7m
With Interview: 96%

Examiner Intelligence

Career Allow Rate: 83%, above average (404 granted / 485 resolved; +21.3% vs TC avg)
Interview Lift: +12.9% (moderate), allowance lift among resolved cases with an interview vs. without
Avg Prosecution: 2y 7m (47 applications currently pending)
Total Applications: 532, across all art units
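The headline figures above follow directly from the raw counts. A minimal sanity check (the Tech Center average is not stated directly; treating the +21.3% delta as a simple percentage-point difference is an assumption about how the dashboard derives it):

```python
# Sanity check of the examiner statistics above.
granted, resolved = 404, 485

allow_rate = granted / resolved
print(f"career allow rate: {allow_rate:.1%}")   # 83.3%, displayed as 83%

# Assumption: the +21.3% "vs TC avg" delta is a percentage-point
# difference, which implies a Tech Center average of about 62%.
tc_avg = allow_rate - 0.213
print(f"implied TC average: {tc_avg:.1%}")      # 62.0%
```
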

Statute-Specific Performance

§101: 4.3% (-35.7% vs TC avg)
§103: 69.6% (+29.6% vs TC avg)
§102: 8.3% (-31.7% vs TC avg)
§112: 6.6% (-33.4% vs TC avg)

Based on career data from 485 resolved cases (chart baseline: Tech Center average estimate).

Office Action

Rejections: §102, §103, Double Patenting
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.

Claims 1-21 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-21 of U.S. Patent No. 12,100,080 B2 (Reference Patent). Although the claims at issue are not identical, they are not patentably distinct from each other because the claims recite essentially the same structure and perform essentially the same function. In addition, the Reference Patent's claims comprise additional limitations of narrower scope that fully anticipate the claims of the instant application, which are therefore unpatentable for obviousness-type double patenting.

The conflicting claim pairs map one-to-one: instant application claims 1-21 correspond to Reference Patent claims 1-21, respectively. Claims of the instant application are compared to claims of the Reference Patent in the following tables. Instant Application Reference Patent 1.
A system comprising: a sensor configured to obtain an orientation of a head and/or eyes of a user of the system; non-transitory memory configured to store orientation data; a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes; a hardware processor in communication with the sensor, the display, and the non-transitory memory, the hardware processor programmed to: recognize an orientation of the user's head and/or eyes; determine whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses; and at least partly in response to determining that the orientation of the user's head and/or eyes are outside of the range of acceptable head and/or eye poses, shifting movement of the virtual reticle away from a default position and toward a position in a direction of the user's head and/or eye movement. 1. 
A system comprising: a sensor configured to obtain an orientation of a head and/or eyes of a user of the system; non-transitory memory configured to store orientation data; a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes; a hardware processor in communication with the sensor, the display, and the non-transitory memory, the hardware processor programmed to: recognize an orientation of the user's head and/or eyes; determine whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses; and at least partly in response to determining that the orientation of the user's head and/or eyes are outside of the range of acceptable head and/or eye poses, reduce a degree to which the user needs to reorient the user's neck to align the virtual reticle and a target object by: accelerating movement of the virtual reticle away from a default position and toward a position in a direction of the user's head and/or eye movement. Instant Application Reference Patent 2. The system of claim 1, wherein determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses comprises determining whether the acceptable head and/or eye poses satisfies a first orientation threshold. 2. The system of claim 1, wherein determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses further comprises determining whether the acceptable head and/or eye poses satisfies a first orientation threshold. Instant Application Reference Patent 3. 
The system of claim 1, wherein while the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an offset and pose data. 3. The system of claim 1, wherein while the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an offset and pose data. Instant Application Reference Patent 4. The system of claim 1, wherein when the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an easing function and pose data. 4. The system of claim 1, wherein when the orientation of the user's head is outside of a range of acceptable head poses, a location of the virtual reticle is based at least in part on an easing function and pose data. Instant Application Reference Patent 5. The system of claim 1, wherein while the orientation of the user's head and/or eyes are outside of a far end limit of the range of acceptable head and/or eye poses, the hardware processor is programmed to cause the display to maintain the virtual reticle at a fixed offset position within the FOV that is offset from the default position. 5. The system of claim 1, wherein while the orientation of the user's head and/or eyes are outside of a far end limit of the range of acceptable head and/or eye poses, the hardware processor is programmed to cause the display to maintain the virtual reticle at a fixed offset position within the FOV that is offset from the default position. Instant Application Reference Patent 6. The system of claim 1, wherein the virtual reticle comprises a movable indicator identifying a position within the FOV. 6. The system of claim 1, wherein the virtual reticle comprises a movable indicator identifying a position within in the FOV. 
Instant Application Reference Patent 7. The system of claim 1, wherein the head pose is determined in an axial plane of the user, a coronal plane of the user, and a sagittal plane of the user. 7. The system of claim 1, wherein the head pose is determined in an axial plane of the user, a coronal plane of the user, and a sagittal plane of the user. Instant Application Reference Patent 8. A system comprising: a sensor configured to obtain an orientation of a head and/or eyes of a user of the system; non-transitory memory configured to store orientation data; a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes; a hardware processor in communication with the sensor, the display, and the non-transitory memory, the hardware processor programmed to: recognize an orientation of the user's head and/or eyes; determine whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses; at least partly in response to determining that the orientation of the user's head and/or eyes are outside of the range of acceptable head and/or eye poses, shifting the virtual reticle away from a default position and toward a position in a direction of the user's head and/or eye movement; and at least partly in response to determining that the orientation of the user's head and/or eyes are within the range of acceptable head and/or eye poses causing the display to render a virtual reticle at a fixed location within the FOV. 8. 
A system comprising: a sensor configured to obtain an orientation of a head and/or eyes of a user of the system; non-transitory memory configured to store orientation data; a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes; a hardware processor in communication with the sensor, the display, and the non-transitory memory, the hardware processor programmed to: recognize an orientation of the user's head and/or eyes; determine whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses; at least partly in response to determining that the orientation of the user's head and/or eyes are outside of the range of acceptable head and/or eye poses, reduce a degree to which the user needs to reorient the user's neck to align the virtual reticle and a target object by: causing the virtual reticle to move away from a default position and toward a position in a direction of the user's head and/or eye movement; and at least partly in response to determining that the orientation of the user's head and/or eyes are within the range of acceptable head and/or eye poses causing the display to render a virtual reticle at a fixed location within the FOV. Instant Application Reference Patent 9. The system of claim 8, wherein determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses comprises determining whether the acceptable head and/or eye poses satisfies a first orientation threshold. 9. The system of claim 8, wherein determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses further comprises determining whether the acceptable head and/or eye poses satisfies a first orientation threshold.
Instant Application Reference Patent 10. The system of claim 8, wherein while the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an offset and pose data. 10. The system of claim 8, wherein while the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an offset and pose data. Instant Application Reference Patent 11. The system of claim 8, wherein when the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an easing function and pose data. 11. The system of claim 8, wherein when the orientation of the user's head is outside of a range of acceptable head poses, a location of the virtual reticle is based at least in part on an easing function and pose data. Instant Application Reference Patent 12. The system of claim 8, wherein while the orientation of the user's head and/or eyes are outside of a far end limit of the range of acceptable head and/or eye poses, the hardware processor is programmed to cause the display to maintain the virtual reticle at a fixed offset position within the FOV that is offset from the default position. 12. The system of claim 8, wherein while the orientation of the user's head and/or eyes are outside of a far end limit of the range of acceptable head and/or eye poses, the hardware processor is programmed to cause the display to maintain the virtual reticle at a fixed offset position within the FOV that is offset from the default position. Instant Application Reference Patent 13. The system of claim 8, wherein the virtual reticle comprises a movable indicator identifying a position within the FOV. 13. 
The system of claim 8, wherein the virtual reticle comprises a movable indicator identifying a position within in the FOV. Instant Application Reference Patent 14. The system of claim 8, wherein the head pose is determined in an axial plane of the user, a coronal plane of the user, and a sagittal plane of the user. 14. The system of claim 8, wherein the head pose is determined in an axial plane of the user, a coronal plane of the user, and a sagittal plane of the user. Instant Application Reference Patent 15. A method of adjusting a position of a virtual reticle identifying a position within a field of view (FOV) corresponding to a display of a display system, the method comprising: recognizing an orientation of a user's head and/or eyes based at least on data from one or more sensors; determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses; at least partly in response to determining that the orientation of the user's head and/or eyes are outside of the range of acceptable head and/or eye poses, shifting movement of the virtual reticle away from a default position and toward a position in a direction of the user's head and/or eye movement; and at least partly in response to determining that the orientation of the user's head and/or eyes are within the range of acceptable head and/or eye poses causing the display to render a virtual reticle at a fixed location within the FOV. 15.
A method of adjusting a position of a virtual reticle identifying a position within a field of view (FOV) corresponding to a display of a display system, the method comprising: recognizing an orientation of a user's head and/or eyes based at least on data from one or more sensors; determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses; at least partly in response to determining that the orientation of the user's head and/or eyes are outside of the range of acceptable head and/or eye poses, reducing a degree to which the user needs to reorient the user's neck to align the virtual reticle and a target object by: accelerating movement of the virtual reticle away from a default position and toward a position in a direction of the user's head and/or eye movement; and at least partly in response to determining that the orientation of the user's head and/or eyes are within the range of acceptable head and/or eye poses causing the display to render a virtual reticle at a fixed location within the FOV. Instant Application Reference Patent 16. The method of claim 15, wherein determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses comprises determining whether the acceptable head and/or eye poses satisfies a first orientation threshold. 16. The method of claim 15, wherein determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses further comprises determining whether the acceptable head and/or eye poses satisfies a first orientation threshold. Instant Application Reference Patent 17. The method of claim 15, wherein while the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an offset and pose data. 17.
The method of claim 15, wherein while the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an offset and pose data. Instant Application Reference Patent 18. The method of claim 15, wherein when the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an easing function and pose data. 18. The method of claim 15, wherein when the orientation of the user's head is outside of a range of acceptable head poses, a location of the virtual reticle is based at least in part on an easing function and pose data. Instant Application Reference Patent 19. The method of claim 15, wherein while the orientation of the user's head and/or eyes are outside of a far end limit of the range of acceptable head and/or eye poses, the method further comprises causing the display to maintain the virtual reticle at a fixed offset position within the FOV that is offset from the default position. 19. The method of claim 15, wherein while the orientation of the user's head and/or eyes are outside of a far end limit of the range of acceptable head and/or eye poses, the method further comprises causing the display to maintain the virtual reticle at a fixed offset position within the FOV that is offset from the default position. Instant Application Reference Patent 20. The method of claim 15, wherein the virtual reticle comprises a movable indicator identifying a position within the FOV. 20. The method of claim 15, wherein the virtual reticle comprises a movable indicator identifying a position within in the FOV. Instant Application Reference Patent 21. The method of claim 15, wherein the head pose is determined in an axial plane of the user, a coronal plane of the user, and a sagittal plane of the user. 21.
The method of claim 15, wherein the head pose is determined in an axial plane of the user, a coronal plane of the user, and a sagittal plane of the user.

Claims 1-4, 6-11, 13-18, 20, and 21 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-4, 6, 7, 9, 10, 12, 13, 15, and 16 of U.S. Patent No. 11,367,230 B2 (Reference Patent). Although the claims at issue are not identical, they are not patentably distinct from each other because the claims recite essentially the same structure and perform essentially the same function. In addition, the Reference Patent's claims comprise additional limitations of narrower scope that fully anticipate the claims of the instant application, which are therefore unpatentable for obviousness-type double patenting.

The conflicting claim pairs are:

Instant Appl.:     1     | 2 | 3 | 4 | 6 | 7 | 8     | 9  | 10 | 11 | 13 | 14 | 15      | 16 | 17 | 18 | 20 | 21
Reference Patent:  1 & 4 | 4 | 2 | 1 | 3 | 4 | 6 & 4 | 10 | 7  | 6  | 9  | 10 | 12 & 16 | 16 | 13 | 12 | 15 | 16

Claims of the instant application are compared to claims of the Reference Patent in the following tables. Instant Application Reference Patent 1.
A system comprising: a sensor configured to obtain an orientation of a head and/or eyes of a user of the system; non-transitory memory configured to store orientation data; a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes; a hardware processor in communication with the sensor, the display, and the non-transitory memory, the hardware processor programmed to: recognize an orientation of the user's head and/or eyes; determine whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses; and at least partly in response to determining that the orientation of the user's head and/or eyes are outside of the range of acceptable head and/or eye poses, shifting movement of the virtual reticle away from a default position and toward a position in a direction of the user's head and/or eye movement. 1. 
A system comprising: a head pose sensor configured to obtain head pose data of a user of the system; non-transitory memory configured to store the head pose data; a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes; and a hardware processor in communication with the head pose sensor, the display, and the non-transitory memory, the hardware processor programmed to: identify a head pose of the user based at least in part on the head pose data; and cause the display to render the virtual reticle at a location within the FOV based at least in part on the head pose, wherein while the head pose satisfies a first head pose threshold and does not satisfy a second head pose threshold, the location of the virtual reticle within the FOV is based at least in part on an easing function and the head pose changes as the head pose changes, and wherein while the head pose does not satisfy the first head pose threshold, the FOV changes as the head pose changes and the location of the virtual reticle is fixed within the FOV. 4. The system of claim 1, wherein at least one of the first head pose threshold or the second head pose threshold corresponds to at least one of −50 degrees to 50 degrees relative to an axial plane of the user, −20 degrees to 20 degrees relative to a coronal plane of the user, or −45 degrees to 5 degrees relative to a sagittal plane of the user. Instant Application Reference Patent 2. The system of claim 1, wherein determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses comprises determining whether the acceptable head and/or eye poses satisfies a first orientation threshold. 4. 
The system of claim 1, wherein at least one of the first head pose threshold or the second head pose threshold corresponds to at least one of −50 degrees to 50 degrees relative to an axial plane of the user, −20 degrees to 20 degrees relative to a coronal plane of the user, or −45 degrees to 5 degrees relative to a sagittal plane of the user. Instant Application Reference Patent 3. The system of claim 1, wherein while the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an offset and pose data. 2. The system of claim 1, wherein while the head pose satisfies the first head pose threshold and does not satisfy a second head pose threshold, the location of the virtual reticle is based at least in part on an offset and the head pose. Instant Application Reference Patent 4. The system of claim 1, wherein when the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an easing function and pose data. 1. 
A system comprising: a head pose sensor configured to obtain head pose data of a user of the system; non-transitory memory configured to store the head pose data; a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes; and a hardware processor in communication with the head pose sensor, the display, and the non-transitory memory, the hardware processor programmed to: identify a head pose of the user based at least in part on the head pose data; and cause the display to render the virtual reticle at a location within the FOV based at least in part on the head pose, wherein while the head pose satisfies a first head pose threshold and does not satisfy a second head pose threshold, the location of the virtual reticle within the FOV is based at least in part on an easing function and the head pose changes as the head pose changes, and wherein while the head pose does not satisfy the first head pose threshold, the FOV changes as the head pose changes and the location of the virtual reticle is fixed within the FOV. Instant Application Reference Patent 6. The system of claim 1, wherein the virtual reticle comprises a movable indicator identifying a position within the FOV. 3. The system of claim 1, wherein the virtual reticle comprises a movable indicator identifying a position within in the FOV. Instant Application Reference Patent 7. The system of claim 1, wherein the head pose is determined in an axial plane of the user, a coronal plane of the user, and a sagittal plane of the user. 4. 
The system of claim 1, wherein at least one of the first head pose threshold or the second head pose threshold corresponds to at least one of −50 degrees to 50 degrees relative to an axial plane of the user, −20 degrees to 20 degrees relative to a coronal plane of the user, or −45 degrees to 5 degrees relative to a sagittal plane of the user. Instant Application Reference Patent 8. A system comprising: a sensor configured to obtain an orientation of a head and/or eyes of a user of the system; non-transitory memory configured to store orientation data; a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes; a hardware processor in communication with the sensor, the display, and the non-transitory memory, the hardware processor programmed to: recognize an orientation of the user's head and/or eyes; determine whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses; at least partly in response to determining that the orientation of the user's head and/or eyes are outside of the range of acceptable head and/or eye poses, shifting the virtual reticle away from a default position and toward a position in a direction of the user's head and/or eye movement; and at least partly in response to determining that the orientation of the user's head and/or eyes are within the range of acceptable head and/or eye poses causing the display to render a virtual reticle at a fixed location within the FOV. 6. 
A system comprising: a head pose sensor configured to obtain head pose data of a user of the system; non-transitory memory configured to store the head pose data; a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes; and a hardware processor in communication with the head pose sensor, the display, and the non-transitory memory, the hardware processor programmed to: identify a head pose of the user based at least in part on the head pose data; based at least in part on a determination that the head pose does not satisfy a first head pose threshold, cause the display to render a virtual reticle at a fixed location within the FOV, wherein while the head pose does not satisfy the first head pose threshold, the FOV changes as the head pose changes; and based at least in part on a determination that the head pose satisfies the first head pose threshold and does not satisfy a second head pose threshold, cause the display to render the virtual reticle at a location within the FOV that varies based at least in part on an easing function and the head pose of the user. 4. The system of claim 1, wherein at least one of the first head pose threshold or the second head pose threshold corresponds to at least one of −50 degrees to 50 degrees relative to an axial plane of the user, −20 degrees to 20 degrees relative to a coronal plane of the user, or −45 degrees to 5 degrees relative to a sagittal plane of the user. Instant Application Reference Patent 9. The system of claim 8, wherein determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses comprises determining whether the acceptable head and/or eye poses satisfies a first orientation threshold. 10.
The system of claim 6, wherein the first head pose threshold corresponds to at least one of −50 degrees to 50 degrees relative to an axial plane of the user, −20 degrees to 20 degrees relative to a coronal plane of the user, or −45 degrees to 5 degrees relative to a sagittal plane of the user. Instant Application Reference Patent 10. The system of claim 8, wherein while the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an offset and pose data. 7. The system of claim 6, wherein while the head pose satisfies the first head pose and does not satisfy the second head pose threshold, the location of the virtual reticle is based at least in part on an offset and the head pose. Instant Application Reference Patent 11. The system of claim 8, wherein when the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an easing function and pose data. 6. 
A system comprising: a head pose sensor configured to obtain head pose data of a user of the system; non-transitory memory configured to store the head pose data; a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes; and a hardware processor in communication with the head pose sensor, the display, and the non-transitory memory, the hardware processor programmed to: identify a head pose of the user based at least in part on the head pose data; based at least in part on a determination that the head pose does not satisfy a first head pose threshold, cause the display to render a virtual reticle at a fixed location within the FOV, wherein while the head pose does not satisfy the first head pose threshold, the FOV changes as the head pose changes; and based at least in part on a determination that the head pose satisfies the first head pose threshold and does not satisfy a second head pose threshold, cause the display to render the virtual reticle at a location within the FOV that varies based at least in part on an easing function and the head pose of the user. Instant Application Reference Patent 13. The system of claim 8, wherein the virtual reticle comprises a movable indicator identifying a position within the FOV. 9. The system of claim 6, wherein the virtual reticle comprises a movable indicator identifying a position within in the FOV. Instant Application Reference Patent 14. The system of claim 8, wherein the head pose is determined in an axial plane of the user, a coronal plane of the user, and a sagittal plane of the user. 10. 
The system of claim 6, wherein the first head pose threshold corresponds to at least one of −50 degrees to 50 degrees relative to an axial plane of the user, −20 degrees to 20 degrees relative to a coronal plane of the user, or −45 degrees to 5 degrees relative to a sagittal plane of the user. Instant Application Reference Patent 15. A method of adjusting a position of a virtual reticle identifying a position within a field of view (FOV) corresponding to a display of a display system, the method comprising: recognizing an orientation of a user's head and/or eyes based at least on data from one or more sensors; determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses; at least partly in response to determining that the orientation of the user's head and/or eyes are outside of the range of acceptable head and/or eye poses, shifting movement of the virtual reticle away from a default position and toward a position in a direction of the user's head and/or eye movement; and at least partly in response to determining, that the orientation of the user's head and/or eyes are within the range of acceptable head and/or eye poses causing the display to render a virtual reticle at a fixed location within the FOV. 12. 
A method of adjusting a position of a virtual reticle identifying a position within a field of view (FOV) corresponding to a display of a display system, the method comprising: identify a head pose of a user based at least in part on head pose data obtained from a head pose sensor; based at least in part on a determination that the head pose does not satisfy a first head pose threshold, cause the display to render a virtual reticle at a fixed location within the FOV, wherein while the head pose does not satisfy the first head pose threshold, the FOV changes as the head pose changes; and based at least in part on a determination that the head pose satisfies the first head pose threshold and does not satisfy a second head pose threshold, cause the display to render the virtual reticle at a location within the FOV that varies based at least in part on an easing function and the head pose of the user. 16. The method of claim 12, wherein the first head pose threshold corresponds to at least one of −50 degrees to 50 degrees relative to an axial plane of the user, −20 degrees to 20 degrees relative to a coronal plane of the user, or −45 degrees to 5 degrees relative to a sagittal plane of the user. Instant Application Reference Patent 16. The method of claim 15, wherein determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses comprises determining whether the acceptable head and/or eye poses satisfies a first orientation threshold. 16. The method of claim 12, wherein the first head pose threshold corresponds to at least one of −50 degrees to 50 degrees relative to an axial plane of the user, −20 degrees to 20 degrees relative to a coronal plane of the user, or −45 degrees to 5 degrees relative to a sagittal plane of the user. Instant Application Reference Patent 17. 
The method of claim 15, wherein while the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an offset and pose data. 13. The method of claim 12, wherein while the head pose satisfies the first head pose and does not satisfy the second head pose threshold, the location of the virtual reticle is based at least in part on an offset and the head pose. Instant Application Reference Patent 18. The method of claim 15, wherein when the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an easing function and pose data. 12. A method of adjusting a position of a virtual reticle identifying a position within a field of view (FOV) corresponding to a display of a display system, the method comprising: identify a head pose of a user based at least in part on head pose data obtained from a head pose sensor; based at least in part on a determination that the head pose does not satisfy a first head pose threshold, cause the display to render a virtual reticle at a fixed location within the FOV, wherein while the head pose does not satisfy the first head pose threshold, the FOV changes as the head pose changes; and based at least in part on a determination that the head pose satisfies the first head pose threshold and does not satisfy a second head pose threshold, cause the display to render the virtual reticle at a location within the FOV that varies based at least in part on an easing function and the head pose of the user. Instant Application Reference Patent 20. The method of claim 15, wherein the virtual reticle comprises a movable indicator identifying a position within the FOV. 15. The method of claim 12, wherein the virtual reticle comprises a movable indicator identifying a position within the FOV. Instant Application Reference Patent 21.
The method of claim 15, wherein the head pose is determined in an axial plane of the user, a coronal plane of the user, and a sagittal plane of the user. 16. The method of claim 12, wherein the first head pose threshold corresponds to at least one of −50 degrees to 50 degrees relative to an axial plane of the user, −20 degrees to 20 degrees relative to a coronal plane of the user, or −45 degrees to 5 degrees relative to a sagittal plane of the user. Claim(s) 1, 2-9, 13-16, 20 and 21 is/are rejected on the ground of nonstatutory double patenting as being unpatentable over claim(s) 1-3, 5, 26, 27 and 30 of U.S. Patent No. US 10839576 B2 (Reference Patent). Although the claims at issue are not identical, they are not patentably distinct from each other because the claims recite essentially the same structure and perform essentially the same function, and are therefore unpatentable under obviousness-type double patenting. The following table illustrates the conflicting claim pairs:

Instant Appl.    Reference Patent
1                1, 3 & 26
2                1
6                2
7                5
8                1, 3 & 26
9                1
13               2
14               5
15               30, 3, 26 & 27
16               30
20               2
21               5

Claims of the instant application are compared to claims of Reference Patent in the following tables. Instant Application Reference Patent 1.
A system comprising: a sensor configured to obtain an orientation of a head and/or eyes of a user of the system; non-transitory memory configured to store orientation data; a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes; a hardware processor in communication with the sensor, the display, and the non-transitory memory, the hardware processor programmed to: recognize an orientation of the user's head and/or eyes; determine whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses; and at least partly in response to determining that the orientation of the user's head and/or eyes are outside of the range of acceptable head and/or eye poses, shifting movement of the virtual reticle away from a default position and toward a position in a direction of the user's head and/or eye movement. 1. 
A system comprising: a head pose sensor configured to obtain head pose data of a user of the system; non-transitory memory configured to store the head pose data; a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes; a hardware processor in communication with the head pose sensor, the display, and the non-transitory memory, the hardware processor programmed to: obtain the head pose data of the user; identify a head pose of the user based at least in part on the head pose data; determine a difference between the head pose of the user and a reference head pose; while the difference does not satisfy a first threshold, cause the display to update the FOV at least in response to changes in the head pose and maintain the virtual reticle at a default location; and while the difference satisfies the first threshold, cause the display to render the virtual reticle at a location within the FOV that varies based on the head pose of the user. 3. The system of claim 1, wherein the head pose data corresponds to at least one of an indication of a yaw, a pitch, or a roll of a head of the user. 26. The system of claim 1, wherein the head pose corresponds to head pose that is offset from a natural resting state of the head of the user by a threshold amount, wherein to cause the virtual reticle to change in position within the FOV, the hardware processor programmed to cause the virtual reticle to move, from a default position in the FOV, in a direction corresponding to a direction of head movement. Instant Application Reference Patent 2. The system of claim 1, wherein determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses comprises determining whether the acceptable head and/or eye poses satisfies a first orientation threshold. 1. 
A system comprising: a head pose sensor configured to obtain head pose data of a user of the system; non-transitory memory configured to store the head pose data; a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes; a hardware processor in communication with the head pose sensor, the display, and the non-transitory memory, the hardware processor programmed to: obtain the head pose data of the user; identify a head pose of the user based at least in part on the head pose data; determine a difference between the head pose of the user and a reference head pose; while the difference does not satisfy a first threshold, cause the display to update the FOV at least in response to changes in the head pose and maintain the virtual reticle at a default location; and while the difference satisfies the first threshold, cause the display to render the virtual reticle at a location within the FOV that varies based on the head pose of the user. Instant Application Reference Patent 6. The system of claim 1, wherein the virtual reticle comprises a movable indicator identifying a position within the FOV. 2. The system of claim 1, wherein the virtual reticle comprises a movable indicator identifying a position of the user within the FOV. Instant Application Reference Patent 7. The system of claim 1, wherein the head pose is determined in an axial plane of the user, a coronal plane of the user, and a sagittal plane of the user. 5. The system of claim 4, wherein the level head pose comprises a head pose in which a coronal plane of the head of the user, a sagittal plane of the head of the user, and an axial plane of the head of the user are each orthogonal to one another. Instant Application Reference Patent 8. 
A system comprising: a sensor configured to obtain an orientation of a head and/or eyes of a user of the system; non-transitory memory configured to store orientation data; a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes; a hardware processor in communication with the sensor, the display, and the non-transitory memory, the hardware processor programmed to: recognize an orientation of the user's head and/or eyes; determine whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses; at least partly in response to determining that the orientation of the user's head and/or eyes are outside of the range of acceptable head and/or eye poses, shifting the virtual reticle away from a default position and toward a position in a direction of the user's head and/or eye movement; and at least partly in response to determining that the orientation of the user's head and/or eyes are within the range of acceptable head and/or eye poses causing the display to render a virtual reticle at a fixed location within the FOV. 1. 
A system comprising: a head pose sensor configured to obtain head pose data of a user of the system; non-transitory memory configured to store the head pose data; a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes; a hardware processor in communication with the head pose sensor, the display, and the non-transitory memory, the hardware processor programmed to: obtain the head pose data of the user; identify a head pose of the user based at least in part on the head pose data; determine a difference between the head pose of the user and a reference head pose; while the difference does not satisfy a first threshold, cause the display to update the FOV at least in response to changes in the head pose and maintain the virtual reticle at a default location; and while the difference satisfies the first threshold, cause the display to render the virtual reticle at a location within the FOV that varies based on the head pose of the user. 3. The system of claim 1, wherein the head pose data corresponds to at least one of an indication of a yaw, a pitch, or a roll of a head of the user. 26. The system of claim 1, wherein the head pose corresponds to head pose that is offset from a natural resting state of the head of the user by a threshold amount, wherein to cause the virtual reticle to change in position within the FOV, the hardware processor programmed to cause the virtual reticle to move, from a default position in the FOV, in a direction corresponding to a direction of head movement. Instant Application Reference Patent 9. The system of claim 8, wherein determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses comprises determining whether the acceptable head and/or eye poses satisfies a first orientation threshold. 1.
A system comprising: a head pose sensor configured to obtain head pose data of a user of the system; non-transitory memory configured to store the head pose data; a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes; a hardware processor in communication with the head pose sensor, the display, and the non-transitory memory, the hardware processor programmed to: obtain the head pose data of the user; identify a head pose of the user based at least in part on the head pose data; determine a difference between the head pose of the user and a reference head pose; while the difference does not satisfy a first threshold, cause the display to update the FOV at least in response to changes in the head pose and maintain the virtual reticle at a default location; and while the difference satisfies the first threshold, cause the display to render the virtual reticle at a location within the FOV that varies based on the head pose of the user. Instant Application Reference Patent 13. The system of claim 8, wherein the virtual reticle comprises a movable indicator identifying a position within the FOV. 2. The system of claim 1, wherein the virtual reticle comprises a movable indicator identifying a position of the user within the FOV. Instant Application Reference Patent 14. The system of claim 8, wherein the head pose is determined in an axial plane of the user, a coronal plane of the user, and a sagittal plane of the user. 5. The system of claim 4, wherein the level head pose comprises a head pose in which a coronal plane of the head of the user, a sagittal plane of the head of the user, and an axial plane of the head of the user are each orthogonal to one another. Instant Application Reference Patent 15. 
A method of adjusting a position of a virtual reticle identifying a position within a field of view (FOV) corresponding to a display of a display system, the method comprising: recognizing an orientation of a user's head and/or eyes based at least on data from one or more sensors; determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses; at least partly in response to determining that the orientation of the user's head and/or eyes are outside of the range of acceptable head and/or eye poses, shifting movement of the virtual reticle away from a default position and toward a position in a direction of the user's head and/or eye movement; and at least partly in response to determining, that the orientation of the user's head and/or eyes are within the range of acceptable head and/or eye poses causing the display to render a virtual reticle at a fixed location within the FOV. 30. A method of adjusting a position of a virtual reticle identifying a position of a user within a field of view (FOV) corresponding to a display of a display system, the method comprising: obtaining head pose data of a user of a head-mounted display system, wherein the head-mounted display system projects a virtual reticle toward an eye of the user within a FOV of the user, wherein the FOV changes as a head pose of the user changes; identifying the head pose of the user based at least in part on the head pose data; determining a difference between the head pose of the user and a reference head pose; while the difference does not satisfy a first threshold, causing the display to update the FOV at least in response to changes in the head pose and maintaining the virtual reticle at a default location; and while the difference satisfies the first threshold, causing the display to render the virtual reticle at a location within the FOV that varies based on the head pose of the user. 3. 
The system of claim 1, wherein the head pose data corresponds to at least one of an indication of a yaw, a pitch, or a roll of a head of the user. 26. The system of claim 1, wherein the head pose corresponds to head pose that is offset from a natural resting state of the head of the user by a threshold amount, wherein to cause the virtual reticle to change in position within the FOV, the hardware processor programmed to cause the virtual reticle to move, from a default position in the FOV, in a direction corresponding to a direction of head movement. 27. The system of claim 1, wherein the hardware processor is further configured to: while the difference satisfies a second threshold and does not satisfy the first threshold, cause the display to render the virtual reticle at a fixed location within the FOV. Instant Application Reference Patent 16. The method of claim 15, wherein determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses comprises determining whether the acceptable head and/or eye poses satisfies a first orientation threshold. 30. 
A method of adjusting a position of a virtual reticle identifying a position of a user within a field of view (FOV) corresponding to a display of a display system, the method comprising: obtaining head pose data of a user of a head-mounted display system, wherein the head-mounted display system projects a virtual reticle toward an eye of the user within a FOV of the user, wherein the FOV changes as a head pose of the user changes; identifying the head pose of the user based at least in part on the head pose data; determining a difference between the head pose of the user and a reference head pose; while the difference does not satisfy a first threshold, causing the display to update the FOV at least in response to changes in the head pose and maintaining the virtual reticle at a default location; and while the difference satisfies the first threshold, causing the display to render the virtual reticle at a location within the FOV that varies based on the head pose of the user. Instant Application Reference Patent 20. The method of claim 15, wherein the virtual reticle comprises a movable indicator identifying a position within the FOV. 2. The system of claim 1, wherein the virtual reticle comprises a movable indicator identifying a position of the user within the FOV. Instant Application Reference Patent 21. The method of claim 15, wherein the head pose is determined in an axial plane of the user, a coronal plane of the user, and a sagittal plane of the user. 5. The system of claim 4, wherein the level head pose comprises a head pose in which a coronal plane of the head of the user, a sagittal plane of the head of the user, and an axial plane of the head of the user are each orthogonal to one another. Claim Rejections - 35 USC § 102 The following is a quotation of the appropriate paragraphs of 35 U.S.C. 
102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. Claim(s) 1-4, 6-11, 13-18, 20 and 21 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ballard et al. (US 20150153826 A1), referred herein as Ballard. Regarding Claim 1, Ballard teaches a system comprising: a sensor configured to obtain an orientation of a head and/or eyes of a user of the system (Ballard [0006] The wearable device may include a display; at least one sensor configured to provide an output indicative of an orientation of a head of the user); non-transitory memory configured to store orientation data (Ballard [0081] AR device 200 may include a number of features relating to sensory input and sensory output. AR device 200 may include at least a front facing camera 203 to provide visual (e.g., video) input, a display (e.g., a translucent or a stereoscopic translucent display) 204 to provide a medium for displaying computer-generated information to the user, a microphone 205 to provide sound input and audio buds/speakers 206 to provide sound output. In some embodiments, the visually conveyed digital data may be received by AR device 200 through the front facing camera 203); a display configured to be positioned in front of an eye of a user, and configured to project a virtual reticle toward the eye of the user within a field of view (FOV) of the user, wherein the FOV changes as a head pose of the user changes (Ballard [0084] Additionally, in this embodiment, AR device 200 may rely on a computer software application to instruct the glasses to render virtual objects on the display field of view. Virtual objects include, but are not limited to, text, images, models, icons.
The user may view or interact with virtual objects using the hardware and software application associated with the AR glasses 200; [0186] FIG. 10 illustrates an example of accessing a nested menu that is displayed by an AR device consistent with disclosed embodiments. As shown in FIG. 10, a reticle 1001 may be shown on display 204. Reticle 1001 may constitute a virtual reticle shown on display 204 whose position on display 204 may be changed in response to user input. For example, one or more eye tracking sensors, as described above, may enable tracking of a user's gaze direction, and the position of reticle 1001 on display 204 may be changed with determined changes in the user's gaze direction); a hardware processor in communication with the sensor, the display, and the non-transitory memory, the hardware processor programmed to (Ballard [0069] The processor device 123 may be configured to execute software instructions to perform aspects of the disclosed embodiments. User system 120 may be configured in the form of an AR device, such as a head mounted display (HMD).): recognize an orientation of the user's head and/or eyes (Ballard [0007] monitoring, based on output of at least one sensor, an orientation of a head of the user; determining based on the monitored orientation of the head whether the user is looking upward or downward with respect to a predetermined horizontal threshold); determine whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses (Ballard [0007] determining based on the monitored orientation of the head whether the user is looking upward or downward with respect to a predetermined horizontal threshold; and causing the virtual menu to be shown on a display of the wearable device if the user is determined to be looking upward or downward with respect to the predetermined horizontal threshold); and at least partly in response to determining that the orientation of the user's head and/or 
eyes are outside of the range of acceptable head and/or eye poses, shifting movement of the virtual reticle away from a default position and toward a position in a direction of the user's head and/or eye movement (Ballard [0245] FIG. 17A, a cursor 1702 is also displayed within graphical display 1700a. Cursor 1702 may be an icon, a trackable reticle, a pointer, or any other such cursor known in the art of computer graphics; [0246] in FIG. 17B, user 1501 has moved cursor 1702 over one of the points displayed in graphical display 1700b). Regarding Claim 2, Ballard teaches the system of claim 1, and further teaches wherein determining whether the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses comprises determining whether the acceptable head and/or eye poses satisfies a first orientation threshold (Ballard [0105] determine whether the user is looking upward or downward with respect to a predetermined horizontal threshold. If the user is determined to be looking upward or downward with respect to the predetermined horizontal threshold). Regarding Claim 3, Ballard teaches the system of claim 1, and further teaches wherein while the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an offset and pose data (Ballard [0245] Detection of user movement and translation of the movement into cursor motion on display 204 may be processed by rendering services module 370, visual processing module 374, and positional processing module 378. To enter the password to unlock AR device 200, user 1501 may move or orient cursor 1702 over specific points within graphical display 1700a for specific periods of time). The cursor moves from the position in FIG. 17A to the position in FIG. 17B.
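As a visual aid for the threshold behavior mapped in the rejections of claims 1-3 (reticle fixed at a default location while the head pose is within the acceptable range, then shifted by an offset derived from pose data once outside it), the claimed logic can be sketched in a few lines of Python. This is an illustrative sketch only, not code from the application or from Ballard; the function name, the pitch range, and the offset gain are all hypothetical.

```python
# Hypothetical sketch of the claimed reticle behavior; names and values are
# illustrative, not taken from the application or the cited references.

ACCEPTABLE_PITCH_RANGE = (-20.0, 20.0)  # degrees; hypothetical threshold
DEFAULT_POSITION = (0.0, 0.0)           # reticle default, centered in the FOV
OFFSET_GAIN = 0.01                      # hypothetical degrees-to-FOV scale

def reticle_position(pitch_deg):
    """Return a reticle (x, y) location in the FOV for a given head pitch."""
    lo, hi = ACCEPTABLE_PITCH_RANGE
    if lo <= pitch_deg <= hi:
        # Pose within the acceptable range: the reticle stays at its default
        # location while the FOV itself tracks head movement.
        return DEFAULT_POSITION
    # Pose outside the range: shift the reticle from the default position in
    # the direction of head movement, by an offset based on the pose data.
    excess = pitch_deg - hi if pitch_deg > hi else pitch_deg - lo
    return (DEFAULT_POSITION[0], DEFAULT_POSITION[1] + OFFSET_GAIN * excess)
```

In this reading, looking up past the acceptable range shifts the reticle upward from its default position, and looking down shifts it downward, tracking the direction of head movement.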
Regarding Claim 4, Ballard teaches the system of claim 1, and further teaches wherein when the orientation of the user's head and/or eyes are outside of a range of acceptable head and/or eye poses, a location of the virtual reticle is based at least in part on an easing function and pose data (Ballard [0110] the processing device may be configured to monitor both the orientation of the head of user 401 and the time duration that the head of user 401 stays in that orientation based on output received from the sensor. For example, if the time duration that the head of user 401 stays in a detected orientation is less than a predetermined time threshold, the processing device may determine that user 401 does not intend the detected orientation to cause AR device 200 to take an action. On the other hand, if the time duration that the head of user 401 stays in a detected orientation is greater than or equal to the predetermined time threshold, the processing device may determine that user 401 intends the detected orientation to cause AR device 200 to take an action). Regarding Claim 6, Ballard teaches the system of claim 1, and further teaches wherein the virtual reticle comprises a movable indicator identifying a position within the FOV (Ballard [0110] enable tracking of a user's gaze direction, and the position of reticle 1001 on display 204 may be changed with determined changes in the user's gaze direction). 
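The "easing function" recited in claim 4 (and in claims 12 and 19) is not tied to any particular curve by the claim language. Purely as an illustration, a conventional smoothstep easing applied between two hypothetical pose thresholds could look like the sketch below; the names and threshold values are assumptions, not drawn from the application or Ballard.

```python
# Illustrative only: a conventional smoothstep curve standing in for the
# unspecified "easing function"; all names and values are hypothetical.

FIRST_THRESHOLD = 20.0   # degrees at which the reticle begins to move
SECOND_THRESHOLD = 50.0  # degrees at which the reticle reaches full offset
MAX_OFFSET = 0.5         # hypothetical maximum reticle offset in the FOV

def smoothstep(t):
    """Classic smoothstep: 0 at t=0, 1 at t=1, with zero slope at both ends."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def eased_offset(pose_deg):
    """Eased reticle offset for a pose between the first and second thresholds."""
    t = (abs(pose_deg) - FIRST_THRESHOLD) / (SECOND_THRESHOLD - FIRST_THRESHOLD)
    return MAX_OFFSET * smoothstep(t)
```

The zero slope at both ends is the usual motivation for easing in this context: the reticle pulls away from its default position gradually rather than jumping the moment the first threshold is crossed.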
Regarding Claim 7, Ballard teaches the system of claim 1, and further teaches wherein the head pose is determined in an axial plane of the user, a coronal plane of the user, and a sagittal plane of the user (Ballard [0106] the orientation of the head of user 401 may be defined according to a coordinate system, such as a three-dimensional coordinate system (x, y, z) having the origin at a point on AR device 200, such as a central point of display 204, the position at which IMU 201 is located, or any other reference point on AR device 200; [0236] In the example illustrated in FIG. 15, user 1501's head (and by extension, AR device 200) is initially oriented at position 1502. As described above, position 1502 may be detected and processed in many ways within AR device 200, including as an angle relative to horizontal or vertical planes, or as a set of coordinates denoting the position of AR device 200 within a three-dimensional coordinate system, as in the example shown in FIG. 15. As AR device 200 progressively samples the orientation and position of the device, user 1501 changes the orientation of his or her head). Regarding Claims 8-11, 13 and 14, Ballard teaches a system comprising: at least partly in response to determining that the orientation of the user's head and/or eyes are within the range of acceptable head and/or eye poses causing the display to render a virtual reticle at a fixed location within the FOV (Ballard [0111] If the detected orientation of the head of user 401 is greater than or equal to the predetermined horizontal threshold, the processing device may be configured to determine that the user is looking upward. On the other hand, if the detected orientation of the head of user 401 is less than the predetermined horizontal threshold, the processing device may be configured to determine that the user is not looking upward). 
The metes and bounds of the rest of the claims substantially correspond to the limitations set forth in claims 1-4, 6 and 7; thus they are rejected on similar grounds and rationale as their corresponding limitations. Regarding Claims 15-18, 20 and 21, Ballard teaches a method. The metes and bounds of the claims substantially correspond to the limitations set forth in claims 8-11, 13 and 14; thus they are rejected on similar grounds and rationale as their corresponding limitations. Claim Rejections - 35 USC § 103 In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. Claim(s) 5, 12 and 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ballard et al. (US 20150153826 A1), referred herein as Ballard in view of Naganawa (US 20140184641 A1), referred herein as Naganawa. Regarding Claim 5, Ballard teaches the system of claim 1, but does not teach the limitations herein. 
However, Naganawa teaches wherein while the orientation of the user's head and/or eyes are outside of a far end limit of the range of acceptable head and/or eye poses, the hardware processor is programmed to cause the display to maintain the virtual reticle at a fixed offset position within the FOV that is offset from the default position (Naganawa [0059]: Further, in the state that it has been decided in the step S403 that the client apparatus is in the high-speed movement (e.g., equal to or higher than 6 cm per second), the control unit 303 considers the marker detection area 102b as an area which is further narrower than that illustrated in FIG. 7. Besides, in the case of the high-speed movement, it is also possible to consider that the user does not keep close watch on the screen 102a and thus not to display any content information; [0060]: In a step S407, it is detected whether or not the marker exists in the marker detection area (target area) 102b in the image which has been imaged by the imaging device 209. When it is detected that the marker exists in the marker detection area, then, in a step S408, the content information corresponding or related to the detected marker is superposed on the imaged image). The moving distance can be derived from the speed.

Naganawa discloses an information display apparatus which superposes information on an imaged image and displays it, which is analogous to the present patent application. It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified Ballard to incorporate the teachings of Naganawa, applying Naganawa's determination of the marker detection area from the distance/speed of the user's view to the head-movement threshold of Ballard. Doing so would reduce processing loads by limiting the content information to be displayed.

Regarding Claim 12, Ballard teaches the system of claim 8.
The metes and bounds of the claims substantially correspond to the limitations set forth in claim 5; thus they are rejected on similar grounds and rationale as their corresponding limitations.

Regarding Claim 19, Ballard teaches the method of claim 15. The metes and bounds of the claims substantially correspond to the limitations set forth in claim 5; thus they are rejected on similar grounds and rationale as their corresponding limitations.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Samantha (Yuehan) Wang, whose telephone number is (571) 270-5011. The examiner can normally be reached Monday-Friday, 8am-5pm.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, King Poon, can be reached at (571) 272-7440. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Samantha (YUEHAN) WANG/
Primary Examiner, Art Unit 2617
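The speed-gated marker detection that the §103 rejection draws from Naganawa [0059]-[0060] can be sketched as follows. Only the 6 cm/s figure comes from the citation; the frame size, area scale factors, and all names are illustrative assumptions.

```python
# Naganawa [0059]: at high-speed movement (>= 6 cm/s in the cited
# example) the marker detection (target) area 102b is narrowed.
# Naganawa [0060]: content is superposed only when a marker falls
# inside that area. Scale factors below are illustrative.
HIGH_SPEED_CM_PER_S = 6.0

def detection_area(frame_w: int, frame_h: int, speed_cm_s: float):
    """Return a centered (x, y, w, h) target area, narrower at high speed."""
    scale = 0.3 if speed_cm_s >= HIGH_SPEED_CM_PER_S else 0.6  # assumed sizes
    w, h = int(frame_w * scale), int(frame_h * scale)
    return ((frame_w - w) // 2, (frame_h - h) // 2, w, h)

def marker_in_area(marker_xy, area) -> bool:
    """True when the detected marker position lies inside the target area."""
    x, y = marker_xy
    ax, ay, aw, ah = area
    return ax <= x < ax + aw and ay <= y < ay + ah

# A marker near the frame edge is inside the wide low-speed area but
# outside the narrowed high-speed area, so its content is not rendered
# while the user moves quickly.
slow_area = detection_area(640, 480, 2.0)
fast_area = detection_area(640, 480, 8.0)
print(marker_in_area((180, 140), slow_area),
      marker_in_area((180, 140), fast_area))  # True False
```

Shrinking the area at speed is what yields the examiner's asserted benefit: fewer markers qualify for superposition, so less content is processed and displayed.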

Prosecution Timeline

Jul 29, 2024
Application Filed
Feb 02, 2026
Non-Final Rejection — §102, §103, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597178
VECTOR OBJECT PATH SEGMENT EDITING
2y 5m to grant Granted Apr 07, 2026
Patent 12597506
ENDOSCOPIC EXAMINATION SUPPORT APPARATUS, ENDOSCOPIC EXAMINATION SUPPORT METHOD, AND RECORDING MEDIUM
2y 5m to grant Granted Apr 07, 2026
Patent 12586286
DIFFERENTIABLE REAL-TIME RADIANCE FIELD RENDERING FOR LARGE SCALE VIEW SYNTHESIS
2y 5m to grant Granted Mar 24, 2026
Patent 12586261
IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND STORAGE MEDIUM
2y 5m to grant Granted Mar 24, 2026
Patent 12567182
USING AUGMENTED REALITY TO VISUALIZE OPTIMAL WATER SENSOR PLACEMENT
2y 5m to grant Granted Mar 03, 2026


Prosecution Projections

1-2
Expected OA Rounds
83%
Grant Probability
96%
With Interview (+12.9%)
2y 7m
Median Time to Grant
Low
PTA Risk
Based on 485 resolved cases by this examiner. Grant probability derived from career allow rate.
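The headline figures above follow directly from the cited career data (404 granted of 485 resolved, +12.9 percentage-point interview lift). How the tool combines the two is not stated; the capped sum below is an assumption for illustration.

```python
# Career data cited on this page.
granted, resolved = 404, 485
interview_lift = 0.129  # observed lift in resolved cases with an interview

base_rate = granted / resolved                      # career allow rate
with_interview = min(base_rate + interview_lift, 1.0)  # assumed capped sum

print(round(base_rate * 100))       # 83
print(round(with_interview * 100))  # 96
```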
