Prosecution Insights
Last updated: April 19, 2026
Application No. 19/243,024

Point-and-select system and methods

Status: Non-Final OA (§102, §103, and nonstatutory double patenting)
Filed: Jun 19, 2025
Examiner: MCLOONE, PETER D
Art Unit: 2621
Tech Center: 2600 — Communications
Assignee: Doublepoint Technologies OY
OA Round: 1 (Non-Final)
Grant Probability: 83% (Favorable)
OA Rounds: 1-2
To Grant: 1y 11m
With Interview: 86%

Examiner Intelligence

Career Allow Rate: 83%, above average (481 granted / 581 resolved; +20.8% vs TC avg)
Interview Lift: +2.7%, a minimal (~3%) lift, measured over resolved cases with interview
Avg Prosecution: 1y 11m, a fast prosecutor; 23 applications currently pending
Career History: 604 total applications across all art units

Statute-Specific Performance

§101: 1.3% (-38.7% vs TC avg)
§103: 52.1% (+12.1% vs TC avg)
§102: 35.8% (-4.2% vs TC avg)
§112: 3.0% (-37.0% vs TC avg)
TC-average comparisons are estimates. Based on career data from 581 resolved cases.

Office Action

Grounds: §102, §103, and nonstatutory double patenting
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Objections

Claim 21 is objected to because of the following informalities: Regarding claim 21, reference is made to "the normalized gravity vector" but no such claim limitation appears in claim 21 or parent claim 1. It will be assumed for examination that claim 21 should depend on claim 2, which recites "a normalized gravity vector." Appropriate correction is required.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission.
For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 1-5, 7-9, 11-13, 15-22, and 25 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-3, 5-7, 9-11, 13-20, 22, and 23 of U.S. Patent No. 12360614 B2. Although the claims at issue are not identical, they are not patentably distinct from each other because the differences are a simple reorganization of claim dependencies.

Claim chart, App No. 19/243,024 vs. U.S. Patent No. 12360614:

App claims 1 and 2 vs. patent claim 1.
App claim 1: A system comprising: a wrist-wearable apparatus comprising: a mounting component, a controller comprising a processing core, at least one memory including computer program code; and a wrist-wearable IMU configured to measure a user, wherein the system is configured to: receive data from the wrist-wearable IMU, the data comprising gravity information, receive data from at least one head sensor configured to measure the user, said data comprising a head orientation for the user, compute, based on the gravity information and based at least in part on the data received from the head sensor, a yaw component of a directive ray, compute a pitch component of the directive ray, wherein computing the pitch component is based at least in part on the gravity information, and compute the directive ray, wherein the directive ray is based on a combination of the computed yaw component and computed pitch component.
App claim 2: The system of claim 1, wherein the system is configured to: compute, based at least in part on the gravity information, a normalized gravity vector, compute, based on the normalized gravity vector and based at least in part on the data received from the head sensor, the yaw component of the directive ray, and compute the pitch component of the directive ray based at least in part on the computed normalized gravity vector.
Patent claim 1: A system comprising: a wrist-wearable apparatus comprising: a mounting component, a controller comprising a processing core, at least one memory including computer program code; and a wrist-wearable IMU configured to measure a user, wherein the system is configured to: receive data from the wrist-wearable IMU, the data comprising gravity information, compute, based at least in part on the gravity information, a normalized gravity vector, receive data from at least one head sensor configured to measure the user, said data comprising a head orientation for the user, compute, based on the computed normalized gravity vector and based at least in part on the data received from the head sensor, a yaw component of a directive ray, compute a pitch component of the directive ray, wherein computing the pitch component is based at least in part on the computed normalized gravity vector, and compute the directive ray, wherein the directive ray is based on a combination of the computed yaw component and computed pitch component.

App claims 1 and 3 vs. patent claim 23 (app claim 1 is reproduced in the preceding row).
App claim 3: The system of claim 1, wherein the system is configured to: compute the pitch component of the directive ray using a Madgwick filter configured to bias angular velocity correctness over orientation correctness.
Patent claim 23: A system comprising: a wrist-wearable apparatus comprising: a mounting component, a controller comprising a processing core, at least one memory including computer program code; and a wrist-wearable IMU configured to measure a user, wherein the system is configured to: receive data from the wrist-wearable IMU, the data comprising gravity information, receive data from at least one head sensor configured to measure the user, said data comprising a head orientation for the user, compute, based on the gravity information and based at least in part on the data received from the head sensor, a yaw component of a directive ray, compute a pitch component of the directive ray using a Madgwick filter configured to bias angular velocity correctness over orientation correctness, and compute the directive ray, wherein the directive ray is based on a combination of the computed yaw component and computed pitch component.

App claim 4 and patent claim 2 (identical): The system of claim 1, wherein the system further comprises a head apparatus comprising: a mounting component, a display unit, and a head apparatus controller, and wherein at least one of the head apparatus controller or the wrist-wearable apparatus controller is configured to receive the wrist-wearable IMU data and the head sensor data and to compute the yaw component, the pitch component and the directive ray.

App claim 5 vs. patent claim 3.
App claim 5: The system of claim 4, wherein at least one of the head apparatus controller or the wrist-wearable apparatus controller is configured to provide a data stream representing the directive ray a head apparatus and/or a computing device.
Patent claim 3: The system of claim 2, wherein at least one of the head apparatus controller or the wrist-wearable apparatus controller is configured to provide a data stream representing the directive ray to a head apparatus and/or a computing device.

App claim 7 and patent claim 5 (identical): The system of claim 1, wherein the computation of the directive ray is further based on contextual information received from a scene.

App claim 8 and patent claim 6 (identical): The system of claim 1, wherein the system is further configured to: receive optical sensor data from the wrist-wearable apparatus, and identify a selection gesture based on at least one of: the received optical data or the received IMU data, wherein said identification is performed at least in part by a machine learning model, wherein the system comprises the machine learning model.

App claim 9 (patent claim 7 is identical but depends from claim 6): The system of claim 8, wherein the system is further configured to: enable or disable the gesture identification responsive to a classification of the hand's trajectory, wherein the classification of the hand trajectory is done by a ballistic/corrective phase classifier.

App claim 11 and patent claim 9 (identical): The system of claim 1, wherein system is further configured to adjust the sensitivity of the gesture identification based on received contextual information which comprises location information of an interactive element and/or proximity information with respect to the end-point of the directive ray and the location of the interactive element, wherein adjusting the sensitivity comprises adjusting a confidence value threshold using in the gesture identification.

App claim 12 and patent claim 10 (identical): The system of claim 1, wherein system is further configured to adjust the sensitivity of the gesture identification based on the trajectory of the directive ray, wherein adjusting the sensitivity comprises adjusting a confidence value threshold using the gesture identification.

App claim 13 and patent claim 11 (identical): The system of claim 1, wherein the system is further configured to align the directive ray to an origin point, wherein the coordinates of the origin point are determined by the orientation of the user's head or gaze (POV center) where the origin point equals the center of the user's field of view.

App claim 15 and patent claim 13 (identical): The system of claim 1, wherein the system is further configured to reduce disturbance caused by measurement errors of a user's pointing posture by implementing a filter to improve the accuracy of the directive ray.

App claim 16 and patent claim 14 (identical): The system of claim 1, wherein the system is further configured to adjust the sensitivity of a machine learning model based on a previous selection history of the user.

App claim 17 and patent claim 15 (identical): The system of claim 1, wherein the system further incorporates a feature extraction module configured to analyze the sensor data stream and identify additional user actions beyond selection gestures.

App claim 18 and patent claim 16 (identical): The system of claim 1, wherein the directive ray is visualized as a bended curve at least between an interactive element and the wrist-wearable IMU.

App claim 19 and patent claim 17 (identical): The system of claim 1, wherein elbow location of the arm is estimated, and wherein the directive ray is aligned with the estimated elbow location and the location of the wrist-wearable IMU.

App claim 20 and patent claim 18 (identical): The system of claim 1, wherein the orientation of the directive ray is corrected using eye-tracking information.

App claim 21 and patent claim 19 (identical): The system of claim 1, wherein the system is configured to re-compute, based at least in part on the normalized gravity vector and based at least in part on the data obtained from the at least one wrist-wearable IMU, a yaw component of the directive ray if the directive ray is substantially parallel or antiparallel with respect to the normalized gravity vector.

App claim 22 vs. patent claim 20.
App claim 22: A method for computing a directive ray, the method comprising: receiving data from at least one wrist-wearable IMU, the data comprising gravity information, receiving data from at least one head sensor, configured to measure the user, in particular a head orientation for the user, computing, based on the gravity information, and based at least in part on the data received from the head sensor, a yaw component of a directive ray, computing a pitch component of the directive ray, wherein computing the pitch is based at least in part on the received gravity information, and computing a directive ray, wherein the directive ray is based on a combination of the computed yaw component and computed pitch component.
Patent claim 20: A method for computing a directive ray, the method comprising: receiving data from at least one wrist-wearable IMU, the data comprising gravity information, computing, based at least in part on the gravity information, a normalized gravity vector, receiving data from at least one head sensor, configured to measure the user, in particular a head orientation for the user, computing, based on the normalized gravity vector, and based at least in part on the data received from the head sensor, a yaw component of a directive ray, computing a pitch component of the directive ray, wherein computing the pitch is based at least in part on the computed normalized gravity vector, and computing a directive ray, wherein the directive ray is based on a combination of the computed yaw component and computed pitch component.

App claim 25 vs. patent claim 22.
App claim 25: A non-transitory computer readable medium having stored thereon a set of computer readable instructions that, when executed by at least one processor, cause a system to at least: receive data from at least one wrist-wearable IMU, the data comprising gravity information, receiving data from at least one head sensor, configured to measure the user, in particular a head orientation for the user, compute, based at least in part on the received gravity information and based at least in part on the data received from the head sensor, a yaw component of a directive ray, compute a pitch component of the directive ray, wherein computing the pitch is based at least in part on the received gravity information, and compute a directive ray, wherein the directive ray is based on a combination of the computed yaw component and computed pitch component.
Patent claim 22: A non-transitory computer readable medium having stored thereon a set of computer readable instructions that, when executed by at least one processor, cause a system to at least: receive data from at least one wrist-wearable IMU, the data comprising gravity information, receiving data from at least one head sensor, configured to measure the user, in particular a head orientation for the user, compute, based at least in part on the gravity information, a normalized gravity vector, compute, based at least in part on the normalized gravity vector and based at least in part on the data received from the head sensor, a yaw component of a directive ray, compute a pitch component of the directive ray, wherein computing the pitch is based at least in part on the computed normalized gravity vector, and compute a directive ray, wherein the directive ray is based on a combination of the computed yaw component and computed pitch component.
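For orientation before the prior-art analysis: the independent claims recover the ray's pitch from the wrist IMU's gravity estimate and anchor its yaw to the head sensor, then combine the two angles into a single pointing ray. The sketch below is a hypothetical illustration of that computation, not the applicant's implementation; the axis conventions, signs, and function names are all assumptions. It also shows the normalized gravity vector recited in claim 2.

```python
import numpy as np

def normalized_gravity(accel):
    """Normalized gravity vector (claim 2): an accelerometer sample,
    assumed to be dominated by gravity while the arm is near-static,
    scaled to unit length."""
    g = np.asarray(accel, dtype=float)
    return g / np.linalg.norm(g)

def directive_ray(wrist_accel, head_yaw):
    """Combine a gravity-derived pitch with a head-anchored yaw into a
    unit pointing ray in a world frame whose z axis points up.

    wrist_accel: 3-axis wrist IMU accelerometer sample (any units).
    head_yaw: heading of the user's head from the head sensor, radians.
    """
    g = normalized_gravity(wrist_accel)

    # Pitch component: elevation of the forearm's pointing axis above
    # the horizontal, read off the gravity vector's projection onto the
    # sensor's forward (x) axis. The sign depends on the accelerometer
    # convention (gravity vs. reaction force); flip it if needed.
    pitch = np.arcsin(np.clip(-g[0], -1.0, 1.0))

    # Yaw component: a wrist IMU alone cannot observe rotation about
    # gravity, so heading is anchored to the head sensor, per claim 1.
    yaw = head_yaw

    # Combine the yaw and pitch components into the directive ray.
    return np.array([
        np.cos(pitch) * np.cos(yaw),
        np.cos(pitch) * np.sin(yaw),
        np.sin(pitch),
    ])
```

Claim 21's re-computation appears aimed at the degenerate case where the ray is nearly parallel or antiparallel to gravity, where yaw becomes ill-conditioned.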
Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 4, 5, 7, 8, 11-13, 16, 17, 19, 20, 22, and 25 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Khan et al. (US 11327630 B1, hereafter Khan).

Regarding claim 1, Khan teaches a system comprising: a wrist-wearable apparatus comprising: a mounting component (Fig. 1, Col. 8 ln. 28-64, where there is a smart watch 114), a controller comprising a processing core, at least one memory including computer program code (Fig. 1, Col. 8 ln. 28-64, where the smart watch has an IMU with the necessary processor and memory to perform processing and transmission of movement information); and a wrist-wearable IMU configured to measure a user (Fig. 1, Col. 8 ln. 28-64, where the smart watch has an IMU measuring user hand information), wherein the system is configured to: receive data from the wrist-wearable IMU, the data comprising gravity information (Col. 8 ln. 28-64, where three dimensions of angular acceleration information are detected and transmitted, including pitch, which necessarily includes gravity information), receive data from at least one head sensor configured to measure the user, said data comprising a head orientation for the user (Fig. 1, Col. 8 ln. 28-64, where the HMD unit 116 includes an HMD unit IMU that generates similar movement information of the user), compute, based on the gravity information and based at least in part on the data received from the head sensor, a yaw component of a directive ray (Fig. 1, Col. 8 line 65 to Col. 9 line 21, where a viewing direction is calculated based on angular information from the head unit IMU), compute a pitch component of the directive ray, wherein computing the pitch component is based at least in part on the gravity information (Fig. 1, Col. 8 line 65 to Col. 9 line 21, where a viewing direction is calculated based on angular information from the head unit IMU), and compute the directive ray, wherein the directive ray is based on a combination of the computed yaw component and computed pitch component (Fig. 1, Col. 8 line 65 to Col. 9 line 21, where the viewing and/or pointing direction are calculated based on angular information from the head unit IMU).

Regarding claim 4, Khan teaches the system of claim 1, wherein the system further comprises a head apparatus comprising: a mounting component, a display unit, and a head apparatus controller (Fig. 1, Col. 8 lines 28-64, where the head apparatus is a head-mounted display), and wherein at least one of the head apparatus controller or the wrist-wearable apparatus controller is configured to receive the wrist-wearable IMU data and the head sensor data and to compute the yaw component, the pitch component and the directive ray (Fig. 1, Col. 8 line 65 to Col. 9 line 21, where a viewing direction is calculated based on angular information from the head unit IMU and wrist unit IMU).

Regarding claim 5, Khan teaches the system of claim 1, wherein at least one of the head apparatus controller or the wrist-wearable apparatus controller is configured to provide a data stream representing the directive ray to a head apparatus and/or a computing device (Fig. 1, Col. 8 line 65 to Col. 9 line 21, where the viewing and/or pointing direction are calculated based on angular information from the head unit IMU, where the transmission of data is reasonably interpreted as a data stream).

Regarding claim 7, Khan teaches the system of claim 1, wherein the computation of the directive ray is further based on contextual information received from a scene (Col. 12, ln. 19-51, where the context is an extended reality environment and this is used to inform the virtual object selection tasks performed by the pointing ray).

Regarding claim 8, Khan teaches the system of claim 1, wherein the system is further configured to: receive optical sensor data from the wrist-wearable apparatus, and identify a selection gesture based on at least one of: the received optical data or the received IMU data, wherein said identification is performed at least in part by a machine learning model, wherein the system comprises the machine learning model (Col. 8 ln. 28-64, where the system includes cameras for sensing; Col. 13 ln. 25-45, where machine learning is used to determine intent of the user hand gesture).

Regarding claim 11, Khan teaches the system of claim 1, wherein system is further configured to adjust the sensitivity of the gesture identification based on received contextual information which comprises location information of an interactive element and/or proximity information with respect to the end-point of the directive ray and the location of the interactive element, wherein adjusting the sensitivity comprises adjusting a confidence value threshold using in the gesture identification (Col. 20 ln. 21-53, where virtual objects in the extended reality environment are tracked according to the pointing or destination location of the hand, the intent prediction system evaluating predictions using a confidence value).

Regarding claim 12, Khan teaches the system of claim 1, wherein system is further configured to adjust the sensitivity of the gesture identification based on the trajectory of the directive ray, wherein adjusting the sensitivity comprises adjusting a confidence value threshold using the gesture identification (Col. 20 ln. 21-53, where virtual objects in the extended reality environment are variably adjusted according to the pointing or destination location of the hand, the intent prediction system evaluating predictions using a confidence value).

Regarding claim 13, Khan teaches the system of claim 1, wherein the system is further configured to align the directive ray to an origin point, wherein the coordinates of the origin point are determined by the orientation of the user's head or gaze (POV center) where the origin point equals the center of the user's field of view (Figs. 7A and 7B, Col. 12 ln. 1-51, where a forward facing vector originating from a point between the eyes is defined as the origin).

Regarding claim 16, Khan teaches the system of claim 1, wherein the system is further configured to adjust the sensitivity of a machine learning model based on a previous selection history of the user (Fig. 3, Col. 11 ln. 32-47, where stored previous hand/head movement information is used to adjust a trajectory of a user input).

Regarding claim 17, Khan teaches the system of claim 1, wherein the system further incorporates a feature extraction module configured to analyze the sensor data stream and identify additional user actions beyond selection gestures (Col. 13, lines 34-63, where ongoing evaluation of the sensor data allows for the use of selection or other types of manipulation).

Regarding claim 19, Khan teaches the system of claim 1, wherein elbow location of the arm is estimated, and wherein the directive ray is aligned with the estimated elbow location and the location of the wrist-wearable IMU (Col. 11 ln. 48-67, where the vector from the hand is determined based on the vector from the elbow joint to the tip of the index finger).

Regarding claim 20, Khan teaches the system of claim 1, wherein the orientation of the directive ray is corrected using eye-tracking information (Col. 11 ln. 48-67, where eye gaze information is used for the vector from the head).

Regarding claim 22, Khan teaches a method for computing a directive ray, the method comprising: receiving data from at least one wrist-wearable IMU, the data comprising gravity information (Col. 8 ln. 28-64, where three dimensions of angular acceleration information are detected and transmitted, including pitch, which necessarily includes gravity information), receiving data from at least one head sensor, configured to measure the user, in particular a head orientation for the user (Fig. 1, Col. 8 ln. 28-64, where the HMD unit 116 includes an HMD unit IMU that generates similar movement information of the user), computing, based on the gravity information, and based at least in part on the data received from the head sensor, a yaw component of a directive ray (Fig. 1, Col. 8 line 65 to Col. 9 line 21, where a viewing direction is calculated based on angular information from the head unit IMU), computing a pitch component of the directive ray, wherein computing the pitch is based at least in part on the received gravity information (Fig. 1, Col. 8 line 65 to Col. 9 line 21, where a viewing direction is calculated based on angular information from the head unit IMU), and computing a directive ray, wherein the directive ray is based on a combination of the computed yaw component and computed pitch component (Fig. 1, Col. 8 line 65 to Col. 9 line 21, where the viewing and/or pointing direction are calculated based on angular information from the head unit IMU).

Regarding claim 25, Khan teaches a non-transitory computer readable medium having stored thereon a set of computer readable instructions that, when executed by at least one processor, cause a system to at least: receive data from at least one wrist-wearable IMU, the data comprising gravity information (Col. 8 ln. 28-64, where three dimensions of angular acceleration information are detected and transmitted, including pitch, which necessarily includes gravity information), receiving data from at least one head sensor, configured to measure the user, in particular a head orientation for the user (Fig. 1, Col. 8 ln. 28-64, where the HMD unit 116 includes an HMD unit IMU that generates similar movement information of the user), compute, based at least in part on the received gravity information and based at least in part on the data received from the head sensor, a yaw component of a directive ray (Fig. 1, Col. 8 line 65 to Col. 9 line 21, where a viewing direction is calculated based on angular information from the head unit IMU), compute a pitch component of the directive ray, wherein computing the pitch is based at least in part on the received gravity information (Fig. 1, Col. 8 line 65 to Col. 9 line 21, where a viewing direction is calculated based on angular information from the head unit IMU), and compute a directive ray, wherein the directive ray is based on a combination of the computed yaw component and computed pitch component (Fig. 1, Col. 8 line 65 to Col. 9 line 21, where the viewing and/or pointing direction are calculated based on angular information from the head unit IMU).
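Claims 11 and 12, mapped above onto Khan's confidence values, describe tuning gesture "sensitivity" by moving a confidence threshold in response to context such as the distance between the ray's endpoint and an interactive element. A hypothetical sketch of that mechanism follows; every name, radius, and value here is invented for illustration and is not from the application or Khan.

```python
import numpy as np

def adjusted_threshold(base_threshold, ray_endpoint, element_location,
                       near_radius=0.15, max_reduction=0.2):
    """Lower the confidence threshold as the ray endpoint approaches an
    interactive element (distances in meters; all values assumed)."""
    distance = np.linalg.norm(np.asarray(ray_endpoint, dtype=float)
                              - np.asarray(element_location, dtype=float))
    # Closeness in [0, 1]: 1 when touching the element, 0 beyond near_radius.
    closeness = max(0.0, 1.0 - distance / near_radius)
    return base_threshold - max_reduction * closeness

def is_selection(model_confidence, threshold):
    """Fire the selection gesture only when the gesture model's
    confidence clears the contextually adjusted threshold."""
    return model_confidence >= threshold

# Example: a 0.8 base threshold relaxes toward 0.6 near the target,
# so a 0.72-confidence gesture fires only when aimed close to it.
t = adjusted_threshold(0.8, ray_endpoint=[0.0, 0.0, 1.0],
                       element_location=[0.05, 0.0, 1.0])
print(round(t, 3), is_selection(0.72, t))
```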
Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Khan et al. (US 11327630 B1, hereafter Khan) in view of Vogel et al. (US 20180088675 A1, hereafter Vogel). Regarding claim 2, Khan shows the system of claim 1, but Khan does not explicitly teach the system wherein the system is configured to: compute, based at least in part on the gravity information, a normalized gravity vector, compute, based on the normalized gravity vector and based at least in part on the data received from the head sensor, the yaw component of the directive ray, and compute the pitch component of the directive ray based at least in part on the computed normalized gravity vector. However, this was well known in the art, as evidenced by Vogel ([0036]-[0040], where a normalized gravity vector is used when determining the directive ray based on information from a wearable device, the information including 3-axis gyroscope and 3-axis accelerometer inputs). Both Khan and Vogel teach wearable devices comprising acceleration sensors for determining a directive ray. Khan is silent with respect to the use of a normalized gravity vector in calculations. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use normalization of a vector as taught by Vogel to simplify calculations.

Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Khan in view of Hu et al. (US 20230297167 A1, hereafter Hu). Regarding claim 3, Khan teaches the system of claim 1, but Khan does not explicitly teach the system wherein the system is configured to: compute the pitch component of the directive ray using a Madgwick filter configured to bias angular velocity correctness over orientation correctness. However, this was well known in the art, as evidenced by Hu ([0051]-[0052], where input data received from movement sensors is processed using a Madgwick orientation filter). Both Khan and Hu teach wearable devices which detect orientation information. Khan is silent with respect to the use of a Madgwick orientation filter. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate a Madgwick orientation filter as taught by Hu into the processing of Khan so as to improve sensor data determinations.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to PETER D MCLOONE, whose telephone number is (571) 272-4631. The examiner can normally be reached M-F 9 AM - 5 PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, LunYi Lao, can be reached at (571) 272-7671. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/PETER D MCLOONE/
Primary Examiner, Art Unit 2621
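A note on the Madgwick filter at issue in the claim 3 rejection: in Madgwick's published algorithm, a gain (beta) sets how strongly the accelerometer-derived orientation corrects the integrated gyroscope, so "biasing angular velocity correctness over orientation correctness" corresponds to a small beta. The single-step sketch below is written from the published equations, not from Khan, Hu, or the application.

```python
import numpy as np

def madgwick_imu_update(q, gyro, accel, dt, beta=0.01):
    """One step of Madgwick's IMU filter (quaternion q = [w, x, y, z]).

    gyro: angular rate in rad/s; accel: accelerometer sample (any scale).
    A small beta applies only a weak accelerometer correction, trusting
    the integrated gyroscope instead, i.e. the bias described in claim 3.
    """
    qw, qx, qy, qz = q
    gx, gy, gz = gyro

    # Quaternion rate from the gyroscope: 0.5 * q (x) (0, gx, gy, gz).
    qdot = 0.5 * np.array([
        -qx * gx - qy * gy - qz * gz,
         qw * gx + qy * gz - qz * gy,
         qw * gy - qx * gz + qz * gx,
         qw * gz + qx * gy - qy * gx,
    ])

    a = np.asarray(accel, dtype=float)
    if np.linalg.norm(a) > 0:
        ax, ay, az = a / np.linalg.norm(a)
        # Objective: mismatch between measured gravity and the gravity
        # direction predicted by the current orientation estimate.
        f = np.array([
            2.0 * (qx * qz - qw * qy) - ax,
            2.0 * (qw * qx + qy * qz) - ay,
            2.0 * (0.5 - qx * qx - qy * qy) - az,
        ])
        J = np.array([  # Jacobian of f with respect to q
            [-2 * qy,  2 * qz, -2 * qw, 2 * qx],
            [ 2 * qx,  2 * qw,  2 * qz, 2 * qy],
            [ 0.0,    -4 * qx, -4 * qy, 0.0],
        ])
        grad = J.T @ f
        if np.linalg.norm(grad) > 0:
            # beta scales the orientation correction; keeping it small
            # is what biases the filter toward the gyroscope.
            qdot -= beta * grad / np.linalg.norm(grad)

    q = q + qdot * dt
    return q / np.linalg.norm(q)

# Example: integrate a slow pitch rotation at 100 Hz from identity.
q = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(100):
    q = madgwick_imu_update(q, gyro=[0.0, 0.1, 0.0],
                            accel=[0.0, 0.0, 9.81], dt=0.01)
print(q)
```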

Prosecution Timeline

Jun 19, 2025: Application Filed
Feb 20, 2026: Non-Final Rejection under §102, §103, and nonstatutory double patenting (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596452: ELECTRONIC DEVICE, granted Apr 07, 2026 (2y 5m to grant)
Patent 12596457: DISPLAY DEVICE AND INSPECTING METHOD THEREOF, granted Apr 07, 2026 (2y 5m to grant)
Patent 12591340: MICRO-LED TOUCH DISPLAY DEVICE, granted Mar 31, 2026 (2y 5m to grant)
Patent 12591344: TOUCH PANEL, ELECTRONIC DEVICE, AND TOUCH SYSTEM, granted Mar 31, 2026 (2y 5m to grant)
Patent 12591328: ELECTRONIC DEVICE INCLUDING DISPLAY INCLUDING TOUCH CIRCUIT THAT PROCESSES CONTACT OF EXTERNAL OBJECT, granted Mar 31, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 83%
With Interview: 86% (+2.7%)
Median Time to Grant: 1y 11m
PTA Risk: Low
Based on 581 resolved cases by this examiner. Grant probability derived from career allow rate.
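The derivation note above invites a quick sanity check. The snippet below reproduces the headline figures, assuming the page simply adds the interview lift to the rounded career allow rate; the tool's actual model may differ.

```python
# Quick check of the headline figures (assumed derivation, not the
# tool's documented model).
granted, resolved = 481, 581
allow_rate = granted / resolved                       # 0.8279..., shown as 83%
base_pct = round(allow_rate * 100)                    # 83
interview_lift_pp = 2.7                               # "+2.7%" lift, in points
with_interview = round(base_pct + interview_lift_pp)  # 85.7 -> 86
print(base_pct, with_interview)                       # 83 86
```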

Free tier: 3 strategy analyses per month