Prosecution Insights
Last updated: April 19, 2026
Application No. 17/893,045

TRACKING UNIT WITH ALIGNMENT CAPABILITY

Current OA: Non-Final — §102
Filed: Aug 22, 2022
Examiner: DAVIS, DAVID DONALD
Art Unit: 2627
Tech Center: 2600 — Communications
Assignee: Hendrix Jennifer
OA Round: 5 (Non-Final)

Grant Probability: 70% (Favorable)
Expected OA Rounds: 5-6
Time to Grant: 3y 2m
With Interview: 79%

Examiner Intelligence

Career Allow Rate: 70% — above average (631 granted / 900 resolved; +8.1% vs TC avg)
Interview Lift: +9.1% — moderate lift across resolved cases with interview
Typical Timeline: 3y 2m avg prosecution; 41 currently pending
Career History: 941 total applications across all art units
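As a sanity check, the headline percentages follow directly from the raw counts above. The additive interview adjustment below is an assumption about how the dashboard combines the two figures, not its published model:

```python
# Figures taken from the examiner statistics above.
granted, resolved = 631, 900
career_allow_rate = granted / resolved      # shown on the dashboard as 70%
interview_lift = 0.091                      # +9.1 percentage points (assumed additive)
with_interview = career_allow_rate + interview_lift

print(f"Career allow rate: {career_allow_rate:.1%}")   # 70.1%
print(f"With interview:    {with_interview:.1%}")      # 79.2%
```

The result rounds to the 70% / 79% pair shown in the projections, consistent with a simple additive lift on the career allow rate.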

Statute-Specific Performance

§101: 1.2% (-38.8% vs TC avg)
§103: 41.6% (+1.6% vs TC avg)
§102: 40.8% (+0.8% vs TC avg)
§112: 10.6% (-29.4% vs TC avg)

Tech Center averages are estimates. Based on career data from 900 resolved cases.

Office Action

§102
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on October 7, 2025 has been entered.

Drawings

The drawings were received on January 3, 2025. These drawings are approved.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1 and 2 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Balan (US 2017/0357332).

As per claim 1, Balan et al. depicts in figure 7 and discloses: A method of orienting an IMU 44 to an object 10, comprising:

coupling the IMU 44 to the object 10 in known alignment with the object { [0046] From the accelerometer and gyroscope, the IMU 44 can detect the orientation of the Controller 40, but only with three degrees of freedom, namely, pitch (elevation angle), yaw (azimuth angle) and roll (rotation). Because the accelerometer can detect the gravity vector, the vertical axis of the frame of reference of the Controller 40 is easily identified and aligned. & [0047] In that case, the frame of reference of the Controller 40 will need to be aligned with or calibrated to the HMD's frame of reference, as discussed in more detail below. };

orienting the object 10 in a pre-selected orientation to determine a reference orientation { [0046] If the IMU 44 also includes a magnetometer, then magnetic north can readily be identified and the frame of reference of the Controller 40 can be north aligned. If both the IMU of the HMD 10 and the IMU 44 of the Controller 40 include a magnetometer, then the frame of reference of the Controller 40 will automatically be aligned with the HMD's frame of reference (subject to some minor variations/offset and drift, which can be corrected over time). };

activating an input device 40 { figures 6 & 7 } in communication with the IMU 44 { figure 7 } to initiate the IMU 44 to process the initial orientation of the IMU 44 { [0067] For a manipulation gesture, the azimuth offset is calculated at the time of the button press by aligning the IMU forward vector with the vector between the hand and the cursor and is maintained constant throughout the manipulation gesture, when the button is released. };

and calculating an orientation offset between the initial orientation of the IMU 44 and the object 10 with a first data frame, the orientation offset shown through the difference between the initial orientation of the IMU and a referenced orientation when coupled to the object { [0067] For example, one way to determine the azimuth offset and calibrate the Controller to the HMD's frame of reference is to have the user point at a virtual object and calculate the azimuth delta between the HMD's frame of reference and the Controller's frame of reference. Alternatively, a coarse estimate of the hand orientation could also be used to initially estimate the azimuth offset and update it gradually over time using a moving average approach. Such a coarse estimate could be based on the segment between lower arm centroid and palm centroid provided by a hand tracking pipeline. };

wherein the IMU 44 applies the offset to subsequent frames of data reporting on the orientation of the object 10 rather than the orientation of the IMU 44 { [0076] Once the association is established, the process continues to step 118 and the location data derived from the optical sensors of the HMD and the orientation data derived from the IMU of the Controller are fused, thereby recovering 6DOF in relation to the Controller. };

wherein the initial orientation of the IMU 44 is a pre-selected orientation into the IMU 44. { [0049] For example, the location 30 may be calculated based on a predetermined distance and orientation relative to the user 26, such as being two feet in front of the user 26 as one specific example. }

As per claim 2, Balan et al. depicts in figure 7 and discloses: A method of orienting an IMU 44 to an object 10, comprising:

obtaining an IMU 44 and locating in an arbitrary orientation to the object { [0047] If the IMU 44 of the Controller 40 does not include a magnetometer, then the IMU 44 arbitrarily assigns an x-axis when it powers up and then continuously tracks azimuth changes (angular rotation in the horizontal plane) from that initial frame of reference. };

activating an input device 40 { figures 6 & 7 } a first time to initiate the IMU 44 to process an initial orientation of the IMU 44, the input device 40 being in communication with the IMU 44 { figure 7 };

coupling the IMU 44 to the object 10 in a second orientation;

and activating the input device 40 a second time to initiate the IMU 44 to capture an orientation offset when coupled to the object 10 with a first data frame, the offset shown through the difference between the initial orientation of the IMU and a referenced orientation when coupled to the object { [0067] For example, one way to determine the azimuth offset and calibrate the Controller to the HMD's frame of reference is to have the user point at a virtual object and calculate the azimuth delta between the HMD's frame of reference and the Controller's frame of reference. Alternatively, a coarse estimate of the hand orientation could also be used to initially estimate the azimuth offset and update it gradually over time using a moving average approach. Such a coarse estimate could be based on the segment between lower arm centroid and palm centroid provided by a hand tracking pipeline. };

wherein the IMU 44 applies the orientation offset to subsequent frames of data reporting on the orientation of the object 10 rather than the orientation of the IMU 44 { [0076] Once the association is established, the process continues to step 118 and the location data derived from the optical sensors of the HMD and the orientation data derived from the IMU of the Controller are fused, thereby recovering 6DOF in relation to the Controller. };

wherein the initial orientation of the IMU 44 is a pre-selected orientation into the IMU 44. { [0049] For example, the location 30 may be calculated based on a predetermined distance and orientation relative to the user 26, such as being two feet in front of the user 26 as one specific example. }

Response to Arguments

Applicant's arguments filed October 7, 2025 have been fully considered but they are not persuasive. Applicant asserts in the last two paragraphs on page 4 the following:

"Claims 1 and 2 are hereby amended to clarify the orientations and in particular, the orientation offset. Claim 1 is focused on the second optional method described in the specification wherein the IMU is held in the same physical orientation as the object, being a known alignment. Claim 2 is focused on the third optional method described in the specification wherein the IMU is located on the object in an arbitrary orientation. For at least these reasons, the Applicant submits that Claims 1-2 is/are not anticipated by Balan. The Applicant submits that the remarks set forth above in regard to the Claim(s) overcome the Examiner's rejections under 35 U.S.C. § 102 and is/are now in condition for allowance. Therefore, the Applicant respectfully requests that the Claim(s) be allowed."

As stated supra, with respect to claim 1 Balan et al. discloses in [0046] the following: "From the accelerometer and gyroscope, the IMU 44 can detect the orientation of the Controller 40, but only with three degrees of freedom, namely, pitch (elevation angle), yaw (azimuth angle) and roll (rotation). Because the accelerometer can detect the gravity vector, the vertical axis of the frame of reference of the Controller 40 is easily identified and aligned."

In [0047] Balan et al. discloses the following: "In that case, the frame of reference of the Controller 40 will need to be aligned with or calibrated to the HMD's frame of reference, as discussed in more detail below."

As stated supra, with respect to claim 2 Balan et al. discloses in [0047] the following: "If the IMU 44 of the Controller 40 does not include a magnetometer, then the IMU 44 arbitrarily assigns an x-axis when it powers up and then continuously tracks azimuth changes (angular rotation in the horizontal plane) from that initial frame of reference."

Therefore, contrary to applicant's assertion, the applied prior art discloses the amended claimed invention.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAVID D DAVIS whose telephone number is (571)272-7572. The examiner can normally be reached Monday - Friday, 8 a.m. - 4 p.m. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Ke Xiao, can be reached on 571-272-7776. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DAVID D DAVIS/
Primary Examiner, Art Unit 2627
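The calibration scheme at issue in both claims — capture the IMU's orientation at a button press, compute its offset from a reference orientation, then apply that offset to every later data frame so the system reports the object's orientation rather than the IMU's — can be sketched with unit quaternions. This is an illustrative reconstruction of that reading of the claims, not code from the application or from Balan; all function names are hypothetical:

```python
import math

def q_mul(a, b):
    """Hamilton product of two (w, x, y, z) quaternions."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def q_conj(q):
    """For a unit quaternion, the inverse is the conjugate."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def capture_offset(q_imu_initial, q_reference):
    """On activation (button press): rotation that maps the IMU's
    initial orientation onto the object's reference orientation."""
    return q_mul(q_reference, q_conj(q_imu_initial))

def apply_offset(offset, q_imu_frame):
    """On each subsequent frame: report the object's orientation,
    not the IMU's own orientation."""
    return q_mul(offset, q_imu_frame)

# Example: the IMU powers up yawed 90 degrees about z (an arbitrary
# initial frame, as in claim 2); the reference orientation is identity.
half = math.sqrt(0.5)
q_initial = (half, 0.0, 0.0, half)   # 90-degree yaw
q_ref = (1.0, 0.0, 0.0, 0.0)         # pre-selected reference orientation
offset = capture_offset(q_initial, q_ref)
corrected = apply_offset(offset, q_initial)
print(corrected)  # approximately (1, 0, 0, 0): first corrected frame matches the reference
```

The same two functions cover both claimed variants: in the known-alignment case (claim 1) the offset is computed once against the pre-selected orientation; in the arbitrary-orientation case (claim 2) it is captured between the two activations and then held constant.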

Prosecution Timeline

Aug 22, 2022
Application Filed
Jan 26, 2023
Non-Final Rejection — §102
Aug 01, 2023
Response Filed
Nov 02, 2023
Final Rejection — §102
May 07, 2024
Request for Continued Examination
May 09, 2024
Response after Non-Final Action
Jun 28, 2024
Non-Final Rejection — §102
Jan 03, 2025
Response Filed
Apr 02, 2025
Final Rejection — §102
Oct 07, 2025
Request for Continued Examination
Oct 10, 2025
Response after Non-Final Action
Oct 29, 2025
Non-Final Rejection — §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602106
Ambience-Driven User Experience
2y 5m to grant · Granted Apr 14, 2026
Patent 12602128
DISPLAY DEVICE HAVING PIXEL DRIVE CIRCUITS AND SENSOR DRIVE CIRCUITS
2y 5m to grant · Granted Apr 14, 2026
Patent 12602121
TOUCH DEVICE FOR PASSIVE RESONANT STYLUS, DRIVING METHOD FOR THE SAME AND TOUCH SYSTEM
2y 5m to grant · Granted Apr 14, 2026
Patent 12596265
Aiming Device with a Diffractive Optical Element and Reflective Image Combiner
2y 5m to grant · Granted Apr 07, 2026
Patent 12592178
Display Device Including an Electrostatic Discharge Circuit for Discharging Static Electricity
2y 5m to grant · Granted Mar 31, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 70%
With Interview: 79% (+9.1%)
Median Time to Grant: 3y 2m
PTA Risk: High

Based on 900 resolved cases by this examiner. Grant probability derived from career allow rate.
