DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. It is responsive to the submission dated 02/04/2026. Claims 1-20 are presented for examination, of which claims 1, 10, and 19 are independent claims.
Response to Arguments
2. Applicant's arguments, see pages 6-8, filed 02/04/2026, with respect to the rejection(s) of claim(s) 1-20 under 35 U.S.C. 102(a)(1) have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of the newly found reference to Busey (US 2024/0153119).
Claim Rejections - 35 USC § 103
3. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
4. Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Nichols et al. (US 2020/0326430) in view of Busey (US 2024/0153119).
Considering claim 1, Nichols discloses a method of calibrating a device (see fig. 2A) having a camera (116), an antenna (110), an electronic distance measurement (EDM) device (146), and a display (156, fig. 1; see abstract and para. 5), the method comprising:
capturing, at a first time, a first distance to a first point using the EDM device and a first camera image containing the first point using the camera (for example, Nichols discloses an augmented reality (AR) device 100 comprising a camera component 101 attached to a sensor component 102, wherein camera component 101 may include a camera 116 for capturing a camera image 118 and the sensor component 102 includes an electronic distance measurement (EDM) device 146 for measuring distances to discrete points within the field of view of camera 116. In some embodiments, EDM device 146 is a lidar device that transmits pulsed laser light towards a point of interest and measures the reflected pulses with a sensor. The distance between the lidar device and the point of interest is estimated based on the return time or on phase measurements of the transmitted light; an illustrative sketch of this time-of-flight relationship follows the claim mapping below. See paras. 68-69 and paras. 106-109);
capturing, at a second time (e.g., in response to a user input), a second distance to a second point (e.g., a point based on a changed linear position, velocity, or angular position) using the EDM device and a second camera image (118, fig. 5) containing the second point using the camera (for example, Nichols discloses an augmented reality (AR) device 100 comprising a camera component 101 attached to a sensor component 102, wherein camera component 101 may include a camera 116 for capturing a camera image 118 and the sensor component 102 includes an electronic distance measurement (EDM) device 146 for measuring distances to discrete points within the field of view of camera 116. In some embodiments, EDM device 146 may generate a distance map comprising a plurality of detected distances and the relative orientation for each distance. See paras. 68-69. In some embodiments, camera component 101 includes an input device 120 for receiving user input 122 and generating user input data 124 based on the user input. User input 122 may indicate a point of interest (by, for example, moving a cursor being displayed on display 156 so as to indicate the point of interest) for which a GNSS coordinate is to be calculated. In some embodiments, camera component 101 includes a camera 116 for generating one or more camera images 118. Camera images 118 may include a single image, multiple images, or a stream of images (e.g., a video), among other possibilities. For example, data processor 138 may analyze one or more camera images 118 to supplement orientation calculations based on angle data 128 or position calculations based on GNSS position data 136. See paras. 76-79. See also para. 104);
determining first and second 2D screen coordinates for the first and second points on the display using the first and second camera images (for example, Nichols discloses calculating a horizontal distance between the geospatial position of the EDM device and the geospatial position of the point of interest based on the vertical or horizontal angle and the vertical and horizontal distances to the point of interest. See paras. 112-114 and 120. Nichols further teaches that camera component 101 includes a camera 116 for generating multiple camera images 118 or a stream of images for comparison and for calculating a change in linear position between points of interest in the camera images. See paras. 76-79. In addition, Nichols teaches that between the first time and the second time, additional points may be determined for both image frames using two thick solid lines (see para. 104); that the AR device may receive or generate model data that may include the geospatial positions of the 3D model so that the 3D model may be rendered in 2D onto model image 152 based on the orientation angles of camera 116 with respect to horizontal and vertical reference lines (see para. 121); and that a distance D is calculated between a geospatial position 1800 and a point P. Any point within a view frustum 1804 can be projected back onto a projection plane 1802 to generate a position (x,y) within projection plane 1802. The projection plane can then be converted into an image. In the illustrated embodiment, the image is set to be 320×240 pixels, and the value of distance D is encoded into the image at the position at which distance D intersects projection plane 1802. These steps are repeated for every detected distance value, thereby building up a distance map using every pixel in a scan. See paras. 129-130. As such, it is submitted that the Nichols teachings encompass determining first and second 2D screen coordinates for the first and second points on the display using the first and second camera images);
computing first and second 3D surface points for the first and second points based on the first and second 2D screen coordinates and the first and second distances (for example, Nichols discloses a technique for correcting the world distance map so that each of the reference points is changed from the respective geospatial position of EDM device 146 to the respective geospatial position of camera 116, by using the third offset vector (X.sub.O3,Y.sub.O3,Z.sub.O3) to transform the world distance map to be referenced from the geospatial position of camera 116. In some embodiments, the geospatial positions of the plurality of points of the world distance map are calculated and the corrected world distance map is generated based on the calculated geospatial positions and the geospatial position of camera 116. See paras. 124 and 131. Specifically, Nichols discloses a receiver that generates and outputs GNSS position data 738 comprising a plurality of GNSS points. Each of the plurality of GNSS points may be a 3D coordinate represented by three numbers, wherein the three numbers may correspond to latitude, longitude, and elevation/altitude. In other embodiments, the three numbers may correspond to X, Y, and Z positions. See para. 95. By representing each geospatial position point as a 3D coordinate based on camera orientation, Nichols' teachings encompass computing first and second 3D surface points for the first and second points based on the first and second 2D screen coordinates and the first and second distances, as claimed; an illustrative projection/back-projection sketch follows the claim mapping below);
computing a position of the EDM device using a 3D vector formed between the first and second 3D surface points (for example, Nichols discloses calculating an offset vector extending between the geospatial position of the GNSS receiver and the geospatial position of the camera based on a known relationship between the orientation of the camera and the offset vector, wherein a magnitude of the offset vector is known, and calculating the geospatial position of the camera by modifying the geospatial position of the GNSS receiver with the offset vector. See paras. 30 and 43. A first offset vector (X.sub.O1,Y.sub.O1,Z.sub.O1) is defined as the vector extending between the position of EDM device 146 (X.sub.EP,Y.sub.EP,Z.sub.EP) and the position of GNSS receiver 110 (X.sub.RP,Y.sub.RP,Z.sub.RP). Knowledge of the first offset vector (X.sub.O1,Y.sub.O1,Z.sub.O1) and either the position of EDM device 146 (X.sub.EP,Y.sub.EP,Z.sub.EP) or the position of GNSS receiver 110 (X.sub.RP,Y.sub.RP,Z.sub.RP) can be used to find the unknown position. In some embodiments, the relationship between (e.g., the angle formed by) the first offset vector (X.sub.O1,Y.sub.O1,Z.sub.O1) and the orientation of EDM device 146 is known and may be utilized such that knowledge of the orientation of EDM device 146 and either the position of EDM device 146 (X.sub.EP,Y.sub.EP,Z.sub.EP) or the position of GNSS receiver 110 (X.sub.RP,Y.sub.RP,Z.sub.RP) can be used to find the unknown position. A second offset vector (X.sub.O2,Y.sub.O2,Z.sub.O2) is defined as the vector extending between the position of the GNSS receiver and the position of camera 116 (X.sub.CP,Y.sub.CP,Z.sub.CP), and a third offset vector is defined as the vector extending between the position of the EDM device and the position of camera 116. Because each of the offset vectors is connected to the other two offset vectors, knowledge of any two of the offset vectors can be used to find the unknown offset vector. See paras. 72-74. Furthermore, Nichols discloses that camera orientation data 174 includes a 3D vector representing the orientation of camera 116 at a particular time. See para. 81); and
computing a camera-to-antenna offset based on the position of the EDM device and a known antenna-to-EDM offset (for example, Nichols discloses that the geospatial position of EDM device 146 is calculated based on the geospatial position of GNSS receiver 110 and includes calculating an offset vector extending between the geospatial position of GNSS receiver 110 and the geospatial position of EDM device 146 based on a known relationship between the orientation of EDM device 146 and the offset vector. The known relationship may be an offset angle (e.g., 1 degree, 2 degrees, 5 degrees, 10 degrees, etc.). The magnitude (e.g., length) of the offset vector may be constant and may be determined upon manufacture of sensor component 102. The geospatial position of EDM device 146 may be calculated by modifying the geospatial position of GNSS receiver 110 with the offset vector. For example, the geospatial position of EDM device 146 may be calculated by summing the geospatial position of GNSS receiver 110 with the offset vector. The geospatial position of the point of interest is calculated based on the geospatial position of EDM device 146, the orientation of EDM device 146, and the distance to the point of interest. See paras. 111-112. The GNSS receiver 110 includes antenna 716 to receive signals used to estimate camera position points to be compared to the known position and correction data. The correction data may include a 3D offset amount and/or any one of various types of raw or processed satellite data that may be used to improve the accuracy of a position estimate. See paras. 90-91. In some embodiments, calculating the geospatial position of camera 116 includes calculating an offset vector extending between the geospatial position of GNSS receiver 110 and the geospatial position of camera 116 based on a known relationship between the orientation of camera 116 and the offset vector. The known relationship may be an offset angle (e.g., 1 degree, 2 degrees, 5 degrees, 10 degrees, etc.). See para. 119. Thus, according to Nichols, since the antenna-to-EDM offset is known, determining the camera-to-EDM offset is equivalent to determining the camera-to-antenna offset; an illustrative offset-composition sketch follows the claim mapping below. As such, Nichols discloses computing a camera-to-antenna offset based on the position of the EDM device and a known antenna-to-EDM offset, as claimed).
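Examiner's note: the following sketch is offered for illustration only and is not part of the Nichols disclosure; all names and values in it are hypothetical. It shows the time-of-flight relationship underlying the lidar embodiment of paras. 106-109, in which the distance to a point of interest is half the round-trip travel distance of a reflected pulse.

    # Illustrative sketch only (not Nichols' code); names are hypothetical.
    SPEED_OF_LIGHT = 299_792_458.0  # meters per second

    def tof_distance(round_trip_seconds: float) -> float:
        """Estimate the distance to a reflecting point from the round-trip
        time of a transmitted pulse; the pulse travels out and back, hence
        the division by two."""
        return SPEED_OF_LIGHT * round_trip_seconds / 2.0

    # A pulse returning after roughly 66.7 nanoseconds implies about 10 m.
    print(tof_distance(66.7e-9))  # ~10.0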
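Examiner's note: the following sketch is likewise illustrative only; the intrinsic parameters (focal lengths and principal point) are assumed values, not taken from Nichols. Under a simple pinhole model it shows how a 3D point projects to a 2D screen coordinate on a 320×240 projection plane (cf. paras. 129-130), and how a 2D screen coordinate plus a measured EDM distance can be back-projected to a 3D surface point.

    # Illustrative sketch only; intrinsics are assumed, not from Nichols.
    import numpy as np

    WIDTH, HEIGHT = 320, 240              # image size from Nichols' example
    FX = FY = 300.0                       # assumed focal lengths, in pixels
    CX, CY = WIDTH / 2.0, HEIGHT / 2.0    # assumed principal point

    def project_to_screen(point_cam: np.ndarray) -> tuple[float, float]:
        """Pinhole projection of a camera-frame 3D point (x, y, z), z > 0,
        to a 2D screen coordinate (u, v) on the projection plane."""
        x, y, z = point_cam
        return (FX * x / z + CX, FY * y / z + CY)

    def unproject(u: float, v: float, distance: float) -> np.ndarray:
        """Back-project a 2D screen coordinate plus a measured EDM range:
        form the viewing ray through the pixel, normalize it, and scale it
        by the measured distance to obtain a camera-frame 3D surface point."""
        ray = np.array([(u - CX) / FX, (v - CY) / FY, 1.0])
        return distance * ray / np.linalg.norm(ray)

    u, v = project_to_screen(np.array([1.0, 0.0, 10.0]))  # -> (190.0, 120.0)
    print(unproject(u, v, 10.05))                         # ~[1.0, 0.0, 10.0]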
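Examiner's note: the following sketch is also illustrative only; the offset magnitudes and positions are hypothetical, and all vectors are assumed to be expressed in one common frame after accounting for orientation. It shows the closed-triangle relationship among the three offset vectors of paras. 72-74 (any two determine the third) and the composition of a camera-to-antenna offset from a computed EDM position and a known antenna-to-EDM offset.

    # Illustrative sketch only; offsets and positions are hypothetical values.
    import numpy as np

    # Assumed GNSS receiver (antenna phase center) position.
    gnss_pos = np.array([100.00, 200.00, 50.00])

    # First offset vector: EDM device -> GNSS receiver (cf. paras. 72-74).
    o1 = np.array([0.00, 0.00, 0.20])
    # Second offset vector: GNSS receiver -> camera.
    o2 = np.array([0.05, 0.00, -0.10])

    edm_pos = gnss_pos - o1   # modify the receiver position by the offset
    cam_pos = gnss_pos + o2
    # Third offset vector (EDM -> camera) follows from the other two,
    # because the three vectors form a closed triangle.
    o3 = o1 + o2
    assert np.allclose(edm_pos + o3, cam_pos)

    # Camera-to-antenna offset from the EDM position and the known
    # antenna-to-EDM offset: camera->antenna = camera->EDM + EDM->antenna.
    antenna_to_edm = -o1                    # assumed factory calibration
    camera_to_antenna = (edm_pos - cam_pos) - antenna_to_edm
    assert np.allclose(cam_pos + camera_to_antenna, gnss_pos)
    print(camera_to_antenna)                # [-0.05  0.    0.1 ]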
Although Nichols discloses most claimed features of the invention, Nichols fails to particularly teach computing the EDM position relative to a camera using 3D vectors from surface points.
However, Busey discloses computing an EDM position relative to a camera using 3D vectors from surface points (e.g., using data collected by sensor device 116 to determine the position and orientation of an inertial measurement unit (IMU) based on location-based data provided by a global positioning system (GPS), wherein interactive frameworks of the OS of device 116 are used to acquire 3D vectors from data collected by one of the device's sensors to other UWB transmitters in range and the relative pose of sensor device 116. See para. 46).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine the camera-to-antenna offset determining scheme of Nichols with the computed position of the measuring device relative to a camera using 3D vectors from surface points, as disclosed by Busey, in order to use the determined position of the distance measuring device to perform dense or semi-dense visual SLAM operations to recognize, segment, and analyze visual or other information about the environment of the camera. See paras. 48-49 of Busey.
As per claim 2, Nichols, as modified by Busey, discloses that the first and second 3D surface points are computed further based on intrinsic parameters of the camera. See paras. 95 and 124-131.
As per claim 3, Nichols, as modified by Busey, discloses that the known antenna-to-EDM offset comprises a 3D vector between a phase center of the antenna and the position of the EDM device, wherein the camera-to-antenna offset comprises a 3D vector between a position of the camera and the phase center of the antenna. See paras. 71-74.
As per claim 4, Nichols, as modified by Busey, discloses that determining the first and second 2D screen coordinates for the first and second points on the display includes: receiving a user input identifying the first and second 2D screen coordinates; or analyzing the first and second camera images to automatically identify the first and second 2D screen coordinates. See paras. 107 and 129-131.
As per claim 5, Nichols, as modified by Busey, discloses displaying the first and second camera images on the display. See paras. 68 and 79.
As per claim 6, Nichols, as modified by Busey, discloses displaying a model image on the display using the camera-to-antenna offset. See paras. 101-103.
As per claim 7, Nichols, as modified by Busey, discloses that the first point is positioned at a first surface and the second point is positioned at a second surface, or the first point and the second point are positioned at a same surface. See paras. 124-131.
As per claim 8, Nichols, as modified by Busey, discloses that the device is an augmented reality (AR) device. See para. 6.
As per claim 9, Nichols, as modified by Busey, discloses that the device comprises (i) a camera component including the camera and the display and (ii) a sensor component including the antenna and the EDM device, and wherein the camera component is separable from and configured to removably attach to the sensor component. See paras. 15 and 29.
The invention of claim 10 contains features that correspond in scope with the limitations recited in claim 1. As the limitations of claim 1 were found obvious over the combined teachings of Nichols and Busey, it is readily apparent that the applied prior art teaches the underlying elements. The limitations of claim 10 are, therefore, rejected under the same rationale as claim 1.
Claim 11 is rejected under the same rationale as claim 2.
Claim 12 is rejected under the same rationale as claim 3.
Claim 13 is rejected under the same rationale as claim 4.
Claim 14 is rejected under the same rationale as claim 5.
Claim 15 is rejected under the same rationale as claim 6.
Claim 16 is rejected under the same rationale as claim 7.
Claim 17 is rejected under the same rationale as claim 8.
Claim 18 is rejected under the same rationale as claim 9.
The subject matter of independent claim 19 corresponds, in terms of a computer readable medium, to that of independent method claim 1, and the rationale set forth above to reject the latter also applies, mutatis mutandis, to the former.
Claim 20 is rejected under the same rationale as claim 6.
Conclusion
5. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
6. Any inquiry concerning this communication or earlier communications from the examiner should be directed to WESNER SAJOUS whose telephone number is (571) 272-7791. The examiner can normally be reached Monday-Friday, 10:00 AM to 7:30 PM (ET).
Examiner interviews are available via telephone and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice or email the Examiner directly at wesner.sajous@uspto.gov.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Said Broome, can be reached at 571-272-2931. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov.
To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/WESNER SAJOUS/Primary Examiner, Art Unit 2612
WS
02/27/2026