Prosecution Insights
Last updated: April 19, 2026
Application No. 17/875,222

REAL-TIME THERMAL CAMERA BASED ODOMETRY AND NAVIGATION SYSTEMS AND METHODS

Final Rejection: §103, §112

Filed: Jul 27, 2022
Examiner: CHEN, CHIA-LING
Art Unit: 3645
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Teledyne Flir LLC
OA Round: 2 (Final)

Grant Probability: 46% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 4y 1m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 46% of resolved cases (12 granted / 26 resolved; -5.8% vs TC avg)
Interview Lift: strong, +63.6% across resolved cases with interview
Typical Timeline: 4y 1m average prosecution; 31 applications currently pending
Career History: 57 total applications across all art units
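The percentages above are simple ratios over the examiner's resolved cases. The 12/26 split, the +63.6% lift, and the 99% with-interview figure come from the page; the with/without-interview breakdown below is purely hypothetical, assuming lift is defined as the difference between the with-interview and without-interview allow rates. A minimal sketch of that arithmetic:

```python
# Allow-rate arithmetic behind the examiner cards above.
# Only the 12 granted / 26 resolved split comes from the page; the
# interview breakdown is hypothetical (the page reports +63.6% from
# the real breakdown, which is not shown here).

granted, resolved = 12, 26
career_allow_rate = granted / resolved            # 0.4615... -> "46%"

with_int_granted, with_int_resolved = 7, 8        # hypothetical counts
no_int_granted = granted - with_int_granted       # 5
no_int_resolved = resolved - with_int_resolved    # 18

lift = (with_int_granted / with_int_resolved
        - no_int_granted / no_int_resolved)       # with-rate minus without-rate

print(f"career allow rate: {career_allow_rate:.1%}")
print(f"interview lift:    {lift:+.1%}")          # +59.7% under these counts
```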

Statute-Specific Performance

§101:  1.4%  (-38.6% vs TC avg)
§103: 60.5%  (+20.5% vs TC avg)
§102: 15.7%  (-24.3% vs TC avg)
§112: 17.1%  (-22.9% vs TC avg)
Tech Center averages are estimates, based on career data from 26 resolved cases.

Office Action

§103, §112
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Response to Amendment

The following addresses applicant's remarks/amendments dated January 9, 2026. Claims 1-6, 8, 11-15 and 18 were amended; no claims were cancelled; no new claims were added. Therefore, claims 1-20 are pending in the current application and are addressed below.

Response to Arguments

Applicant's arguments filed January 9, 2026 have been fully considered but are not persuasive. Applicant's arguments with respect to claims 1-20 are moot because they do not apply to the specific combination of references being used in the current rejection. In response to applicant's argument that the references fail to show certain features of applicant's invention, it is noted that the features upon which applicant relies (i.e., "characterizing a dilation or contraction…..the thermal imaging module" and "removing a contribution…..the estimated relative velocity") are not recited in the rejected claims. Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993). Here, applicant argues that Whitley and Delaune do not teach, suggest, or provide any reason for the dilation/contraction features of amended claim 1. However, these claim limitations were not present in the original independent claims and were presented by amendment on January 9, 2026. Therefore, the issue of whether Whitley and Delaune address these limitations is not relevant. The amended claims containing the new limitations have been addressed by Seydoux in the present Office Action.

Applicant's arguments (see pages 15-16, filed January 9, 2026) with respect to the rejections of claims 6 and 7 have been fully considered but are not persuasive. Applicant argued that it is unclear why the absence of common points in the rangefinder data would necessarily (inherently) imply no common points in the images, as recited in claims 6 and 7. However, Whitley disclosed (pages 53-54) that, to determine the standoff distance between the airframe and the bridge deck, a fixed-gain Kalman filter was implemented to fuse the rangefinder readings with climb rate information. The purpose of this filter is to smooth the noisy rangefinder data caused by sensor noise and discrete jumps in range as the aircraft traverses under the bridge. It is clear that when a discrete jump happens, the sequential images will have no common points. Thus, Whitley covers the limitation of "determining no common points of interest exist in the first and second thermal images."

Applicant's arguments (see page 16, filed January 9, 2026) with respect to the rejection of claim 9 have been fully considered but are not persuasive. Applicant argued that Whitley's equation 2-27 defined the velocity Vf/o of the camera focal point in the global coordinate system NxNyNz (Whitley, pages 58-59).
In contrast, the "depth map" of claim 9 pertains to the range from the focal point to a point in the scene being imaged, which is not defined by the global coordinate system; claim 9 recites a "depth map corresponding to a field of view of the thermal imaging module," and Whitley's velocity of the focal point is unrelated to the field of view of the imaging module. However, the first and second thermal images are taken by the thermal imaging module (thermal imaging camera), and the depth map is calculated based on the first and second thermal images. Thus, the depth map necessarily corresponds to a field of view of the thermal imaging module, whichever coordinate system is used for calculating the depth map.

Examiner also notes that applicant rewrote claim 6 as an independent claim. However, the related method claim 16 was not rewritten in independent form. Examiner is not sure whether applicant intended method claim 16 to depend from claim 11 or whether this is a mistake, and would like to remind the applicant.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-5 and 7-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Regarding claim 1, last sentence, "removing a contribution of the dilation or contraction from the optical flow rate in obtaining the estimated relative velocity" is indefinite. Although "characterizing a dilation or contraction of a field of the common points of interest indicative of motion along the optical axis of the thermal imaging module" is mentioned in operation (A), there is no indication that this characterizing is performed in operation (B). Therefore, in operation (B), "removing a contribution of … in obtaining the estimated relative velocity" is indefinite.

Regarding claim 11, last sentence, the same limitation is indefinite for the same reason: the characterizing of a dilation or contraction is mentioned in operation (A), but there is no indication that it is performed in operation (B). Therefore, in operation (B), "removing a contribution of … in obtaining the estimated relative velocity" is indefinite.

Other claims are rejected due to claim dependency.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3, 7, 9-13, 16-17 and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Whitley ("Small unmanned aircraft systems for urban structural inspections", January 2018, XP 055852823, hereinafter "Whitley"), modified in view of Seydoux et al. (US 20110049290 A1, hereinafter "Seydoux"), in view of Delaune et al. ("Thermal-Inertial Odometry for Autonomous Flight Throughout the Night", 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE, November 3, 2019, hereinafter "Delaune").

Regarding claim 1, Whitley teaches a thermal imaging module centered about an optical axis, with a position fixed relative to an orientation of an unmanned vehicle (Whitley; Fig. 2-11, Fig. 2-13, page 47, page 57, section 2.3.4, paragraph 1: a PX4FLOW sensor (an imaging module) is installed in an unmanned vehicle at its center (equivalent to centered about an optical axis with a position fixed relative to an orientation of the unmanned vehicle); Whitley discloses the same system as claim 1 but uses a visible-light imaging module instead of a thermal imaging module); a ranging sensor system fixed relative to the imaging module (Whitley; Fig. 2-13, page 47, paragraph 1, line 6: the PX4FLOW sensor is used in conjunction with a laser rangefinder (fixed relative to the imaging module) to provide navigation information for the autopilot to use; page 58, last sentence, and page 59, first sentence: the goal of this derivation is to determine the coordinates of the camera in the global coordinate system as a function of the velocity of feature points in the camera image frame, the standoff distance previously calculated from a laser rangefinder, and attitude information from an extended-Kalman-filtered inertial measurement unit and a gyro fixed to the camera itself); and a logic device coupled to and/or integrated with the imaging module, configured to receive a first image of a scene and corresponding first ranging sensor data (Whitley; Fig. 2-16, section 2.3.4, paragraphs 1-2, and pages 58-59: the system acquires a sequence of images (equivalent to the first and second images) from the optical flow sensor and corresponding range data from the rangefinder (rp/f is obtained for an image based on the rangefinder data; see also eq. 2-25); the PX4FLOW generates the flow rate using a feature tracking algorithm that calculates the pixel flow between consecutive frames by identifying feature points and matching them between consecutive images; once these feature points are matched, the resulting displacement can be determined in the imager coordinate system, measured in pixels); receive a second image and corresponding second ranging sensor data (same as above); and determine an estimated relative velocity of the unmanned vehicle based, at least in part, on the received first and second images and ranging sensor data (Whitley; Fig. 2-17, page 63: discloses the mapping between the 2D image frame and the camera reference frame using a model based on similar triangles; page 64, last sentence: discloses that the relative velocity Vf of the UAV with respect to a point p is determined based on the optical flow, i.e., from the received two images (equations 2-46 through 2-49)).
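As context for the similar-triangles mapping cited above (Whitley, Fig. 2-17 and equations 2-46 through 2-49), here is a minimal sketch of how pixel flow, focal length, and rangefinder standoff distance combine into a relative-velocity estimate. This is a generic pinhole-camera relationship with assumed names, not Whitley's actual derivation:

```python
import numpy as np

def relative_velocity(flow_px, dt, focal_px, standoff_m):
    """Estimate camera-relative lateral velocity from optical flow.

    Pinhole similar triangles: a point at range Z moving with lateral
    velocity v projects a pixel flow of v * f / Z pixels per second,
    so v = flow_rate * Z / f.  All names here are illustrative.

    flow_px    : (N, 2) per-feature pixel displacements between frames
    dt         : time between the two frames, seconds
    focal_px   : focal length in pixels
    standoff_m : rangefinder standoff distance to the scene, meters
    """
    flow_rate = np.asarray(flow_px) / dt        # pixels per second
    mean_rate = flow_rate.mean(axis=0)          # average over features
    return mean_rate * standoff_m / focal_px    # meters per second (x, y)

# Example: 4 px of x-flow over 0.1 s, f = 400 px, 5 m standoff
v = relative_velocity(np.array([[4.0, 0.0]]), dt=0.1,
                      focal_px=400.0, standoff_m=5.0)
print(v)   # -> [0.5 0. ]  i.e. 0.5 m/s lateral motion
```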
Whitley further teaches that determining the estimated relative velocity comprises identifying one or more common points of interest in the first and second thermal images (Whitley; page 58, paragraph 1: the PX4FLOW generates the flow rate using a KLT feature tracking algorithm that calculates the pixel flow between consecutive frames (equivalent to the first and second images) by identifying feature points and matching them between consecutive images (equivalent to "for each common point of interest identified in the first and second images"); once these feature points are matched, the resulting displacement can be determined in the imager coordinate system, measured in pixels).

Whitley does not teach the acquisition and use of thermal images, or performing at least one of operations (A) or (B), wherein operation (A) comprises: characterizing a dilation or contraction of a field of the common points of interest indicative of motion along the optical axis of the thermal imaging module; and using the characterized dilation or contraction to supplement the ranging sensor data indicating the standoff distance between the thermal imaging module and a surface within a field of view of the thermal imaging module and/or to construct a depth map corresponding to the field of view of the thermal imaging module; and wherein operation (B) comprises: determining an optical flow rate based, at least in part, on a position deviation for each common point of interest identified in the first and second thermal images; and removing a contribution of the dilation or contraction from the optical flow rate in obtaining the estimated relative velocity.

Seydoux discloses in paragraph [0135] that, in order to estimate the movement in translation of the aircraft, an estimate is made initially of the movement in translation along the direction of the optical axis of the camera, making use of the distortions of shapes defined by the points of interest tracked between the images (equivalent to "characterizing a dilation or contraction of … the thermal imaging module"). Finally, the estimated vector in the frame of reference of the camera is converted into the fixed 3D frame of reference. Since the goal is to estimate the movement in translation of the aircraft, this implies that, after tracking the movement along the optical axis of the camera, the distortions of shapes would be used to estimate the movement in translation of the aircraft. Furthermore, since the range of the scene in the camera is not known, the movement in translation is estimated to within a scale factor (equivalent to "using the characterized dilation or contraction……the thermal imaging module").

It would have been obvious to one of ordinary skill in the art before the effective filing date of this invention to modify the imaging odometry system taught by Whitley to include characterizing a dilation or contraction of a field of the common points of interest indicative of motion along the optical axis of the thermal imaging module, and using the characterized dilation or contraction to supplement the ranging sensor data indicating the standoff distance between the thermal imaging module and a surface within a field of view of the thermal imaging module and/or to construct a depth map corresponding to the field of view of the thermal imaging module, as taught by Seydoux, with a reasonable expectation of success.
The reasoning for this is that an estimate is made initially of the movement in translation along the direction of the optical axis of the camera, making use of the distortions of shapes defined by the points of interest tracked between the images, and the movement in translation of the aircraft is then estimated based on the range of the scene in the camera (Seydoux; [0135]).

However, Whitley as modified in view of Seydoux still does not teach the acquisition and use of thermal images. Delaune introduced a thermal odometry scheme that applies image feature constraints to the state propagated with inertial measurements, using an infrared camera and computing the optical flow with the KLT algorithm based on the acquired thermal images. More detailed information can be seen in Sections II-B through III. It would have been obvious to one of ordinary skill in the art before the effective filing date of this invention to modify the imaging odometry system taught by Whitley to include the dilation/contraction characterization and use taught by Seydoux, and to replace the visible-light images with thermal images as taught by Delaune, with a reasonable expectation of success. The reasoning for this is that thermal cameras capture the infrared radiation emitted by all objects with a temperature above absolute zero in their FOV. Furthermore, with the increased resolution and decreased size, weight, power, and cost of these imagers, thermal-inertial odometry now appears to be a promising approach for flying at night without GPS, without the cost of carrying a lidar sensor, and without the range constraint of illumination-aided solutions (Delaune; introduction, paragraph 3).

Regarding claim 2, Whitley as modified above teaches the thermal imaging odometry system recited in claim 1, wherein the determining the estimated relative velocity comprises the operation (A), which comprises using the characterized dilation or contraction to construct a depth map corresponding to the field of view of the thermal imaging module (Seydoux; [0135]; see also the mapping in claim 1 above); and the logic device is configured to: receive a first orientation of the unmanned vehicle and/or the thermal imaging module associated with the first thermal image from an orientation sensor coupled to the unmanned vehicle and/or the thermal imaging module (Whitley; section 2.3.4, pages 62 and 69: the optical flow navigation development discloses that the orientation provided by a gyro sensor is used to obtain the absolute velocity vf/o from the relative velocity, as shown in equations 2-71 and 2-72, where Ẋ_A and Ẏ_A are based on gyro data gx and gy from equations 2-69, 2-70, and 2-36 (the gyro attached to the camera reports the angular velocity of the global reference frame relative to the camera reference frame); more detailed information can be seen in section 2.3.4);
receive a second orientation of the unmanned vehicle and/or the thermal imaging module associated with the second thermal image from the orientation sensor (same as above); and determine an absolute velocity of the unmanned vehicle based, at least in part, on the received first and second orientations and the determined estimated relative velocity of the unmanned vehicle (same as above).

Regarding claim 3, Whitley as modified above teaches the thermal imaging odometry system recited in claim 1, wherein: the determining the estimated relative velocity comprises the operation (A), which comprises using the characterized dilation or contraction to supplement the ranging sensor data indicating the standoff distance (Seydoux; [0135]; see also the mapping in claim 1 above); and the logic device is configured to: receive a user-defined target position and/or course for the unmanned vehicle (Whitley; Fig. 5-2, sections 5.3.2-5.3.3, pages 149-151: when a user engages the flight mode, the algorithm places a virtual waypoint directly in front of the aircraft at a specified distance from the current point; this waypoint is what the aircraft will orbit about; once this has been done, a second waypoint is rotated about the virtual waypoint at an orbit velocity dictated by the aileron stick; while this waypoint is being rotated, the aircraft constantly attempts to navigate to it; meanwhile, the autopilot constantly calculates the heading from the current position of the aircraft to the center "virtual waypoint" and sets the yaw accordingly to maintain that heading); determine an absolute velocity of the unmanned vehicle based, at least in part, on the determined estimated relative velocity of the unmanned vehicle (Whitley; section 2.3.4, pages 62 and 69, as quoted in claim 2 above); determine a heading adjustment for the unmanned vehicle based, at least in part, on the received user-defined target position and/or course and the determined absolute velocity of the unmanned vehicle (Whitley; Fig. 5-2, sections 5.3.2-5.3.3, pages 149-151, as quoted above); and control a propulsion system of the unmanned vehicle to update a heading of the unmanned vehicle according to the heading adjustment (same as above).
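To make operations (A) and (B), discussed in claims 1-3 above, concrete: a field of tracked points dilates or contracts as the camera moves along its optical axis, and that radial component can be characterized and then removed from the flow before lateral-velocity estimation. A minimal sketch under assumed names, not taken from Whitley, Seydoux, or the application:

```python
import numpy as np

def split_dilation(points_a, points_b):
    """Characterize dilation/contraction of a tracked point field and
    return the flow with that radial contribution removed.

    points_a, points_b : (N, 2) matched feature coordinates in two frames.
    Returns (scale, residual_flow): scale > 1 means the field dilated
    (motion toward the scene along the optical axis); residual_flow is
    the per-point flow after subtracting the dilation-induced component
    about the field centroid.  All names are illustrative.
    """
    a = np.asarray(points_a, float)
    b = np.asarray(points_b, float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)      # field centroids
    ra, rb = a - ca, b - cb                      # centered coordinates
    # Least-squares isotropic scale mapping the first field onto the second.
    scale = (ra * rb).sum() / (ra * ra).sum()
    residual_flow = (b - a) - (scale - 1.0) * (a - ca)
    return scale, residual_flow

# Example: 10% dilation plus a lateral shift; the shift survives,
# the dilation is removed.
pts = np.array([[-1.0, -1.0], [1.0, -1.0], [1.0, 1.0], [-1.0, 1.0]])
s, res = split_dilation(pts, 1.1 * pts + np.array([0.2, 0.0]))
print(s, res.mean(axis=0))   # -> 1.1 [0.2 0. ]
```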
Regarding claim 7, Whitley as modified above teaches the thermal imaging odometry system recited in claim 1, wherein the determining the estimated relative velocity of the unmanned vehicle comprises: determining that no common points of interest exist in the first and second thermal images (a scenario with no common points of interest between the first and second images could arise from, e.g., a leaf or dirt suddenly appearing during the measurement, or a discrete jump during sequential measurement (Whitley; pages 53-54); when these situations happen, it would have been obvious to one of ordinary skill in the art to recognize that the data between the first and second image measurements is not correct (no common points of interest) and to discard the data; instead, a previous data pair (such as a third image (prior to the first image) with the first image) or a subsequent data pair (such as a third image (subsequent to the second image) with the second image) would be used to estimate the relative velocity); identifying one or more common points of interest in the first or second thermal image and a third image received prior to the first image or subsequent to the second image (Whitley; page 58, paragraph 1: the PX4FLOW generates the flow rate using a KLT feature tracking algorithm that calculates the pixel flow between consecutive frames (equivalent to the third (prior to the first image) and first images, or the second and third (subsequent to the second image) images) by identifying feature points and matching them between consecutive images; once these feature points are matched, the resulting displacement can be determined in the imager coordinate system, measured in pixels); determining an optical flow rate based, at least in part, on a position deviation for each common point of interest identified in the first or second thermal image and the third image (same as above); and determining the estimated relative velocity of the unmanned vehicle based, at least in part, on the determined optical flow rate and the first or second ranging sensor data and third ranging sensor data corresponding to the third image (Whitley; same as above; Fig. 2-17, page 63: discloses the mapping between the 2D image frame and the camera reference frame using a model based on similar triangles; page 64, last sentence: discloses that the relative velocity Vf of the UAV with respect to a point p is determined based on the optical flow, i.e., from the received two images (equations 2-46 through 2-49)).
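A schematic sketch of the fallback the examiner reads onto claim 7: when consecutive frames share no trackable features, pair a surviving frame with an earlier or later one. This is entirely illustrative pseudologic with assumed names, not code from any cited reference:

```python
def velocity_with_fallback(frames, ranges, match, estimate):
    """Schematic of claim 7's fallback logic (all names illustrative).

    frames, ranges : sequences of images and corresponding rangefinder data
    match(f, g)    : common points of interest between two frames
                     (empty when, e.g., a discrete jump breaks tracking)
    estimate(pts, r1, r2) : velocity estimate from matched points + ranges
    """
    i, j = 0, 1                                  # the "first"/"second" frames
    pts = match(frames[i], frames[j])
    if pts:                                      # normal case
        return estimate(pts, ranges[i], ranges[j])
    # No common points between the first and second frames: fall back to
    # a third frame prior to the first (i-1) or after the second (j+1).
    for k, m in ((i - 1, i), (j, j + 1)):
        if 0 <= k and m < len(frames):
            pts = match(frames[k], frames[m])
            if pts:
                return estimate(pts, ranges[k], ranges[m])
    return None                                  # no usable frame pair
```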
Regarding claim 9, Whitley as modified above teaches the thermal imaging odometry system recited in claim 1, wherein the logic device is configured to: receive an absolute velocity of the unmanned vehicle from a global navigation satellite system and/or another thermal odometry system coupled to the unmanned vehicle (Whitley; section 2.3.4, pages 69-70: the velocities derived in eq. 2-67 and eq. 2-68 can be used to supply feedback for a stabilization controller, but if absolute position is needed, they must be converted to the global coordinate system; revisiting eq. 2-62, the ability to convert to the global reference frame is presented as shown in eq. 2-71 and 2-72); and determine a depth map corresponding to a field of view of the thermal imaging module based, at least in part, on the first and second thermal images and the received absolute velocity of the unmanned vehicle (Whitley; section 2.3.4, pages 69-70: to obtain the position of the focal point relative to the global coordinate system, one integrates eq. 2-27 with respect to time, as seen in eq. 2-75; more detailed information can be seen in section 2.3.4).

Regarding claim 10, Whitley as modified above teaches the thermal imaging odometry system recited in claim 1, wherein: the thermal imaging module comprises a stereo vision system characterized (Whitley; pages 48-49: suggests a stereo camera as a replacement for the rangefinder), at least in part, by an intra-axial distance between first and second thermal imaging modules of the stereo vision system; the first and second times comprise a common time; and the determining the estimated relative velocity of the unmanned vehicle is based, at least in part, on the first and second thermal images and the intra-axial distance of the stereo vision system (Whitley; pages 48-49: suggests a stereo camera as a replacement for the rangefinder; Fig. 3-2, pages 95-97: discloses the use of two different camera systems, similar to a stereo vision system, with a need for a time sync between the autopilot and the DIC data measurements; page 108: discloses that the DIC system allowed the estimated drift in the system to be calculated in addition to the accuracy of the velocity measurements, which impacts the position-holding accuracy, as position is calculated by integrating the calculated velocity; another benefit of comparing velocity magnitude is that it is independent of yaw rotation).

Claims 11-13, 17 and 19-20 are method claims possessing nearly identical limitations to those of claims 1-3, 7 and 9-10 and are thus rejected for the same reasoning.

Regarding claim 16, Whitley as modified above teaches the method of claim 11, wherein the determining the estimated relative velocity of the unmanned vehicle comprises: determining that no common points of interest exist in the first and second thermal images (Whitley; pages 53-54: to determine the standoff distance between the airframe and the bridge deck, a fixed-gain Kalman filter was implemented to fuse the rangefinder readings with climb rate information; the purpose of this filter is to smooth the noisy rangefinder data caused by sensor noise and discrete jumps (equivalent to no common points of interest existing in the first and second thermal images) in range as the aircraft traverses under the bridge); receiving first and second orientations of the unmanned vehicle and/or the thermal imaging module, corresponding respectively to the first and/or second thermal images, from an orientation sensor coupled to the unmanned vehicle and/or the thermal imaging module (Whitley; section 2.3.4, pages 62 and 69, as quoted in claim 2 above);
receiving first and/or second accelerations of the unmanned vehicle and/or the thermal imaging module, corresponding respectively to the first and/or second thermal images, from an accelerometer coupled to the unmanned vehicle and/or the thermal imaging module (Whitley; section 2.3.3, pages 53-55: to formulate the Kalman filter, a state-space representation is needed to relate the climb rate, derived from the extended Kalman filter fusing the barometric pressure sensor with accelerometer data, to the normal standoff distances); and determining the estimated relative velocity of the unmanned vehicle based, at least in part, on the received first and second orientations and first and/or second accelerations (same as above).

Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Whitley modified in view of Delaune.

Regarding claim 6, Whitley teaches a thermal imaging module centered about an optical axis, with a position fixed relative to an orientation of an unmanned vehicle (Whitley; Fig. 2-11, Fig. 2-13, page 47, page 57, section 2.3.4, paragraph 1, as quoted in claim 1 above); a ranging sensor system fixed relative to the imaging module (Whitley; Fig. 2-13, page 47, paragraph 1, line 6, and pages 58-59, as quoted in claim 1 above); and a logic device coupled to and/or integrated with the imaging module, configured to receive a first image of a scene and corresponding first ranging sensor data (Whitley; Fig. 2-16, section 2.3.4, paragraphs 1-2, and pages 58-59, as quoted in claim 1 above); receive a second image and corresponding second ranging sensor data (same as above); and determine an estimated relative velocity of the unmanned vehicle based, at least in part, on the received first and second images and ranging sensor data (Whitley; Fig. 2-17, pages 63-64, as quoted in claim 1 above);
wherein the determining the estimated relative velocity of the unmanned vehicle comprises: determining that no common points of interest exist in the first and second images (Whitley; pages 53-54: to determine the standoff distance between the airframe and the bridge deck, a fixed-gain Kalman filter was implemented to fuse the rangefinder readings with climb rate information; the purpose of this filter is to smooth the noisy rangefinder data caused by sensor noise and discrete jumps (equivalent to no common points of interest existing in the first and second thermal images) in range as the aircraft traverses under the bridge); receiving first and second orientations of the unmanned vehicle and/or the imaging module (Whitley; section 2.3.4, pages 62 and 69: the optical flow navigation development discloses that the orientation provided by a gyro sensor is used to obtain the absolute velocity vf/o from the relative velocity, as shown in equations 2-71 and 2-72, where Ẋ_A and Ẏ_A are based on gyro data gx and gy from equations 2-69, 2-70, and 2-36 (the gyro attached to the camera reports the angular velocity of the global reference frame relative to the camera reference frame); more detailed information can be seen in section 2.3.4); receiving first and/or second accelerations of the unmanned vehicle and/or the imaging module (Whitley; section 2.3.3, pages 53-55: to formulate the Kalman filter, a state-space representation is needed to relate the climb rate, derived from the extended Kalman filter fusing the barometric pressure sensor with accelerometer data, to the normal standoff distances); and determining the estimated relative velocity of the unmanned vehicle based, at least in part, on the received first and second orientations and first and/or second accelerations (same as above).

However, Whitley does not teach the acquisition and use of thermal images. Delaune introduced a thermal odometry scheme that applies image feature constraints to the state propagated with inertial measurements, using an infrared camera and computing the optical flow with the KLT algorithm based on the acquired thermal images. More detailed information can be seen in Sections II-B through III. It would have been obvious to one of ordinary skill in the art before the effective filing date of this invention to modify the imaging odometry system taught by Whitley to replace the visible-light images with thermal images as taught by Delaune, with a reasonable expectation of success. The reasoning for this is that thermal cameras capture the infrared radiation emitted by all objects with a temperature above absolute zero in their FOV. Furthermore, with the increased resolution and decreased size, weight, power, and cost of these imagers, thermal-inertial odometry now appears to be a promising approach for flying at night without GPS, without the cost of carrying a lidar sensor, and without the range constraint of illumination-aided solutions (Delaune; introduction, paragraph 3).
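The rejections of claims 6, 7, and 16 all lean on Whitley's fixed-gain Kalman filter, which smooths noisy rangefinder readings (including their discrete jumps) using climb-rate information. A minimal one-dimensional sketch of that kind of filter, with the gain and all names assumed rather than taken from Whitley:

```python
def fixed_gain_filter(range_readings, climb_rates, dt, k_gain=0.3):
    """Fuse noisy rangefinder data with climb rate via a fixed-gain filter.

    Predict the standoff distance forward using the climb rate, then
    correct toward the raw range reading with a constant gain.  A small
    gain suppresses sensor noise and the discrete jumps seen as the
    aircraft traverses under a bridge.  Gain and names are illustrative.
    """
    estimate = range_readings[0]
    smoothed = [estimate]
    for z, climb in zip(range_readings[1:], climb_rates[1:]):
        predicted = estimate - climb * dt        # climbing shrinks standoff
        estimate = predicted + k_gain * (z - predicted)
        smoothed.append(estimate)
    return smoothed

# Example: a discrete jump at sample 3 (8.9 m) is heavily attenuated.
raw = [5.0, 5.0, 5.1, 8.9, 5.0, 5.1]
print(fixed_gain_filter(raw, [0.0] * len(raw), dt=0.1))
```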
Allowable Subject Matter

Claims 4-5, 8, 14-15 and 18 would be allowable if rewritten to overcome the rejections under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, set forth in this Office action, and to include all of the limitations of the base claim and any intervening claims. The following is a statement of reasons for the indication of allowable subject matter:

Regarding claims 4 and 14, the prior art of record does not explicitly teach nor render obvious the following element, along with all other features: wherein the determining the estimated relative velocity of the unmanned vehicle comprises performing the operation (B), and the removing is performed by averaging across the field of the common points of interest.

Regarding claims 5 and 15, the prior art of record does not explicitly teach nor render obvious the following element, along with all other features: wherein the unmanned vehicle comprises an unmanned aerial vehicle; the logic device is configured to determine an angular velocity of the unmanned vehicle and/or the thermal imaging module based, at least in part, on first and second orientations of the unmanned vehicle and/or the thermal imaging module associated with the first and second thermal images provided by an orientation sensor coupled to the unmanned vehicle and/or the thermal imaging module; and the determining the estimated relative velocity of the unmanned vehicle comprises performing the operation (B): determining a net flow rate based, at least in part, on the determined optical flow rate processed by the removing of the contribution of the dilation or contraction and the angular velocity; and determining the estimated relative velocity of the unmanned vehicle based, at least in part, on the net flow rate.

Regarding claims 8 and 18, the prior art of record does not explicitly teach nor render obvious the following element, along with all other features: wherein the determining the estimated relative velocity comprises the operations (A) and (B); the estimated relative velocity comprises a first estimated relative velocity; and the logic device is configured to: receive a third thermal image of the scene at a third time and corresponding third ranging sensor data from the ranging sensor system; and determine a second estimated relative velocity of the unmanned vehicle based, at least in part, on the received second and third thermal images and the respective corresponding second and third ranging sensor data.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHIA-LING CHEN, whose telephone number is (571) 272-1047. The examiner can normally be reached Monday through Friday, 8-5 ET. Examiner interviews are available via telephone, in person, and by video conferencing using a USPTO-supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Yuqing Xiao, can be reached at (571) 270-3630. The fax number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CHIA-LING CHEN/
Examiner, Art Unit 3645

/YUQING XIAO/
Supervisory Patent Examiner, Art Unit 3645

Prosecution Timeline

Jul 27, 2022
Application Filed
Oct 07, 2025
Non-Final Rejection — §103, §112
Jan 09, 2026
Response Filed
Mar 09, 2026
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601817: LIDAR SYSTEM CALIBRATION (granted Apr 14, 2026; 2y 5m to grant)
Patent 12596181: SCANNING LIDAR WITH OPTICAL SWITCHING (granted Apr 07, 2026; 2y 5m to grant)
Patent 12591058: DYNAMIC LASER EMISSION CONTROL IN LIGHT DETECTION AND RANGING (LIDAR) SYSTEMS (granted Mar 31, 2026; 2y 5m to grant)
Patent 12571889: DISTANCE MEASUREMENT DEVICE (granted Mar 10, 2026; 2y 5m to grant)
Patent 12510632: LIDAR SYSTEM COMPRISING TWO DIFFRACTIVE COMPONENTS (granted Dec 30, 2025; 2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 46% (99% with interview, a +63.6% lift)
Median Time to Grant: 4y 1m
PTA Risk: Moderate
Based on 26 resolved cases by this examiner. Grant probability derived from career allow rate.
