Prosecution Insights
Last updated: April 19, 2026
Application No. 18/908,214

Relative Image Capture Device Orientation Calibration

Non-Final OA (§103, §DP)
Filed: Oct 07, 2024
Examiner: PAIGE, TYLER D
Art Unit: 3664
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Skydio Inc.
OA Round: 1 (Non-Final)
Grant Probability: 91% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 1m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 91% (above average)
1166 granted / 1276 resolved (+39.4% vs TC avg)

Interview Lift: +8.2% (moderate)
Allow rate with an interview vs. without, across resolved cases with interviews

Avg Prosecution: 2y 1m (fast prosecutor)
Currently pending: 28

Career History: 1304 total applications across all art units

Statute-Specific Performance

§101: 17.0% (-23.0% vs TC avg)
§103: 29.8% (-10.2% vs TC avg)
§102: 24.1% (-15.9% vs TC avg)
§112: 18.8% (-21.2% vs TC avg)

Deltas are measured against the Tech Center average estimate. Based on career data from 1276 resolved cases.
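A quick consistency check on the figures above, assuming each "vs TC avg" delta is the examiner's rate minus the Tech Center average:

```python
# Minimal sketch, under the assumption that delta = examiner rate - TC average,
# backing out the implied Tech Center baseline for each statute.
rates = {"§101": (17.0, -23.0), "§103": (29.8, -10.2),
         "§102": (24.1, -15.9), "§112": (18.8, -21.2)}
for statute, (examiner_pct, delta_pct) in rates.items():
    implied_tc_avg = examiner_pct - delta_pct
    print(f"{statute}: examiner {examiner_pct}%, implied TC avg {implied_tc_avg:.1f}%")
```

Every statute backs out to the same implied 40.0% baseline, consistent with the single Tech Center average estimate described above.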

Office Action

Grounds: §103 (obviousness), §DP (nonstatutory double patenting)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This office action is in response to an application filed on 10/07/2024. The applicant submits an Information Disclosure Statement dated 10/17/2024. The applicant does not make a claim for foreign priority. The applicant does make a claim for domestic priority to applications with filing dates of 01/23/2018 and 02/27/2018. Claims dated 03/17/2025 are examined. Claims 2-20 are canceled and claims 21-39 are new.

Claim Objections

Claim 33 is objected to because of the following informality: the claim contains a double semicolon after the last "obtaining" clause. Appropriate correction is required.

Drawings

The drawings are objected to under 37 CFR 1.83(a). The drawings must show every feature of the invention specified in the claims; here, the drawings do not show the two claimed image capture devices. Fig. 1 includes items 170 and 175; however, the features are distinctly defined in specification paragraph 0031. Therefore, the fixed orientation image capture device and the adjustable orientation image capture device must be shown, or the features canceled from the claims. No new matter should be entered.

Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as "amended." If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either "Replacement Sheet" or "New Sheet" pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting, provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 1 and 21-39 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of U.S. Patent No. 12,111,659. Although the claims at issue are not identical, they are not patentably distinct from each other because the claims are directed to the same inventive concept: an unmanned aerial vehicle with a fixed and an adjustable imaging device collecting a first and a second image, which is able to control the trajectory of the vehicle based upon correlation of the images with respect to a temporal period.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1 and 21-39 are rejected under 35 U.S.C. 103 as being unpatentable over Harris (US 2017/0275023) in view of Lee (US 2017/0351900).

As per claim 1: An unmanned aerial vehicle, comprising: a fixed orientation image capture device (Harris paragraph 0042 discloses, "the UAV may include fixed imaging components coupled at each corner junction 131-1, 131-2, 131-3, 131-4 and adjustable imaging components coupled to each side of the UAV."); an adjustable orientation image capture device (Harris paragraph 0041 discloses, "the UAV also includes an imaging component 150 that is coupled to the frame 104 of the UAV with a gimbal 152. The imaging component 150 is discussed in further detail below with respect to FIG. 2. The gimbal 152 may be a one, two, or three axis gimbal that is used to alter an orientation and/or position of the imaging component 150."); a processor configured to execute instruction stored on a non-transitory computer readable medium to control the unmanned aerial vehicle to traverse a portion of an operational environment of the unmanned aerial vehicle using relative image capture device orientation calibration by: obtaining a first image from the fixed orientation image capture device (Harris paragraph 0014 discloses, "the imaging component may include a first camera and a second camera that are separated by a known baseline distance such that a field of view of the first camera overlaps with at least a portion of the field of view of the second camera. The first camera and the second camera are configured to form images using visible light."); obtaining a second image from the adjustable orientation image capture device (Harris paragraph 0014, cited above); obtaining feature correlation data based on the first image and the second image (Harris paragraph 0014, cited above); obtaining current object identification data of an object located within a field of view of the fixed orientation image capture device or the adjustable orientation image capture device (Lee paragraph 0125 teaches, "The position information may be related to a center point and/or edges of the subject. According to an embodiment of the present disclosure, the photographing information may include position information and size information of two or more objects. When the object is a person, the photographing information may include position information (in the image) and size information of a body part. The body part may be a face of the person. The body part may include the face and/or part or the entire of the body including the face of the person."); obtaining current object motion data from the current object identification data by determining motion of a current object between a temporal sequence of images captured (Lee paragraph 0159 teaches, "The processor 500 during the flight may track the second object corresponding to the first object using the camera module 560, and capture the second image including the second object at or near the second 3D position so that the second image taken by the unmanned photographing device corresponds to the first image."); and a trajectory controller configured to control a trajectory of the unmanned aerial vehicle relative to the current object (Lee paragraph 0164 teaches, "The unmanned photographing device 690 may compare and analyze the calculated composition information and the target composition information received from the electronic device, and autonomously fly to the determined photographing location (a position of the target composition).").

Harris discloses combining depth and thermal information for object detection and avoidance. Harris does not disclose controlling trajectory relative to a current object. Lee teaches controlling trajectory relative to a current object. Therefore, at the time of filing, it would have been obvious to one of ordinary skill in the art to incorporate the teachings of Lee et al. into the invention of Harris. Such incorporation is motivated by the need to keep the camera on the target.
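Editorial aside: the "obtaining feature correlation data based on the first image and the second image" limitation mapped above is a standard multi-camera operation. Below is a minimal sketch of one conventional way to compute such correlation data with OpenCV feature matching; it illustrates the general technique, not the applicant's or Harris's implementation, and the image file names are placeholders.

```python
# Illustrative sketch: correlating features between a fixed-camera image and a
# gimbal-camera image. File names are hypothetical placeholders.
import cv2

fixed_img = cv2.imread("fixed_camera.png", cv2.IMREAD_GRAYSCALE)        # first image
adjustable_img = cv2.imread("gimbal_camera.png", cv2.IMREAD_GRAYSCALE)  # second image

# Detect ORB keypoints and binary descriptors in each image.
orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(fixed_img, None)
kp2, des2 = orb.detectAndCompute(adjustable_img, None)

# Hamming-distance brute-force matching with cross-checking keeps only
# mutually best matches, a simple proxy for "spatial feature correlation data".
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# Each match pairs a feature in the first image with its correlate in the second.
correspondences = [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in matches]
```

Cross-checked Hamming matching is the simplest of several options; ratio-test matching with a FLANN index would serve equally well here.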
As per claim 21: The unmanned aerial vehicle of claim 1, wherein: a portion of the field of view of the fixed orientation image capture device corresponding to capturing the first image overlaps a portion of the field of view of the adjustable orientation image capture device corresponding to capturing the second image (Harris paragraph 0056 discloses, "the baseline distance (di) that separates the two visual cameras and the orientation of the cameras with respect to one another are arranged such that at least a portion of the two fields of view 206, 208 overlap to form an effective field of view 210. Objects positioned within the effective field of view 210 are detectable by both visual cameras 202-1, 202-2 and images formed by the two cameras will both include representations of the object within the effective field of view 210."); obtaining feature correlation data includes obtaining spatial feature correlation data, wherein obtaining the spatial feature correlation data includes obtaining the spatial feature correlation data such that the spatial feature correlation data indicates a correlation between a feature from the first image and a corresponding feature from the second image (Harris paragraph 0057 discloses, "the two visual cameras 202-1, 202-2 are calibrated such that the pixel information for each camera corresponding to portions within the effective field of view 210 are correlated such that image information within the effective field of view from the two paired cameras can be merged and, with knowledge of the baseline distance (di), depth information for objects within the field of view may be generated."); and obtaining relative image capture device orientation calibration data includes using five-point relative positioning based on the spatial feature correlation data (Harris paragraph 0059 discloses, "in addition to calibrating the two visual cameras 202-1, 202-2 so that depth information can be obtained, the IR camera 204 is also calibrated so that the pixel information within the effective field of view 210 that is encompassed by the field of view of view of the IR camera 204 is correlated with the pixel information of the IR camera 204.").

As per claim 22: The unmanned aerial vehicle of claim 1, wherein: the field of view of the fixed orientation image capture device corresponding to capturing the first image is non-overlapping with the field of view of the adjustable orientation image capture device corresponding to capturing the second image (Harris paragraph 0061 discloses, "in other implementations the imaging component 250 may include multiple processing components that operate in conjunction or independently to generate the horizontal dimension, vertical dimension, depth dimension, and thermal dimension for each of the pixels within the effective field of view."); and obtaining feature correlation data includes obtaining temporal feature correlation data (Harris paragraph 0081 discloses, "the object may be monitored for a period of time to determine whether the object is moving.").

As per claim 23: The unmanned aerial vehicle of claim 1, wherein the current object identification data includes a spatial location, trajectory, or both of the current object (Lee paragraph 0126 teaches, "The composition information may include information about an object type, an image resolution or size (XY ratio), an object position, an object size, an object orientation, and composition area."). The motivation to combine Harris and Lee is the same as set forth for claim 1.

As per claim 24: The unmanned aerial vehicle of claim 23, wherein the spatial location is determined by triangulating the current object based on orientation calibration data (Harris paragraph 0059, cited above).

As per claim 25: The unmanned aerial vehicle of claim 24, wherein obtaining the orientation calibration data includes obtaining relative image capture device orientation calibration data using spatiotemporal calibration based on temporal feature correlation data (Harris paragraph 0059, cited above).
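Editorial aside: claims 21, 28, and 35 recite "five-point relative positioning based on the spatial feature correlation data." The five-point algorithm estimates the essential matrix, and hence the relative orientation, between two calibrated views from at least five point correspondences. A minimal sketch using OpenCV (whose findEssentialMat implements Nister's five-point method) follows; the intrinsic matrix K and the matched point arrays are assumed inputs, and this is an illustration of the general technique rather than the claimed implementation.

```python
# Illustrative sketch of five-point relative positioning: recover the relative
# rotation and translation between the two cameras from matched feature points.
import numpy as np
import cv2

def relative_orientation(pts_fixed, pts_adjustable, K):
    """Estimate relative camera pose from >= 5 point correspondences.

    pts_fixed, pts_adjustable: (N, 2) float arrays of matched pixel coordinates.
    K: (3, 3) camera intrinsic matrix (same model assumed for both cameras).
    """
    # RANSAC-wrapped five-point essential matrix estimation.
    E, inliers = cv2.findEssentialMat(
        pts_fixed, pts_adjustable, K, method=cv2.RANSAC, prob=0.999, threshold=1.0
    )
    # Decompose E into rotation R and unit translation t, resolving the
    # four-fold ambiguity with a cheirality check on the inlier points.
    _, R, t, _ = cv2.recoverPose(E, pts_fixed, pts_adjustable, K, mask=inliers)
    return R, t  # relative orientation calibration data
```

Note that recoverPose returns the translation only up to scale; recovering a metric baseline requires additional information, such as the known camera separation Harris describes.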
As per claim 26: A method comprising (Harris paragraph 0084 and claims 14-20): controlling, by a processor in response to instructions stored on a non-transitory computer readable medium, an unmanned aerial vehicle to traverse a portion of an operational environment of the unmanned aerial vehicle using relative image capture device orientation calibration by (Harris paragraph 0084 discloses, "the UAV control system 814 includes one or more processors 802, coupled to a memory, e.g., a non-transitory computer readable storage medium 820, via an input/output (I/O) interface 810."): obtaining a first image from a fixed orientation image capture device of the unmanned aerial vehicle (Harris paragraph 0014, cited above); obtaining a second image from an adjustable orientation image capture device of the unmanned aerial vehicle (Harris paragraph 0014, cited above); obtaining feature correlation data based on the first image and the second image (Harris paragraph 0014, cited above); obtaining current object identification data of an object located within a field of view of the fixed orientation image capture device or the adjustable orientation image capture device (Lee paragraph 0125, cited above); obtaining current object motion data from the current object identification data by determining motion of a current object between a temporal sequence of images captured (Lee paragraph 0159, cited above); and controlling a trajectory of the unmanned aerial vehicle relative to the current object (Lee paragraph 0164, cited above). The motivation to combine Harris and Lee is the same as set forth for claim 1.

As per claim 27: The method of claim 26, wherein a portion of the field of view of the fixed orientation image capture device corresponding to capturing the first image overlaps a portion of a field of view of the adjustable orientation image capture device corresponding to capturing the second image (Harris paragraph 0056, cited above).

As per claim 28: The method of claim 26, further comprising: obtaining feature correlation data that includes obtaining spatial feature correlation data, wherein obtaining the spatial feature correlation data includes obtaining the spatial feature correlation data such that the spatial feature correlation data indicates a correlation between a feature from the first image and a corresponding feature from the second image (Harris paragraph 0061, cited above); and obtaining the relative image capture device orientation calibration data includes using five-point relative positioning based on the spatial feature correlation data (Harris paragraph 0081, cited above).

As per claim 29: The method of claim 28, wherein a field of view of the fixed orientation image capture device corresponding to capturing the first image is non-overlapping with a field of view of the adjustable orientation image capture device corresponding to capturing the second image (Harris paragraph 0057, cited above).

As per claim 30: The method of claim 26, wherein the current object identification data includes a spatial location, trajectory, or both of the current object (Lee paragraph 0126, cited above). The motivation to combine Harris and Lee is the same as set forth for claim 1.
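Editorial aside: claims 1, 26, and 33 each recite "obtaining current object motion data ... by determining motion of a current object between a temporal sequence of images captured." A minimal sketch of that idea, a finite-difference velocity estimate over per-frame object centroids, is shown below; the detection pipeline producing the centroids and the frame timestamps are assumed inputs.

```python
# Illustrative sketch: derive "current object motion data" from per-frame
# object identification data (here, detected centroids) over time.
import numpy as np

def object_motion(centroids, timestamps):
    """centroids: list of (x, y) per frame; timestamps: seconds per frame."""
    c = np.asarray(centroids, dtype=float)
    t = np.asarray(timestamps, dtype=float)
    # Frame-to-frame displacement divided by elapsed time, in pixels/second.
    velocities = np.diff(c, axis=0) / np.diff(t)[:, None]
    return velocities[-1]  # most recent motion estimate for the current object

# Example: object drifting right and slightly up across three frames at ~30 fps.
# object_motion([(320, 240), (324, 238), (330, 236)], [0.0, 0.033, 0.066])
```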
As per claim 31: The method of claim 30, wherein the spatial location is determined by triangulating the current object based on orientation calibration data (Harris paragraph 0059, cited above).

As per claim 32: The method of claim 31, wherein obtaining the relative image capture device orientation calibration data includes obtaining the relative image capture device orientation calibration data using spatiotemporal calibration based on temporal feature correlation data (Harris paragraph 0059, cited above).

As per claim 33: A non-transitory computer-readable storage medium, comprising processor-executable instructions for controlling, by a processor in response the instructions, an unmanned aerial vehicle to traverse a portion of an operational environment of the unmanned aerial vehicle using relative image capture device orientation calibration by: obtaining a first image from a fixed orientation image capture device of the unmanned aerial vehicle (Harris paragraph 0014, cited above); obtaining a second image from an adjustable orientation image capture device of the unmanned aerial vehicle (Harris paragraph 0014, cited above); obtaining feature correlation data based on the first image and the second image (Harris paragraph 0014, cited above); obtaining current object identification data of an object located within a field of view of the fixed orientation image capture device or the adjustable orientation image capture device (Lee paragraph 0125, cited above); obtaining current object motion data from the current object identification data by determining motion of a current object between a temporal sequence of images captured;; (Lee paragraph 0159, cited above); and controlling a trajectory of the unmanned aerial vehicle relative to the current object (Lee paragraph 0164, cited above). The motivation to combine Harris and Lee is the same as set forth for claim 1.

As per claim 34: The non-transitory computer-readable storage medium of claim 33, wherein a portion of the field of view of the fixed orientation image capture device corresponding to capturing the first image overlaps a portion of a field of view of the adjustable orientation image capture device corresponding to capturing the second image (Harris paragraph 0056, cited above).
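Editorial aside: claims 24, 31, and 38 recite determining the object's spatial location "by triangulating the current object based on orientation calibration data." A minimal sketch follows, reusing the relative pose (R, t) from the five-point sketch above as the orientation calibration data; because t is known only up to scale, the triangulated point is likewise up to scale unless the true baseline is supplied.

```python
# Illustrative sketch: triangulate an object's 3D location from its pixel
# position in the two views, given the relative pose (R, t) and intrinsics K.
import numpy as np
import cv2

def triangulate_object(pt_fixed, pt_adjustable, K, R, t):
    """Return the 3D point of an object seen at pt_fixed / pt_adjustable."""
    # Projection matrices: fixed camera at the origin, adjustable camera at (R, t).
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t.reshape(3, 1)])
    pts4d = cv2.triangulatePoints(
        P1, P2,
        np.asarray(pt_fixed, dtype=float).reshape(2, 1),
        np.asarray(pt_adjustable, dtype=float).reshape(2, 1),
    )
    # Convert from homogeneous to Euclidean coordinates (up to scale).
    return (pts4d[:3] / pts4d[3]).ravel()
```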
As per claim 35: The non-transitory computer-readable storage medium of claim 33, wherein: obtaining feature correlation data includes obtaining spatial feature correlation data, wherein obtaining the spatial feature correlation data includes obtaining the spatial feature correlation data such that the spatial feature correlation data indicates a correlation between a feature from the first image and a corresponding feature from the second image (Harris paragraph 0059, cited above); and obtaining the relative image capture device orientation calibration data includes using five-point relative positioning based on the spatial feature correlation data (Harris paragraph 0059, cited above).

As per claim 36: The non-transitory computer-readable storage medium of claim 35, wherein the field of view of the fixed orientation image capture device corresponding to capturing the first image is non-overlapping with a field of view of the adjustable orientation image capture device corresponding to capturing the second image (Harris paragraph 0059, cited above).

As per claim 37: The non-transitory computer-readable storage medium of claim 33, wherein the current object identification data includes a spatial location, trajectory, or both of the current object (Lee paragraph 0126, cited above). The motivation to combine Harris and Lee is the same as set forth for claim 1.

As per claim 38: The non-transitory computer-readable storage medium of claim 37, wherein the spatial location is determined by triangulating the current object based on orientation calibration data (Harris paragraph 0059, cited above).

As per claim 39: The non-transitory computer-readable storage medium of claim 38, wherein obtaining the relative image capture device orientation calibration data includes obtaining the relative image capture device orientation calibration data using spatiotemporal calibration based on temporal feature correlation data (Harris paragraph 0059, cited above).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to TYLER D PAIGE, whose telephone number is (571) 270-5425. The examiner can normally be reached M-F 7:00am-6:00pm (MST).

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kito Robinson, can be reached at (571) 270-3921. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/TYLER D PAIGE/
Primary Examiner, Art Unit 3664

Prosecution Timeline

Oct 07, 2024: Application Filed
Mar 17, 2025: Response after Non-Final Action
Mar 19, 2026: Non-Final Rejection, §103 and §DP (current)

Precedent Cases

Applications granted by the same examiner with similar technology

Patent 12597357: AUTOMATIC AIRCRAFT TAXIING
Granted Apr 07, 2026 (2y 5m to grant)

Patent 12592102: OPERATION DATA SUPPORT SYSTEM FOR INDUSTRIAL MACHINERY
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12586424: DRIVING DIAGNOSIS DEVICE
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12586425: RARE EVENT DETECTION SYSTEM
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12579849: DETECTING AN UNUSUAL OPERATION OF A VEHICLE OUTSIDE OF A TIME FENCE AND NOTIFYING NEIGHBORING VEHICLES
Granted Mar 17, 2026 (2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 91%
With Interview: 99% (+8.2%)
Median Time to Grant: 2y 1m
PTA Risk: Low

Based on 1276 resolved cases by this examiner. Grant probability is derived from the career allow rate.
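A minimal sketch of how these headline projections follow from the examiner data above; treating the interview lift as additive and truncating to whole percents are assumptions about this page's methodology, not a documented formula:

```python
# Reproduce the projected probabilities from the examiner's career data.
granted, resolved = 1166, 1276
allow_rate = granted / resolved        # 0.9138 -> displayed as 91%
with_interview = allow_rate + 0.082    # +8.2% interview lift -> 0.9958 -> 99%
print(f"base {int(allow_rate * 100)}%, with interview {int(with_interview * 100)}%")
```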
