Prosecution Insights
Last updated: April 19, 2026
Application No. 18/534,713

CONTROL APPARATUS, BASE STATION, CONTROL METHOD, AND PROGRAM

Status: Non-Final OA (§103)
Filed: Dec 10, 2023
Examiner: CASS, JEAN PAUL
Art Unit: 3666
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Fujifilm Corporation
OA Round: 3 (Non-Final)

Grant Probability: 73% (Favorable)
Predicted OA Rounds: 3-4
Predicted Time to Grant: 3y 1m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 73% — above average (719 granted / 984 resolved; +21.1% vs Tech Center avg)
Interview Lift: +25.9% — strong (allowance rate among resolved cases with an interview vs. without)
Typical Timeline: 3y 1m avg prosecution; 83 applications currently pending
Career History: 1,067 total applications across all art units
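For readers who want to reproduce the headline numbers above, here is a minimal sketch of the arithmetic. Only the 719 granted / 984 resolved counts come from the data shown; the interview split (`resolved_with`, `granted_with`) is a hypothetical placeholder chosen so the lift lands near the displayed +25.9%.

```python
# Minimal sketch of the examiner-statistics arithmetic shown above.
# Only 719 granted / 984 resolved comes from the displayed data; the
# interview split below is a hypothetical placeholder, not USPTO data.

granted, resolved = 719, 984
career_allow_rate = granted / resolved                  # ~0.731 -> "73%"

resolved_with, granted_with = 300, 273                  # assumed split
resolved_without = resolved - resolved_with             # 684
granted_without = granted - granted_with                # 446

interview_lift = (granted_with / resolved_with
                  - granted_without / resolved_without) # ~+0.258

print(f"allow rate {career_allow_rate:.0%}, interview lift {interview_lift:+.1%}")
```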

Statute-Specific Performance

§101: 10.5% (-29.5% vs TC avg)
§103: 56.8% (+16.8% vs TC avg)
§102: 12.6% (-27.4% vs TC avg)
§112: 12.8% (-27.2% vs TC avg)

Tech Center averages are estimates • Based on career data from 984 resolved cases

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Receipt is acknowledged of a request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e) and a submission, filed on 2-26.

Response to the Applicant's Arguments

The previous rejection is withdrawn. Applicant's amendments are entered. Applicant's remarks are also entered into the record. A new search, necessitated by the applicant's amendments, was made, and a new reference was found. A new rejection is made herein. Applicant's arguments are now moot in view of the new rejection of the claims.

Claims 1 and 21-22 are amended to recite limitations on which the primary reference is silent, but NADIR teaches "... even in a case where a distance between the inspection target object and each of the plurality of first imaging positions changes, derive a distance between the inspection target object and the first imaging apparatus and perform control for maintaining a constant pixel resolution of the first imaging apparatus" (see paragraph 231 and claims 1-6, where the drone can be moved in position while maintaining frame rate and resolution: full resolution programmable up to 14 fps, VGA (with binning) programmable up to 53 fps, or 720P (1280 × 720) programmable up to 60 fps, with 12-bit on-chip ADC resolution and a pixel dynamic range of 70.1 dB (full resolution) or 76 dB (2 × 2 binning) and SNRMAX of 38.1 dB (full resolution) or 44 dB (2 × 2 binning)) and "adjusting a zoom magnification of the first imaging apparatus to a zoom magnification at which the pixel resolution of the first imaging apparatus becomes a predetermined reference value based on the distance" (see paragraphs 214-220 and 175-182, where the user can zoom in on the image at the correct pixel resolution regardless of the distance from the drone to the target, with a remote user using a window of interest to view the target).

It would have been obvious for one of ordinary skill in the art to combine the disclosure of NISHIDA and the teachings of NADIR with a reasonable expectation of success to provide for a drone that can fly while maintaining a distance from the target and provide (1) a zooming function and (2) a constant pixel resolution of the images based on the distance, namely HD 720p images, for a remote observer to control the drone and remotely view the target by controlling a window of interest, the 720p pixel resolution providing an increased ability to view the target clearly without having to continuously refocus the image. See claims 1-6 and paragraphs 214-220 and 175-182 of NADIR.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

    A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1 and 20-22 are rejected under 35 U.S.C. § 103 as being unpatentable as obvious in view of Japanese Patent Pub. No. JP 2019-117127 A to Nishida, in view of Japanese Patent Pub. No. JP 2016-111414 A to Sowa et al. (filed in 2014), in view of United States Patent Application Pub. No. US 2019/0068953 A1 to Choi et al. (filed in 2017 and assigned to Aurora Flight Sciences), and in view of United States Patent Application Pub. No. US 2012/0200703 A1 to NADIR.

In regard to claims 1, 21, and 22, NISHIDA discloses "1. A control apparatus comprising: a processor; and a memory connected to or incorporated in the processor, wherein the processor is configured to:" (see claim 1, where the device includes a processing unit and a memory) "rotate a distance measurement device via a rotational drive apparatus to which the distance measurement device is attached; measure a first distance between an inspection target object and" (see FIG. 1, where the device can be a movie-type camera with a laser scanner, the drone also includes a camera, and both a point cloud and a camera image can be obtained simultaneously using the rotational device of the combined laser scanner and point-cloud formation device. As shown in FIG. 2, the TS 100 has a structure in which the TS main body 20 and the laser scanner unit 101 are combined. The TS 100 has a main body 11, which is held by the pedestal 12 in a horizontally rotatable state; the pedestal 12 is fixed to the top of a tripod (not shown). The main body portion 11 has a substantially U-shape with two extending portions extending upward as viewed from the direction of the Y axis, and the vertical angle (elevation angle and depression angle) of the movable portion 13 between the two extending portions can be controlled.
The main body 11 rotates electrically with respect to the pedestal 12: angle control of the horizontal rotation angle with respect to the pedestal 12 is performed by a motor, and the movable portion 13 is likewise subjected to vertical angle control by a motor. The drive for the angle control of the horizontal rotation angle and the vertical angle is performed by the horizontal rotation drive unit 108 and the vertical rotation drive unit 109 (see the block diagram of FIG. 4) incorporated in the main body unit 11; these units are described later. In the main body 11, a horizontal rotation angle control dial 14a and a vertical angle control dial 14b are disposed. The horizontal rotation angle of the main body 11 (the movable portion 13) is adjusted by operating the horizontal rotation angle control dial 14a, and the vertical angle of the movable portion 13 is adjusted by operating the vertical angle control dial 14b.) "the distance measurement device at a plurality of distance measurement locations of the inspection target object via the distance measurement device;" (see the abstract, claims 1-10, and FIG. 9, where the movable section in the u direction of the tracking image can be provided by the UAV 200, the x axis of the tracking image sensor can also be captured to provide a scan direction in a rotational direction of the scanning unit 25, the laser scanner can be rotated about the z axis, and the TS 100 and the drone can be combined to scan vertically, horizontally, and in the z direction) "set a flying route for causing a flying object to fly along the inspection target object based on the first distance measured for each distance measurement location; and" (see FIG. 3. As an example of the processing performed by the control operation determination unit 306, if the UAV 200 is flying in a 10 m space between structure A and structure B, the position of the UAV 200 is 1 m from structure A and 9 m from structure B, and the set control method is to position the control target in the center of the space, it is determined to move 4 m in the direction of structure B. The control signal generation unit 307 generates a control signal in order to perform, on the control target, the operation determined by the control operation determination unit 306. In the present embodiment, the generated control signal is transmitted by the communication unit 112 of the TS 100 to be used for control of the UAV 200. When the control operation determination unit 306 determines that no control operation needs to be performed on the control object, the control signal generation unit 307 may not generate the signal at all, or the signal may carry only the processing result of the position calculation unit 305. Furthermore, the control signal generated by the control signal generation unit 307 is not limited to a signal directly controlling the control target: control may be performed indirectly, via an operator or the like, by displaying or outputting, on an operation device of the unmanned aerial vehicle to be controlled, information derived from values calculated in the section calculation unit 304, the control target position calculation unit 305, and the control operation determination unit 306. As an example of indirect control, the control device of the unmanned aircraft to be controlled can be caused to display the distance from the controlled object to an object that may be an obstacle, or to output a warning sentence or an alarm sound when the distance to an obstacle is closing. Therefore, the UAV 200 used in the present embodiment may autonomously fly a predetermined flight route, or may be under flight control by an operator. The point cloud data obtained in the scan control unit 303 and the setting data of the control method used in the control operation determination unit 306 may be stored in a storage unit (not shown) provided in the three-dimensional information processing unit 300, in a storage unit of the device provided with the three-dimensional information processing unit 300 (in the present embodiment, the storage unit 102 of the TS 100), or in a storage unit of a device capable of communicating with the three-dimensional information processing unit 300.)
"in a case of causing the flying object to fly along the flying route and acquiring each of a plurality of first images by imaging each of a plurality of imaged regions of the inspection target object via a first imaging apparatus mounted on the flying object each time the flying object reaches each of a plurality of first imaging positions set on the flying route," (see FIG. 8, a diagram showing the scan range of the laser scanner unit 101 provided in the TS 100. As shown in FIG. 8, laser scanning is also performed in the depth direction (the Y-axis direction in FIG. 8). Therefore, according to this technology, while tracking and positioning of the UAV 200 are continuously performed, detection of an obstacle and setting of a movable range in a three-dimensional range centered on the UAV 200 become possible. According to this example, since laser scanning is performed on the left and right spaces of the UAV 200, control in the case of flying the UAV 200 along a wall surface, for example, can be performed with high accuracy.)

Sowa teaches "... perform a control of constantly maintaining pixel resolution of the first imaging apparatus." (According to the present embodiment, the control device 208 can control the flight of the unmanned air vehicle 100 so as to keep the distance from the structure OBJ constant without relying on the GPS signal, thereby avoiding collisions with the structure OBJ. Further, in the case of performing an image inspection, it is desirable that the field of view and resolution in imaging be substantially constant, and this can be realized by imaging from a fixed distance with respect to the structure OBJ. Further, the actual size of the structure in the captured image can be computed based on the known angle of view of the high pixel camera 202 and the distance to the structure OBJ, the imaging target, obtained by the distance measuring sensor 204. It can also be used for inspections that require actual dimensions, such as cracks.)

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of NISHIDA with the teachings of SOWA with a reasonable expectation of success, since SOWA teaches that a drone can maintain a fixed, predetermined shooting distance from a target using a LIDAR sensor. The UAV can have a movement vector calculation to maintain the range in a 3D coordinate system. A high pixel camera 202 and a low pixel camera 203 can capture images, and the shooting angle of the cameras can also be maintained. The 2D images can be corrected based on the difference in the directions of the flying points from a path. This can provide an improved photographic function while capturing high and low pixel data as the drone flies and images the target. See the abstract and paragraphs 1-12 from the bottom of the publication.
Claims 1 and 21-22 are amended to recite limitations on which Nishida is silent, but Choi teaches "... and in a case of causing the flying object to fly along the flying route and acquiring each of a plurality of first images by imaging each of a plurality of imaged regions of the inspection target object, at different distances, via a first imaging apparatus ..." (see FIG. 2d, where the device has a sensor payload including a camera 225a, and paragraph 84, where the device has a camera-based seeking system to scan a target and can maintain the pixel resolution at different distances. The stereo-vision system may be operatively coupled to the processor via a universal serial bus (USB). For example, USB 3.0 machine vision cameras enable designers to trade resolution for frame rate: the FLIR/Point Grey 5MP camera can achieve 2448 × 2048 pixel resolution at 73 fps and 800 × 600 px at 199 fps, while Ximea produces a USB 3.0 camera with either 640 × 400 px at 1000 fps or 1280 × 1024 px at 210 fps.)

It would have been obvious for one of ordinary skill in the art to combine the disclosure of NISHIDA and the teachings of CHOI with a reasonable expectation of success to provide for a drone that can fly and maintain a distance against a second drone 104. The drone can include a sensor payload 226 in FIG. 2d, which can include multiple cameras for capturing video, still images, and LIDAR images. The cameras can provide a pixel resolution of 2448 × 2048 at 73 frames per second or 800 × 600 at 199 frames per second on the target as the drone is moving. The device can include a camera and light source in FIG. 3a and provide the resolution at the different distances within the triangle. See paragraphs 80-90.
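Editor's note: the "trade resolution for frame rate" point in the CHOI citation is a pixel-throughput budget. The sketch below simply multiplies out the four camera modes quoted above; the bandwidth interpretation is our gloss, not CHOI's.

```python
# Minimal sketch: pixel throughput (pixels/second) for the camera modes
# quoted in the CHOI citation. Resolution and frame rate trade off
# against a roughly fixed readout/interface budget.

modes = {
    "FLIR 5MP, full":  (2448, 2048, 73),
    "FLIR 5MP, small": (800, 600, 199),
    "Ximea, small":    (640, 400, 1000),
    "Ximea, large":    (1280, 1024, 210),
}

for name, (w, h, fps) in modes.items():
    print(f"{name:16s} {w * h * fps / 1e6:7.1f} Mpx/s")
# FLIR 5MP, full   ~366.0 Mpx/s; FLIR 5MP, small ~95.5 Mpx/s
# Ximea, small     ~256.0 Mpx/s; Ximea, large   ~275.3 Mpx/s
```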
Claim 1 is amended to recite limitations on which the primary reference is silent, but NADIR teaches "... even in a case where a distance between the inspection target object and each of the plurality of first imaging positions changes, derive a distance between the inspection target object and the first imaging apparatus and perform control for maintaining a constant pixel resolution of the first imaging apparatus" (see paragraph 231 and claims 1-6, where the drone can be moved in position while maintaining frame rate and resolution: full resolution programmable up to 14 fps, VGA (with binning) programmable up to 53 fps, or 720P (1280 × 720) programmable up to 60 fps, with 12-bit on-chip ADC resolution and a pixel dynamic range of 70.1 dB (full resolution) or 76 dB (2 × 2 binning) and SNRMAX of 38.1 dB (full resolution) or 44 dB (2 × 2 binning)) and "adjusting a zoom magnification of the first imaging apparatus to a zoom magnification at which the pixel resolution of the first imaging apparatus becomes a predetermined reference value based on the distance" (see paragraphs 214-220 and 175-182, where the user can zoom in on the image at the correct pixel resolution regardless of the distance from the drone to the target, with a remote user using a window of interest to view the target).

It would have been obvious for one of ordinary skill in the art to combine the disclosure of NISHIDA and the teachings of NADIR with a reasonable expectation of success to provide for a drone that can fly while maintaining a distance from the target and provide (1) a zooming function and (2) a constant pixel resolution of the images based on the distance, namely HD 720p images, for a remote observer to control the drone and remotely view the target by controlling a window of interest, the 720p pixel resolution providing an increased ability to view the target clearly without having to continuously refocus the image. See claims 1-6 and paragraphs 214-220 and 175-182 of NADIR.
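Editor's note: the claimed "adjust zoom so the pixel resolution hits a reference value at the measured distance" limitation inverts the GSD relation sketched earlier: the required focal length grows linearly with distance. A minimal sketch under the same pinhole assumptions; all numeric values are illustrative, not from NADIR.

```python
# Minimal sketch of zoom-compensated constant pixel resolution:
# solve gsd_ref = distance * pixel_pitch / focal_length for focal_length,
# then clamp to the zoom lens range. Illustrative values only.

PIXEL_PITCH = 3.45e-6          # meters per pixel on the sensor (assumed)
F_MIN, F_MAX = 0.012, 0.120    # 12-120 mm zoom lens (assumed)

def focal_length_for(distance_m: float, gsd_ref_m: float) -> float:
    """Focal length that makes the GSD equal the reference value."""
    f = distance_m * PIXEL_PITCH / gsd_ref_m
    return min(max(f, F_MIN), F_MAX)   # saturate at the zoom limits

# Hold 1.0 mm/px as the drone drifts from 5 m to 20 m from the wall:
for d in (5.0, 10.0, 20.0):
    f = focal_length_for(d, 1.0e-3)
    print(f"distance {d:5.1f} m -> focal length {f * 1e3:6.2f} mm "
          f"(zoom x{f / F_MIN:.2f})")
```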
Claims 2-3 are rejected under 35 U.S.C. § 103 as being unpatentable as obvious in view of Japanese Patent Pub. No. JP 2019-117127 A to Nishida, in view of Japanese Patent Pub. No. JP 2016-111414 A to Sowa et al. (filed in 2014), and in view of Japanese Patent Pub. No. JP 2006-027448 A to Sumiya, Choi, and Nadir.

Sumiya teaches "... 2. The control apparatus according to claim 1, wherein the processor is configured to: adjust a rotational angle of the rotational drive apparatus to a second rotational angle at which the flying object is included within a distance measurement range of the distance measurement device; measure a second distance between the flying object and the distance measurement device via the distance measurement device; and perform a control of causing the flying object to fly along the flying route based on the second rotational angle and on the second distance." (See the detailed description of the aerial photography method: in flight, so that the shooting target can be reliably shot even when the aircraft shakes or the direction of the nose changes under the influence of wind, it is preferable to capture video data of the object while swinging the camera, directed at the object, up and down or left and right at a predetermined angle. Preferably, the aerial imaging method includes an autonomous control device that allows an unmanned air vehicle to fly along a predetermined route; a flight position specifying unit that specifies the in-flight position from a GPS signal; distance measuring means for measuring the distance between the airframe and the object to be imaged; camera control means for controlling the camera angle, zoom magnification, and shooting start and end; means for recording shooting condition information such as the shooting start position, camera angle, distance from the shooting start position to the object, camera zoom magnification, and shooting end position; and actual distance measuring means for controlling the distance measuring means at the shooting start position to measure the distance between the airframe and the shooting object. The camera zoom magnification is calculated from the difference between the actual distance to the object and the distance recorded in the shooting condition information, so that the subject has a predetermined size within the camera frame, with means for updating the shooting condition to that zoom magnification and imaging processing control means, provided on the body, for controlling each of these means based on the shooting condition information. The aerial imaging device includes a unit for recording video data of the subject; the recorded video data is collated with the video data output from the camera during shooting to determine whether or not the shooting object is included, and only video data including the shooting object is recorded in the recording means.)

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of NISHIDA with the teachings of SUMIYA with a reasonable expectation of success, since SUMIYA teaches that a drone can include a zoom magnification, a camera angle, a shooting start position, and a shooting end position. The lidar device can provide a distance to the target, and the rotation, zooming, starting, and ending can be provided. Additionally, extraneous objects can be removed from the finished product. This can provide an improved rotational and zooming photographic function while capturing as the drone flies and images the target in FIG. 4. See the abstract and claims 1-7.

Nishida discloses "... 3. The control apparatus according to claim 2, wherein the distance measurement device includes a LiDAR scanner" (see abstract). Sumiya teaches "... the second distance is a distance between the flying object and the LiDAR scanner, and the processor is configured to: derive second absolute coordinates of the flying object based on first absolute coordinates of the rotational drive apparatus, the second rotational angle, an angle of laser light emitted from the LiDAR scanner toward the flying object, and the second distance; and perform a control of causing the flying object to fly along the flying route based on the second absolute coordinates." (see claims 1-7, where the device can include a UAV that can scan the target and record it in a three-dimensional coordinate system, with the control target tracked using a lidar device)

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of NISHIDA with the teachings of SUMIYA with a reasonable expectation of success, since SUMIYA teaches that a drone can include a zoom magnification, a camera angle, a shooting start position, and a shooting end position. The lidar device can provide a distance to the target, and the rotation, zooming, starting, and ending can be provided. Additionally, extraneous objects can be removed from the finished product. This can provide an improved rotational and zooming photographic function while capturing as the drone flies and images the target in FIG. 4. See the abstract and claims 1-7.
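Editor's note: the claim 3 limitation (deriving the flying object's absolute coordinates from the scanner's absolute coordinates, the horizontal rotation angle, the laser elevation angle, and the measured range) is a standard spherical-to-Cartesian conversion. A minimal sketch follows, with an assumed angle convention that is not specified in the cited references.

```python
# Minimal sketch: absolute UAV coordinates from a rotating LiDAR head.
# Convention (assumed): azimuth measured from +X in the XY plane,
# elevation measured up from horizontal; scanner_origin is the
# "first absolute coordinates" of the rotational drive apparatus.

import math

def uav_absolute_coords(scanner_origin, azimuth_rad, elevation_rad, range_m):
    x0, y0, z0 = scanner_origin
    horiz = range_m * math.cos(elevation_rad)   # range projected onto XY
    return (
        x0 + horiz * math.cos(azimuth_rad),
        y0 + horiz * math.sin(azimuth_rad),
        z0 + range_m * math.sin(elevation_rad),
    )

# Scanner at (100, 200, 30); UAV at azimuth 45 deg, elevation 10 deg, 50 m.
print(uav_absolute_coords((100.0, 200.0, 30.0),
                          math.radians(45.0), math.radians(10.0), 50.0))
# -> approximately (134.8, 234.8, 38.7)
```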
Claim 4 is rejected under 35 U.S.C. § 103 as being unpatentable as obvious in view of Japanese Patent Pub. No. JP 2019-117127 A to Nishida, in view of Japanese Patent Pub. No. JP 2016-111414 A to Sowa et al. (filed in 2014), and in view of Japanese Patent Pub. No. JP 2006-027448 A to Sumiya, Choi, and Nadir.

Sumiya teaches "... 4. The control apparatus according to claim 2, wherein a second imaging apparatus is attached to the rotational drive apparatus, and" (see the camera unit and the stereo imaging sensor: the distance measuring sensor 34 is attached to the front part of the A/C helicopter 1 and detects the distance from the airframe to the ground structure, both to keep the distance from the A/C helicopter 1 constant while preventing collisions and to measure the distance from the electric wire W, the imaging object of the imaging device 4 described later, to the body. As the distance measuring sensor, for example, a sensor using a stereo vision camera or a distance measuring means using a laser can be used.)

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of NISHIDA with the teachings of SUMIYA with a reasonable expectation of success, since SUMIYA teaches that a drone can include a zoom magnification, a camera angle, a shooting start position, and a shooting end position. The lidar device can provide a distance to the target, and the rotation, zooming, starting, and ending can be provided. Additionally, extraneous objects can be removed from the finished product. This can provide an improved rotational and zooming photographic function while capturing as the drone flies and images the target in FIG. 4. See the abstract and claims 1-7.

Nishida discloses "... the processor is configured to perform a control of adjusting the rotational angle of the rotational drive apparatus to the second rotational angle based on a second image obtained by imaging the flying object via the second imaging apparatus." (see claims 1-7, where the device can include a UAV that can scan the target and record it in a three-dimensional coordinate system, with the control target tracked using a lidar device)

Claim 5 is rejected under 35 U.S.C. § 103 as being unpatentable as obvious in view of Japanese Patent Pub. No. JP 2019-117127 A to Nishida, in view of Japanese Patent Pub. No. JP 2016-111414 A to Sowa et al. (filed in 2014), and in view of Japanese Patent Pub. No. JP 2006-027448 A to Sumiya, Choi, and Nadir.

Sumiya teaches "... 5. The control apparatus according to claim 4, wherein the second rotational angle is an angle at which the flying object is positioned in a center portion of an angle of view of the second imaging apparatus." (As described above, in the state where the A/C helicopter 1 is hovering at point (a), the control device 23 of the ground station 2 is remotely operated to set the camera 42 in the direction for imaging the electric wire W, and when it is confirmed on the monitor 24 that the image of the electric wire W is located at the center, a distance measurement signal is output from the control device 23 to the photographing device 4. Upon receiving the signal, the photographing apparatus 4 measures the actual distance to the electric wire W with the actual distance measuring means 410, calculates an appropriate zoom magnification from the measured distance with the zoom magnification calculating means 411, and updates the imaging condition information to the calculated zoom magnification with the zoom magnification updating means 412. At the same time, the video signal output from the camera 42 is recorded as video data in the recording unit 41 by the object specifying unit 413, and the direction control unit 414 takes in the camera orientation information from the angle control unit 45.)

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of NISHIDA with the teachings of SUMIYA with a reasonable expectation of success, since SUMIYA teaches that a drone can include a zoom magnification, a camera angle, a shooting start position, and a shooting end position. The lidar device can provide a distance to the target, and the rotation, zooming, starting, and ending can be provided. Additionally, extraneous objects can be removed from the finished product. This can provide an improved rotational and zooming photographic function while capturing as the drone flies and images the target in FIG. 4. See the abstract and claims 1-7.

Claims 6-18 are rejected under 35 U.S.C. § 103 as being unpatentable as obvious in view of Japanese Patent Pub. No. JP 2019-117127 A to Nishida, in view of Japanese Patent Pub. No. JP 2016-111414 A to Sowa et al. (filed in 2014), and in view of Japanese Patent Pub. No. JP 2006-027448 A to Sumiya, Choi, and Nadir.

Sumiya teaches "... 6. The control apparatus according to claim 4, wherein the flying object includes a plurality of members categorized with different aspects, and the processor is configured to control a posture of the flying object based on positions of the plurality of members captured in the second image." (As described above, in the state where the A/C helicopter 1 is hovering at point (a), the control device 23 of the ground station 2 is remotely operated to set the camera 42 in the direction for imaging the electric wire W, and when it is confirmed on the monitor 24 that the image of the electric wire W is located at the center, a distance measurement signal is output from the control device 23 to the photographing device 4. Upon receiving the signal, the photographing apparatus 4 measures the actual distance to the electric wire W with the actual distance measuring means 410, calculates an appropriate zoom magnification from the measured distance with the zoom magnification calculating means 411, and updates the imaging condition information to the calculated zoom magnification with the zoom magnification updating means 412. At the same time, the video signal output from the camera 42 is recorded as video data in the recording unit 41 by the object specifying unit 413, and the direction control unit 414 takes in the camera orientation information from the angle control unit 45.)

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of NISHIDA with the teachings of SUMIYA with a reasonable expectation of success, since SUMIYA teaches that a drone can include a zoom magnification, a camera angle, a shooting start position, and a shooting end position. The lidar device can provide a distance to the target, and the rotation, zooming, starting, and ending can be provided. Additionally, extraneous objects can be removed from the finished product. This can provide an improved rotational and zooming photographic function while capturing as the drone flies and images the target in FIG. 4. See the abstract and claims 1-7.

Sumiya teaches "... 7. The control apparatus according to claim 6, wherein the different aspects are different colors, and the members are propellers." (As described above, in the state where the A/C helicopter 1 is hovering at point (a), the control device 23 of the ground station 2 is remotely operated to set the camera 42 in the direction for imaging the electric wire W, and when it is confirmed on the monitor 24 that the image of the electric wire W is located at the center, a distance measurement signal is output from the control device 23 to the photographing device 4. Upon receiving the signal, the photographing apparatus 4 measures the actual distance to the electric wire W with the actual distance measuring means 410, calculates an appropriate zoom magnification from the measured distance with the zoom magnification calculating means 411, and updates the imaging condition information to the calculated zoom magnification with the zoom magnification updating means 412. At the same time, the video signal output from the camera 42 is recorded as video data in the recording unit 41 by the object specifying unit 413, and the direction control unit 414 takes in the camera orientation information from the angle control unit 45.) (see also claims 1-4, where the zoom magnification of the camera is calculated, from the difference between the actual distance and the distance to the shooting target recorded in the shooting condition information, so that the subject has a predetermined size within the camera frame, with means for updating the shooting conditions to the calculated zoom magnification and imaging processing control means, provided on the body, for controlling each of these means based on the shooting condition information; means for recording the video data of the shooting object are also provided, and the recorded video data is compared with the video data output from the camera during shooting to determine whether or not the shooting object is included, with only video data including the shooting object recorded in the recording unit)

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of NISHIDA with the teachings of SUMIYA with a reasonable expectation of success, since SUMIYA teaches that a drone can include a zoom magnification, a camera angle, a shooting start position, and a shooting end position. The lidar device can provide a distance to the target, and the rotation, zooming, starting, and ending can be provided. Additionally, extraneous objects can be removed from the finished product. This can provide an improved rotational and zooming photographic function while capturing as the drone flies and images the target in FIG. 4. See the abstract and claims 1-7.

Sumiya teaches "... 8. The control apparatus according to claim 6, wherein the different aspects are different colors, and the members are light-emitting objects." (As described above, in the state where the A/C helicopter 1 is hovering at point (a), the control device 23 of the ground station 2 is remotely operated to set the camera 42 in the direction for imaging the electric wire W, and when it is confirmed on the monitor 24 that the image of the electric wire W is located at the center, a distance measurement signal is output from the control device 23 to the photographing device 4. Upon receiving the signal, the photographing apparatus 4 measures the actual distance to the electric wire W with the actual distance measuring means 410, calculates an appropriate zoom magnification from the measured distance with the zoom magnification calculating means 411, and updates the imaging condition information to the calculated zoom magnification with the zoom magnification updating means 412. At the same time, the video signal output from the camera 42 is recorded as video data in the recording unit 41 by the object specifying unit 413, and the direction control unit 414 takes in the camera orientation information from the angle control unit 45.) (see also claims 1-4, where the zoom magnification of the camera is calculated, from the difference between the actual distance and the distance to the shooting target recorded in the shooting condition information, so that the subject has a predetermined size within the camera frame, with means for updating the shooting conditions to the calculated zoom magnification and imaging processing control means, provided on the body, for controlling each of these means based on the shooting condition information; means for recording the video data of the shooting object are also provided, and the recorded video data is compared with the video data output from the camera during shooting to determine whether or not the shooting object is included, with only video data including the shooting object recorded in the recording unit)

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of NISHIDA with the teachings of SUMIYA with a reasonable expectation of success, since SUMIYA teaches that a drone can include a zoom magnification, a camera angle, a shooting start position, and a shooting end position. The lidar device can provide a distance to the target, and the rotation, zooming, starting, and ending can be provided. Additionally, extraneous objects can be removed from the finished product. This can provide an improved rotational and zooming photographic function while capturing as the drone flies and images the target in FIG. 4. See the abstract and claims 1-7.

Sumiya teaches "... 9. The control apparatus according to claim 6, wherein the different aspects are different turn-on and turn-off patterns, and the members are light-emitting objects." (As described above, in the state where the A/C helicopter 1 is hovering at point (a), the control device 23 of the ground station 2 is remotely operated to set the camera 42 in the direction for imaging the electric wire W, and when it is confirmed on the monitor 24 that the image of the electric wire W is located at the center, a distance measurement signal is output from the control device 23 to the photographing device 4. Upon receiving the signal, the photographing apparatus 4 measures the actual distance to the electric wire W with the actual distance measuring means 410, calculates an appropriate zoom magnification from the measured distance with the zoom magnification calculating means 411, and updates the imaging condition information to the calculated zoom magnification with the zoom magnification updating means 412. At the same time, the video signal output from the camera 42 is recorded as video data in the recording unit 41 by the object specifying unit 413, and the direction control unit 414 takes in the camera orientation information from the angle control unit 45.) (see also claims 1-4, where the zoom magnification of the camera is calculated, from the difference between the actual distance and the distance to the shooting target recorded in the shooting condition information, so that the subject has a predetermined size within the camera frame, with means for updating the shooting conditions to the calculated zoom magnification and imaging processing control means, provided on the body, for controlling each of these means based on the shooting condition information; means for recording the video data of the shooting object are also provided, and the recorded video data is compared with the video data output from the camera during shooting to determine whether or not the shooting object is included, with only video data including the shooting object recorded in the recording unit)

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of NISHIDA with the teachings of SUMIYA with a reasonable expectation of success, since SUMIYA teaches that a drone can include a zoom magnification, a camera angle, a shooting start position, and a shooting end position. The lidar device can provide a distance to the target, and the rotation, zooming, starting, and ending can be provided. Additionally, extraneous objects can be removed from the finished product. This can provide an improved rotational and zooming photographic function while capturing as the drone flies and images the target in FIG. 4. See the abstract and claims 1-7.
Sumiya teaches "... 10. The control apparatus according to claim 1, wherein the plurality of first imaging positions are positions at which the first images acquired at adjacent first imaging positions among the plurality of first imaging positions partially overlap with each other." (As described above, in the state where the A/C helicopter 1 is hovering at point (a), the control device 23 of the ground station 2 is remotely operated to set the camera 42 in the direction for imaging the electric wire W, and when it is confirmed on the monitor 24 that the image of the electric wire W is located at the center, a distance measurement signal is output from the control device 23 to the photographing device 4. Upon receiving the signal, the photographing apparatus 4 measures the actual distance to the electric wire W with the actual distance measuring means 410, calculates an appropriate zoom magnification from the measured distance with the zoom magnification calculating means 411, and updates the imaging condition information to the calculated zoom magnification with the zoom magnification updating means 412. At the same time, the video signal output from the camera 42 is recorded as video data in the recording unit 41 by the object specifying unit 413, and the direction control unit 414 takes in the camera orientation information from the angle control unit 45.) (see also claims 1-4, where the zoom magnification of the camera is calculated, from the difference between the actual distance and the distance to the shooting target recorded in the shooting condition information, so that the subject has a predetermined size within the camera frame, with means for updating the shooting conditions to the calculated zoom magnification and imaging processing control means, provided on the body, for controlling each of these means based on the shooting condition information; means for recording the video data of the shooting object are also provided, and the recorded video data is compared with the video data output from the camera during shooting to determine whether or not the shooting object is included, with only video data including the shooting object recorded in the recording unit)

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of NISHIDA with the teachings of SUMIYA with a reasonable expectation of success, since SUMIYA teaches that a drone can include a zoom magnification, a camera angle, a shooting start position, and a shooting end position. The lidar device can provide a distance to the target, and the rotation, zooming, starting, and ending can be provided. Additionally, extraneous objects can be removed from the finished product. This can provide an improved rotational and zooming photographic function while capturing as the drone flies and images the target in FIG. 4. See the abstract and claims 1-7.
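Editor's note: claim 10's "partially overlapping adjacent images" is the standard survey-flight overlap constraint: the step between imaging positions must not exceed the image footprint times one minus the overlap fraction. A minimal sketch with illustrative numbers, not taken from the references:

```python
# Minimal sketch: spacing of imaging positions so adjacent images
# overlap by a target fraction. footprint = gsd * pixels_across.
# Illustrative values; not taken from any cited reference.

def max_spacing_m(gsd_m_per_px: float, pixels_across: int, overlap: float) -> float:
    """Largest along-route step that still yields the requested overlap."""
    footprint = gsd_m_per_px * pixels_across
    return footprint * (1.0 - overlap)

# 1 mm/px over a 2448 px-wide sensor -> 2.448 m footprint;
# 30% overlap allows steps of ~1.71 m between imaging positions.
print(max_spacing_m(1.0e-3, 2448, 0.30))  # ~1.7136
```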
Sumiya teaches "... 11. The control apparatus according to claim 1, wherein in a case where a surface of the inspection target object has a recessed portion and an area of an opening portion of the recessed portion is less than a predetermined area, the processor is configured to set the flying route on a smooth virtual plane facing the surface." (And while the A/C helicopter 1 flies from point (a) to point (b), the angle control means 45 is controlled by the direction control means 414, and the camera 42 is swung up and down along the vertical direction while video data of the electric wire W is captured and tracked. As shown in FIG. 5, the swing control by the direction control means 414 is based on the camera direction (Fc in the figure) at which the electric wire W is located at the center of the imaging area F of the camera 42, and is performed so that the camera 42 swings over a predetermined range above and below the electric wire W, for example an interval of about 1 m above and below. As described above, when the video data of the target electric wire W is captured by the target determination unit 415, the camera-orientation reference for swinging is updated. The swing control is always performed around the electric wire W toward its periphery, so that even if the body shakes during photographing and the photographing direction of the camera 42 shifts, video data is captured again in the reference direction and image data tracking the electric wire W can be obtained. After the video data of the reference electric wire W to be imaged and the reference camera orientation information are recorded by the object specifying means 413, the direction control means 414 changes the direction, and the object determining means 415 determines whether or not the electric wire W is included in the video output signal of the camera 42. The determination can be made by a video data processing procedure as shown in FIG. 6, for example: first, the video signal of the camera 42, input to the control means 47 via the input interface 43, is corrected in S1. Next, in S2, filter processing is performed to absorb image-quality differences due to individual camera differences and weather at the time of shooting. In S3, vertical edge detection processing is performed, and the outline of the wire W, captured in a direction crossing the camera 42, is emphasized. In S4, the edge-detected video data is subjected to a Hough transform, and in S5 it is determined whether or not there is a straight line crossing the imaging region in the horizontal direction. If there is no crossing straight line, the next video data is captured and processed; if there is a crossing straight line, it is determined that the video data shows the electric wire W, and the Y coordinate of the pixels carrying the straight line is detected in S6.)

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of NISHIDA with the teachings of SUMIYA with a reasonable expectation of success, since SUMIYA teaches that a drone can include a zoom magnification, a camera angle, a shooting start position, and a shooting end position. The lidar device can provide a distance to the target, and the rotation, zooming, starting, and ending can be provided. Additionally, extraneous objects can be removed from the finished product. This can provide an improved rotational and zooming photographic function while capturing as the drone flies and images the target in FIG. 4. See the abstract and claims 1-7.
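Editor's note: the S1-S6 pipeline quoted above (correct, filter, edge-detect, Hough transform, test for a horizontal crossing line, report its Y coordinate) maps directly onto standard OpenCV calls. A minimal sketch follows, assuming OpenCV and NumPy are available; the thresholds and kernel sizes are illustrative, not Sumiya's.

```python
# Minimal sketch of Sumiya's S1-S6 wire-detection pipeline using OpenCV.
# Thresholds and kernel sizes are illustrative assumptions.

import cv2
import numpy as np

def detect_wire_y(frame_bgr: np.ndarray) -> float | None:
    """Return the mean Y coordinate of a roughly horizontal line, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)          # S1: normalize input
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)                 # S2: filter out noise
    edges = cv2.Canny(blurred, 50, 150)                         # S3: edge emphasis
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=frame_bgr.shape[1] // 2,
                            maxLineGap=10)                      # S4: Hough transform
    if lines is None:
        return None                                             # S5: no crossing line
    for x1, y1, x2, y2 in lines[:, 0]:
        if abs(y2 - y1) < abs(x2 - x1) * 0.1:                   # S5: nearly horizontal?
            return (y1 + y2) / 2.0                              # S6: report Y coordinate
    return None
```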
Sumiya teaches "... 12. The control apparatus according to claim 11, wherein the processor is configured to, in a case where the flying object flies across the recessed portion, perform a control of constantly maintaining the pixel resolution by operating at least one of a zoom lens or a focus lens of the first imaging apparatus." (And while the A/C helicopter 1 flies from point (a) to point (b), the angle control means 45 is controlled by the direction control means 414, and the camera 42 is swung up and down along the vertical direction while video data of the electric wire W is captured and tracked. As shown in FIG. 5, the swing control by the direction control means 414 is based on the camera direction (Fc in the figure) at which the electric wire W is located at the center of the imaging area F of the camera 42, and is performed so that the camera 42 swings over a predetermined range above and below the electric wire W, for example an interval of about 1 m above and below. As described above, when the video data of the target electric wire W is captured by the target determination unit 415, the camera-orientation reference for swinging is updated. The swing control is always performed around the electric wire W toward its periphery, so that even if the body shakes during photographing and the photographing direction of the camera 42 shifts, video data is captured again in the reference direction and image data tracking the electric wire W can be obtained. After the video data of the reference electric wire W to be imaged and the reference camera orientation information are recorded by the object specifying means 413, the direction control means 414 changes the direction, and the object determining means 415 determines whether or not the electric wire W is included in the video output signal of the camera 42. The determination can be made by a video data processing procedure as shown in FIG. 6, for example: first, the video signal of the camera 42, input to the control means 47 via the input interface 43, is corrected in S1. Next, in S2, filter processing is performed to absorb image-quality differences due to individual camera differences and weather at the time of shooting. In S3, vertical edge detection processing is performed, and the outline of the wire W, captured in a direction crossing the camera 42, is emphasized. In S4, the edge-detected video data is subjected to a Hough transform, and in S5 it is determined whether or not there is a straight line crossing the imaging region in the horizontal direction. If there is no crossing straight line, the next video data is captured and processed; if there is a crossing straight line, it is determined that the video data shows the electric wire W, and the Y coordinate of the pixels carrying the straight line is detected in S6.)

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of NISHIDA with the teachings of SUMIYA with a reasonable expectation of success, since SUMIYA teaches that a drone can include a zoom magnification, a camera angle, a shooting start position, and a shooting end position. The lidar device can provide a distance to the target, and the rotation, zooming, starting, and ending can be provided. Additionally, extraneous objects can be removed from the finished product. This can provide an improved rotational and zooming photographic function while capturing as the drone flies and images the target in FIG. 4. See the abstract and claims 1-7.
Sumiya teaches "... 13. The control apparatus according to claim 1, wherein the processor is configured to: rotate a first distance measurement device as the distance measurement device via a first rotational drive apparatus as the rotational drive apparatus to which the first distance measurement device is attached; measure the first distance at a plurality of first distance measurement locations among the plurality of distance measurement locations via the first distance measurement device; rotate a second distance measurement device as the distance measurement device via a second rotational drive apparatus as the rotational drive apparatus to which the second distance measurement device is attached; measure the first distance at a plurality of second distance measurement locations among the plurality of distance measurement locations via the second distance measurement device; and set the flying route based on the first distance measured for each first distance measurement location and on the first distance measured for each second distance measurement location." (see claims 1-4; and while the A/C helicopter 1 flies from point (a) to point (b), the angle control means 45 is controlled by the direction control means 414, and the camera 42 is swung up and down along the vertical direction while video data of the electric wire W is captured and tracked. As shown in FIG. 5, the swing control by the direction control means 414 is based on the camera direction (Fc in the figure) at which the electric wire W is located at the center of the imaging area F of the camera 42, and is performed so that the camera 42 swings over a predetermined range above and below the electric wire W, for example an interval of about 1 m above and below. As described above, when the video data of the target electric wire W is captured by the target determination unit 415, the camera-orientation reference for swinging is updated. The swing control is always performed around the electric wire W toward its periphery, so that even if the body shakes during photographing and the photographing direction of the camera 42 shifts, video data is captured again in the reference direction and image data tracking the electric wire W can be obtained. After the video data of the reference electric wire W to be imaged and the reference camera orientation information are recorded by the object specifying means 413, the direction control means 414 changes the direction, and the object determining means 415 determines whether or not the electric wire W is included in the video output signal of the camera 42. The determination can be made by a video data processing procedure as shown in FIG. 6, for example: first, the video signal of the camera 42, input to the control means 47 via the input interface 43, is corrected in S1. Next, in S2, filter processing is performed to absorb image-quality differences due to individual camera differences and weather at the time of shooting. In S3, vertical edge detection processing is performed, and the outline of the wire W, captured in a direction crossing the camera 42, is emphasized. In S4, the edge-detected video data is subjected to a Hough transform, and in S5 it is determined whether or not there is a straight line crossing the imaging region in the horizontal direction. If there is no crossing straight line, the next video data is captured and processed; if there is a crossing straight line, it is determined that the video data shows the electric wire W, and the Y coordinate of the pixels carrying the straight line is detected in S6.)

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of NISHIDA with the teachings of SUMIYA with a reasonable expectation of success, since SUMIYA teaches that a drone can include a zoom magnification, a camera angle, a shooting start position, and a shooting end position. The lidar device can provide a distance to the target, and the rotation, zooming, starting, and ending can be provided. Additionally, extraneous objects can be removed from the finished product. This can provide an improved rotational and zooming photographic function while capturing as the drone flies and images the target in FIG. 4. See the abstract and claims 1-7.
Sumiya teaches "... 14. The control apparatus according to claim 13, wherein the processor is configured to convert the first distance measured by the second distance measurement device into a distance with reference to a position of the first distance measurement device based on predetermined first calibration information." (see claims 1-4; and while the A/C helicopter 1 flies from point (a) to point (b), the angle control means 45 is controlled by the direction control means 414, and the camera 42 is swung up and down along the vertical direction while video data of the electric wire W is captured and tracked. As shown in FIG. 5, the swing control by the direction control means 414 is based on the camera direction (Fc in the figure) at which the electric wire W is located at the center of the imaging area F of the camera 42, and is performed so that the camera 42 swings over a predetermined range above and below the electric wire W, for example an interval of about 1 m above and below. As described above, when the video data of the target electric wire W is captured by the target determination unit 415, the camera-orientation reference for swinging is updated. The swing control is always performed around the electric wire W toward its periphery, so that even if the body shakes during photographing and the photographing direction of the camera 42 shifts, video data is captured again in the reference direction and image data tracking the electric wire W can be obtained. After the video data of the reference electric wire W to be imaged and the reference camera orientation information are recorded by the object specifying means 413, the direction control means 414 changes the direction, and the object determining means 415 determines whether or not the electric wire W is included in the video output signal of the camera 42. The determination can be made by a video data processing procedure as shown in FIG. 6, for example: first, the video signal of the camera 42, input to the control means 47 via the input interface 43, is corrected in S1. Next, in S2, filter processing is performed to absorb image-quality differences due to individual camera differences and weather at the time of shooting. In S3, vertical edge detection processing is performed, and the outline of the wire W, captured in a direction crossing the camera 42, is emphasized. In S4, the edge-detected video data is subjected to a Hough transform, and in S5 it is determined whether or not there is a straight line crossing the imaging region in the horizontal direction. If there is no crossing straight line, the next video data is captured and processed; if there is a crossing straight line, it is determined that the video data shows the electric wire W, and the Y coordinate of the pixels carrying the straight line is detected in S6.)

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of NISHIDA with the teachings of SUMIYA with a reasonable expectation of success, since SUMIYA teaches that a drone can include a zoom magnification, a camera angle, a shooting start position, and a shooting end position. The lidar device can provide a distance to the target, and the rotation, zooming, starting, and ending can be provided. Additionally, extraneous objects can be removed from the finished product. This can provide an improved rotational and zooming photographic function while capturing as the drone flies and images the target in FIG. 4. See the abstract and claims 1-7.
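Editor's note: claims 14-15's "convert a measurement from the second device into the first device's frame using calibration information" is a rigid-body transform, p1 = R @ p2 + t, where (R, t) come from extrinsic calibration. A minimal NumPy sketch follows; the example calibration values are invented for illustration.

```python
# Minimal sketch: re-expressing a point measured by distance device 2
# in the frame of device 1, given extrinsic calibration (R, t).
# The calibration values below are invented for illustration.

import numpy as np

def to_device1_frame(p_dev2: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """p1 = R @ p2 + t, with (R, t) the device-2 -> device-1 calibration."""
    return R @ p_dev2 + t

theta = np.radians(30.0)                      # device 2 yawed 30 deg (assumed)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([2.0, -1.0, 0.1])                # device-2 origin in device-1 frame

p_uav_dev2 = np.array([10.0, 0.0, 5.0])       # UAV as seen by device 2
print(to_device1_frame(p_uav_dev2, R, t))     # -> approx [10.66, 4.0, 5.1]
```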
That is, first, in the figure, the video signal of the camera 42 input to the control means 47 via the input interface 43 is corrected in S1. Next, in S2, filter processing is performed to absorb image quality due to individual differences in cameras and weather differences at the time of shooting. In S3, vertical edge detection processing is performed, and the wire W taken in a direction across the camera 42 is processed. The outline is emphasized. The video data subjected to the edge detection process in S4 is subjected to Hough transform, and in S5, it is determined whether or not there is a straight line crossing in the horizontal direction in the imaging region. If there is no crossing straight line, the next video data is captured and processed, and if there is a crossing straight line, it is determined that the video data is for the electric wire W, and the Y coordinate of the pixel having the video data of the straight line is detected in S6). It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of NISHIDA with the teachings of SUMIYA with a reasonable expectation of success since SUMIYA teaches that a drone can include a zoom magnification and an angle of the camera and the shooting start position and a shooting end position. The lidar device can provide a distance to the target and the rotation and zooming and starting and ending can be provided. Additionally, addition objects can be removed from the finished product. This can provide an improved rotational and zooming photographic function while capturing as the drone flies and images the target in FIG..4. See abstract and claim 1-7. Sumiya teaches “...15. The control apparatus according to claim 14, wherein the processor is configured to convert a position of the flying object measured by the second distance measurement device into a position with reference to a position of the first distance measurement device based on predetermined second calibration information”. ( Preferably, the aerial imaging method preferably includes an autonomous control device that allows an unmanned air vehicle to fly along a predetermined route, a flight position specifying unit that specifies a position in flight from a GPS signal, an airframe, and an object to be imaged. Distance measuring means for measuring the distance to the camera, camera control means for controlling the camera angle, zoom magnification, shooting start and end, and the shooting start position, camera angle, distance from the shooting start position to the object to be shot , Means for recording shooting condition information such as camera zoom magnification and shooting end position, and actual distance measuring means for controlling the distance measuring means at the shooting start position to measure the distance between the airframe and the shooting object. The camera zoom magnification is calculated from the difference between the actual distance to the object to be photographed and the distance to the object to be photographed recorded in the photographing condition information so that the subject has a predetermined size within the camera frame. And means for updating the photographing condition to the zoom magnification, and an imaging processing control means for controlling said each means based on the imaging condition information can be carried out by aerial device provided on the body. 
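As an editorial aside: the S1-S6 procedure quoted above (correct, filter, edge-detect, Hough transform, test for a horizontally crossing line, read off the Y coordinate) maps naturally onto a few OpenCV calls. The sketch below is only an illustration of that flow; the thresholds, the Gaussian filter, and the 10-degree horizontality tolerance are assumptions, not parameters taken from SUMIYA.

```python
import cv2
import numpy as np

def detect_crossing_wire(frame_bgr: np.ndarray) -> float | None:
    """Return the approximate y coordinate of a near-horizontal line (the wire),
    or None if no line crosses the frame. Mirrors SUMIYA's S1-S6 flow."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)            # S1: capture and correct the signal
    smoothed = cv2.GaussianBlur(gray, (5, 5), 0)                  # S2: filter out camera/weather variation
    edges = cv2.Canny(smoothed, 50, 150)                          # S3: emphasize the wire's outline
    lines = cv2.HoughLines(edges, 1, np.pi / 180, threshold=200)  # S4: Hough transform
    if lines is None:
        return None                                               # S5: no crossing line; try the next frame
    for rho, theta in lines[:, 0]:
        if abs(theta - np.pi / 2) < np.deg2rad(10):               # near-horizontal in Hough terms
            return float(rho)                                     # S6: for theta ~ 90 deg, rho ~ y
    return None
```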
Sumiya teaches "...15. The control apparatus according to claim 14, wherein the processor is configured to convert a position of the flying object measured by the second distance measurement device into a position with reference to a position of the first distance measurement device based on predetermined second calibration information". (Preferably, the aerial imaging method includes an autonomous control device that allows an unmanned air vehicle to fly along a predetermined route; a flight position specifying unit that specifies the in-flight position from a GPS signal; distance measuring means for measuring the distance between the airframe and the object to be imaged; camera control means for controlling the camera angle, zoom magnification, and shooting start and end; means for recording shooting-condition information such as the shooting start position, camera angle, distance from the shooting start position to the object, camera zoom magnification, and shooting end position; and actual-distance measuring means for controlling the distance measuring means at the shooting start position to measure the distance between the airframe and the object. The camera zoom magnification is calculated from the difference between the actual distance to the object and the distance recorded in the shooting-condition information so that the subject has a predetermined size within the camera frame, the shooting condition is updated to that zoom magnification, and imaging processing control means controls each of these means based on the shooting-condition information; all of this can be carried out by an aerial device provided on the body. The aerial imaging device includes a unit for recording video data of the subject, and the recorded video data is collated with the video data output from the camera during photographing to determine whether the photographing object is included, so that only video data including the photographing object is recorded in the recording means. According to the aerial imaging method and aerial imaging apparatus using the unmanned air vehicle of the present invention, the image of the object is accurately and sufficiently enlarged within the camera frame by the camera mounted on the unmanned air vehicle and can be recorded as high-definition video data. It can therefore be used efficiently and effectively for inspection of the surfaces of high-rise structures and of disaster sites, greatly reducing the cost of inspection by conventional methods.) The combination rationale stated above for claim 13 applies equally to claim 15.
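The excerpt above describes recomputing the zoom magnification from the difference between the recorded and actual distances so the subject keeps a predetermined size in frame. Here is a minimal sketch of that update, assuming a simple proportional (pinhole-style) model; the function name and the linear scaling are my illustration, not SUMIYA's disclosed implementation.

```python
def updated_zoom(recorded_zoom: float, recorded_distance_m: float,
                 actual_distance_m: float) -> float:
    """Rescale zoom so the subject keeps its recorded size in frame.

    Apparent size scales roughly with magnification / distance, so holding
    the size constant means zoom must grow in proportion to actual distance.
    (Simplified pinhole model; a real lens needs a calibrated zoom curve.)
    """
    if actual_distance_m <= 0 or recorded_distance_m <= 0:
        raise ValueError("distances must be positive")
    return recorded_zoom * (actual_distance_m / recorded_distance_m)

# Example: shot planned at 10 m with 2x zoom, but the drone is actually 12 m out.
print(updated_zoom(2.0, 10.0, 12.0))  # -> 2.4
```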
Sumiya teaches "..16. The control apparatus according to claim 14, wherein the processor is configured to select a distance measurement device to measure a position of the flying object from the first distance measurement device and the second distance measurement device in accordance with the position of the flying object". (As shown in FIG. 2, the autonomous control device 3 includes recording means 31 to which flight plan information is input, autonomous control means 32 connected to the rotor driving means 32a, GPS signal receiving means 33, a distance measuring sensor 34, a route guidance camera 35, an altitude sensor 36, an orientation sensor 37, communication means 38, and control means 39 for controlling these parts. Based on the flight plan information recorded in the recording means 31, the driving of the rotor is controlled by the autonomous control means 32 so that the A/C helicopter 1 can fly stably and autonomously along a predetermined flight route. Specifically, the flight plan information sets the flight conditions, such as the flight route, flight speed, flight altitude, and turning locations, for the period from takeoff at the departure point, through flight within the specified range, to return to the departure point and landing; the data, a processing program for controlling the rotor to fly under the set conditions, and parameters for executing the program are set and input at the computer 21 of the ground station 2, transferred to the autonomous control device 3 via the communication means 38 before the flight, and recorded in the recording means 31. The flight route is configured by setting a large number of flight points specified by data such as latitude, longitude, altitude, and direction and inputting these data. The GPS signal receiving means 33 specifies the position of the aircraft in flight; it receives GPS signals from GPS satellites and outputs latitude and longitude data of the flight position as current position information. The distance measuring sensor 34 is attached to the front part of the A/C helicopter 1; it detects the distance from the airframe to ground structures to keep the distance from the A/C helicopter 1 constant and prevent collisions, and it also measures the distance from the airframe to the electric wire W that is the photographing object of the imaging device 4 described later. As the distance measuring sensor, for example, a stereo-vision camera or a laser distance measuring means can be used.) The combination rationale stated above for claim 13 applies equally to claim 16.
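Claims 14 and 16 together amount to (a) re-expressing a measurement from the second rangefinder in the first rangefinder's frame using calibration data, and (b) picking whichever device can actually see the flying object. The following is a hypothetical sketch under an assumed rigid-body calibration (R, t) and spherical measurement regions; none of this specific math appears in the cited references.

```python
import numpy as np

def to_first_device_frame(point_in_second: np.ndarray,
                          R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Re-express a point measured by the second device in the first device's
    coordinate frame, given extrinsic calibration (rotation R, translation t)."""
    return R @ point_in_second + t

def pick_device(position: np.ndarray,
                regions: dict[str, tuple[np.ndarray, float]]) -> str:
    """Select the rangefinder whose measurement region contains `position`.
    `regions` maps a device name to its (origin, max_range) pair."""
    for name, (origin, max_range) in regions.items():
        if np.linalg.norm(position - origin) <= max_range:
            return name
    raise LookupError("position lies outside every measurement region")
```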
Sumiya teaches "..17. The control apparatus according to claim 14, wherein the processor is configured to, in a case of setting the flying route with reference to a point positioned outside a first distance measurement region of the first distance measurement device and outside a second distance measurement region of the second distance measurement device, derive a distance between the point and the first distance measurement device based on an angle of a direction in which the point is positioned with respect to the first distance measurement device and on a distance between the first distance measurement device and the second distance measurement device". (see claims 1-4 and the same FIG. 2 passage on the autonomous control device 3, the GPS signal receiving means 33, and the distance measuring sensor 34 reproduced above for claim 16.) The combination rationale stated above for claim 13 applies equally to claim 17.
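Claims 17 and 18 derive a range to a point outside both measurement regions from a bearing angle and the known baseline between the two devices. Classic two-bearing triangulation does exactly this; the law-of-sines setup below is my illustration (the claims recite only the angle at the first device, so the second bearing is an added assumption).

```python
import math

def distance_from_first_device(baseline_m: float,
                               bearing_at_first_rad: float,
                               bearing_at_second_rad: float) -> float:
    """Law-of-sines triangulation: range from the first device to a target,
    given the bearing angles measured at each device and their baseline."""
    apex = math.pi - bearing_at_first_rad - bearing_at_second_rad  # angle at the target
    if apex <= 0:
        raise ValueError("bearings do not intersect in front of the baseline")
    return baseline_m * math.sin(bearing_at_second_rad) / math.sin(apex)

# Example: devices 5 m apart, bearings of 60 and 70 degrees toward the target.
print(distance_from_first_device(5.0, math.radians(60), math.radians(70)))  # ~6.13 m
```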
Sumiya teaches "..18. The control apparatus according to claim 17, wherein the processor is configured to, in a case where the flying object is positioned outside the first distance measurement region and outside the second distance measurement region, derive a distance between the flying object and the first distance measurement device based on an angle of a direction in which the flying object is positioned with respect to the first distance measurement device and on the distance between the first distance measurement device and the second distance measurement device". (see claims 1-4 and, again, the same FIG. 2 passage reproduced above for claim 16.) The combination rationale stated above for claim 13 applies equally to claim 18.

Claim 19 is rejected under 35 U.S.C. § 103 as being unpatentable as obvious over Japanese Patent Pub. No. JP 2019-117127 A to Nishida in view of Japanese Patent Pub. No. JP 2016-111414 A to Sowa et al. (filed in 2014), in view of International Patent Pub. No. WO 2019/023914 A1 (filed in 2017), and further in view of Choi and Nadir.

The 914 publication teaches "...19. The control apparatus according to claim 1, wherein the flying object includes a third imaging apparatus, the processor is configured to perform position correction processing of correcting a position of the flying object based on a third image obtained by imaging the inspection target object via the third imaging apparatus in a case where the flying object that has moved from a second imaging position set on the flying route has reached a third imaging position set on the flying route, and (In the embodiment of the present invention, the target flight trajectory includes, but is not limited to, the following illustrative example. For example, assume that the drone flies from left to right along the target flight path; see FIGS. 3-6. The shooting scene shown in FIG. 3a is a person standing at the right end of a straight road, and the first target flight trajectory includes a first flight trajectory and a second flight trajectory. The first flight trajectory is the portion of the target flight trajectory parallel to the target subject, that is, parallel to the road; the second flight trajectory is the curved portion of the target flight trajectory, whose curvature increases from small to large. The photographing position intervals and photographing postures determined according to the first target flight trajectory are as follows: the photographing position intervals along the first flight trajectory are equal, and the corresponding photographing posture is oriented vertically toward the target photographing object; the photographing position intervals along the second flight trajectory decrease from large to small, and the corresponding photographing posture is inclined toward the target photographing object.)
in a case of acquiring a fourth image by imaging the inspection target object via the third imaging apparatus in accordance with reaching of the flying object to the second imaging position and then acquiring a fifth image by imaging the inspection target object via the third imaging apparatus in accordance with reaching of the flying object to the third imaging position, the position correction processing is processing of correcting the position of the flying object to a position at which an overlap amount between the fourth image and the fifth image is a predetermined overlap amount based on an overlap amount between the fourth image and the third image". (see claims 1-11, where the drone can fly a panoramic first trajectory and take multiple images that are stitched or spliced together.) It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of NISHIDA with the teachings of the 914 publication with a reasonable expectation of success, since the 914 publication teaches that a drone can include a zoom magnification, a camera angle, a start position, and an end position, and can provide a panoramic video along a curved UAV flight path with the different patterns shown in FIGS. 1-5. The lidar device can provide a distance to the target, and the rotation, zooming, starting, and ending can be controlled accordingly. This can provide a special-effect panoramic image that is a video effect. See claims 1-8 and abstract.

Nishida teaches "...20. A base station comprising: the control apparatus according to claim 1; the rotational drive apparatus; and the distance measurement device" (see FIG. 1-3, which show a laser scanning sensor).
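The claim 19 limitation ties position correction to a predetermined overlap amount between consecutive images. Under a flat-target pinhole model, overlap is a simple function of camera footprint and imaging-position spacing; the sketch below is that back-of-the-envelope relation, not the 914 publication's method, and the parameter values are illustrative.

```python
import math

def footprint_width_m(distance_m: float, hfov_deg: float) -> float:
    """Ground footprint width seen by a camera with the given horizontal FOV."""
    return 2.0 * distance_m * math.tan(math.radians(hfov_deg) / 2.0)

def spacing_for_overlap(distance_m: float, hfov_deg: float,
                        target_overlap: float) -> float:
    """Imaging-position spacing that yields the desired frame-to-frame overlap."""
    if not 0.0 <= target_overlap < 1.0:
        raise ValueError("overlap must be in [0, 1)")
    return footprint_width_m(distance_m, hfov_deg) * (1.0 - target_overlap)

# Example: 10 m standoff, 60-degree HFOV, 70% overlap between consecutive frames.
print(spacing_for_overlap(10.0, 60.0, 0.70))  # ~3.46 m between imaging positions
```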
Claim 23 is rejected under 35 U.S.C. § 103 as being unpatentable as obvious over Japanese Patent Pub. No. JP 2019-117127 A to Nishida in view of Japanese Patent Pub. No. JP 2016-111414 A to Sowa et al. (filed in 2014), in view of United States Patent Application Pub. No. US 2019/0068953 A1 to Choi et al. (filed in 2017 and assigned to Aurora Flight Sciences), in view of Japanese Patent Pub. No. JP 2006-115137 A to Akihide, and further in view of Nadir.

Akihide teaches "...The control apparatus of claim 1, wherein the processor is configured to derive distances between the inspection target and the first imaging apparatus and to perform control for maintaining a constant pixel resolution of the first imaging apparatus by adjusting a zoom magnification of the first imaging apparatus to a zoom magnification at which the pixel resolution of the first imaging apparatus becomes a predetermined reference value based on the distance..". (see abstract and claims 1-5, where the optical path of the image can be shortened to provide a zoomed, magnified image using mirror drives 21 and 22 to change the optical path from the lens to the image sensor, without changing the number of pixels. According to this configuration, the mirror driving units 21 and 22 change the magnification of the image formed on the image sensor 17 by changing the optical path length from the document to the lens 16. For example, when a small-size original with a narrow reading range, such as a photograph or film, is to be read at high density, the number of reflections among the mirrors 12, 13, 14, and 15 is reduced and predetermined mirrors 12 and 13 are rotated so that the light is guided to the lens 16 without passing through the mirror 14, shortening the optical path length. Because the image formed on the image sensor 17 is thereby enlarged, a sufficiently fine image can be obtained as needed without increasing the number of read pixels. That is, when reading an image with relatively high definition, the optical path length from the document K to the lens 16 is shortened to provide an optimum reduction optical system, so relatively high image sharpness (for example, MTF) and sufficiently high output can be obtained, yielding a high-definition, high-quality image. Since the number of read pixels is not increased, the amount of data to be processed and the necessary storage capacity are kept down, and an expensive CCD and a high-resolution lens are not required, which reduces cost and does not hinder downsizing.) It would have been obvious for one of ordinary skill in the art to combine NISHIDA with the teachings of AKIHIDE with a reasonable expectation of success, since AKIHIDE teaches that a high-definition, zoomed image can be provided at lower cost while maintaining the number of pixels by changing only the optical path. See abstract.

It would have been obvious for one of ordinary skill in the art to combine the disclosure of NISHIDA with the teachings of CHOI with a reasonable expectation of success to provide a drone that can fly while maintaining a distance from a second drone 104. The drone can include a sensor payload 226 as in FIG. 2d, which can include multiple cameras for capturing video, still images, and LIDAR images. The cameras can provide a pixel resolution of 2448 by 2048 at 73 frames per second, or a pixel resolution of 800 by 600 at 199 frames per second, on the target as the drone moves. The device can include a camera and light source as in FIG. 3a and provide the stated resolution at the different distances within the triangle. See paragraphs 80-90.
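The constant-pixel-resolution control recited throughout these rejections reduces, in a pinhole model, to scaling focal length (zoom) with distance so that ground sample distance stays at the reference value. A hedged sketch with illustrative parameter values (the GSD formula is standard photogrammetry; nothing here comes from the cited art):

```python
def focal_length_for_constant_gsd(distance_m: float,
                                  pixel_pitch_um: float,
                                  target_gsd_mm: float) -> float:
    """Focal length (mm) that holds ground sample distance at a reference value.

    Pinhole model: GSD = distance * pixel_pitch / focal_length, so as the
    drone-to-target distance changes, focal length (i.e., zoom) must scale
    with it to keep the mm-per-pixel resolution constant.
    """
    pixel_pitch_mm = pixel_pitch_um / 1000.0
    return (distance_m * 1000.0) * pixel_pitch_mm / target_gsd_mm

# Example: 3.45 um pixels, 2 mm-per-pixel target resolution.
print(focal_length_for_constant_gsd(20.0, 3.45, 2.0))  # ~34.5 mm at 20 m
print(focal_length_for_constant_gsd(30.0, 3.45, 2.0))  # ~51.8 mm at 30 m
```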
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEAN PAUL CASS, whose telephone number is (571) 270-1934. The examiner can normally be reached Monday to Friday, 7 am to 7 pm, and Saturday, 10 am to 12 noon.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Scott A. Browne, can be reached at 571-270-0151. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JEAN PAUL CASS/ Primary Examiner, Art Unit 3666

Prosecution Timeline

Dec 10, 2023
Application Filed
Jun 12, 2025
Non-Final Rejection — §103
Jul 17, 2025
Interview Requested
Jul 31, 2025
Applicant Interview (Telephonic)
Jul 31, 2025
Examiner Interview Summary
Sep 10, 2025
Response Filed
Nov 26, 2025
Final Rejection — §103
Feb 03, 2026
Examiner Interview Summary
Feb 03, 2026
Applicant Interview (Telephonic)
Feb 18, 2026
Response after Non-Final Action
Feb 26, 2026
Request for Continued Examination
Mar 13, 2026
Response after Non-Final Action
Mar 17, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593752
SYSTEM AND METHOD FOR CONTROLLING HARVESTING IMPLEMENT OPERATION OF AN AGRICULTURAL HARVESTER BASED ON TILT ACTUATOR FORCE
2y 5m to grant Granted Apr 07, 2026
Patent 12596986
GLOBAL ADDRESS SYSTEM AND METHOD
2y 5m to grant Granted Apr 07, 2026
Patent 12590801
REAL TIME DETERMINATION OF PEDESTRIAN DIRECTION OF TRAVEL
2y 5m to grant Granted Mar 31, 2026
Patent 12583572
MARINE VESSEL AND MARINE VESSEL PROPULSION CONTROL SYSTEM
2y 5m to grant Granted Mar 24, 2026
Patent 12571183
EXCAVATOR
2y 5m to grant Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

3-4
Expected OA Rounds
73%
Grant Probability
99%
With Interview (+25.9%)
3y 1m
Median Time to Grant
High
PTA Risk
Based on 984 resolved cases by this examiner. Grant probability derived from career allow rate.
