Prosecution Insights
Last updated: April 19, 2026
Application No. 18/203,684

AIRBORNE INSPECTION METROLOGY

Status: Non-Final Office Action (§103)
Filed: May 31, 2023
Examiner: CASS, JEAN PAUL
Art Unit: 3666
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Caterpillar Inc.
OA Round: 3 (Non-Final)

Grant Probability: 73% (Favorable)
Estimated OA Rounds: 3-4
Estimated Time to Grant: 3y 1m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 73% (above average; 719 granted / 984 resolved; +21.1% vs TC avg)
Interview Lift: +25.9% (strong; resolved cases with interview vs without)
Typical Timeline: 3y 1m average prosecution; 83 applications currently pending
Career History: 1,067 total applications across all art units

Statute-Specific Performance

§101: 10.5% (-29.5% vs TC avg)
§103: 56.8% (+16.8% vs TC avg)
§102: 12.6% (-27.4% vs TC avg)
§112: 12.8% (-27.2% vs TC avg)

Based on career data from 984 resolved cases; Tech Center averages are estimates.
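The dashboard figures above reduce to simple arithmetic over the examiner's career counts. A minimal sketch of that arithmetic follows; the function names and the Tech Center average of 52% are illustrative assumptions, not values from the source.

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def delta_vs_tc(rate: float, tc_average: float) -> float:
    """Signed difference between the examiner's rate and the TC average."""
    return rate - tc_average

rate = allow_rate(719, 984)      # ~73.1%, consistent with the 73% tile
delta = delta_vs_tc(rate, 52.0)  # ~+21.1 under the assumed 52% TC average
```

Working backward the same way, the 73% allow rate and the reported +21.1% delta imply a Tech Center average near 52%.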

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Applicant's Arguments

The previous rejection is withdrawn. Applicant's amendments are entered, and Applicant's remarks are also entered into the record. A new search, necessitated by Applicant's amendments, was conducted, and a new reference was found. A new rejection is made herein. Applicant's arguments are now moot in view of the new rejection of the claims.

[Figure: media_image1.png, grayscale]

Claim 1 and the independent claims are amended to recite, and Hunan teaches, "...1. (Currently Amended) An airborne coordinate measuring machine (CMM) comprising: a noncontact 3D scanner constructed to obtain measurement data regarding a surface of a component of a work machine under scrutiny; a drone aircraft mechanically coupled to the 3D scanner and constructed to (see Fig. 5, where the drone has a number of sensors, including a binocular camera device and a laser projection device for emitting a laser to form a laser mark on the wind turbine; Step S103, generating an inspection flight path of the UAV according to the translation mapping of the inspection path, wherein the UAV is equipped with a binocular camera device for photographing the wind turbine and a laser projection device for emitting a laser to form a laser mark on the wind turbine; Step S104, before the UAV performs an inspection flight, using the center of the wind turbine hub to locate the initial position and collect an initial image; Step S105, identifying the center of the wind turbine hub in the initial image and determining the coordinate vector Q0 of the optical center of the binocular camera device of the drone in the first space coordinate system at the initial position) traverse a previously determined component-specific flight path, which is specific to the surface of the component under scrutiny, to inspect the surface of the component under scrutiny against a specification therefor that defines physical dimensions of the surface of the component under scrutiny; and circuitry configured to (Step S107, determining the coordinate vector Pg of the laser mark relative to the optical center of the binocular camera device in the initial image according to the coordinate Q0 of the optical center of the binocular camera device of the drone in the initial image in the first space coordinate system and the coordinate vector G0 of the laser mark relative to the optical center of the binocular camera device, wherein Pg = G0 - (-Q0); Step S108, obtaining L inspection images captured by the drone at a set frequency during the inspection flight of the wind turbine blade to be inspected, determining the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system when shooting the qth inspection image, and determining the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device in the qth inspection image; Step S109, performing error determination on the inspection image according to the coordinate vectors Qq and Gq, and determining whether to re-shoot the inspection image according to the determination result; Step S110, determining whether the wind turbine blade is damaged according to the inspection image and, if so, determining the damage location according to the inspection image corresponding to the damage. In some technical solutions, step S109 includes: Step S109a, calculating and deriving a marking point Pq according to the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system, Pq = Qq - Q0; Step S109b, calculating the measurement mark point Dq according to the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device, Dq = Qq - (Pg - Gq); Step S109c, calculating the error level R according to the derived marking point Pq and the measured marking point Dq; Step S109d, when the error level R exceeds the threshold F, the inspection image is retaken. In some technical solutions, step S110 specifically includes: performing image processing on the inspection image, and extracting color features, shape features, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage.)
based on a selection of the component of the work machine to scrutinize, perform an inspection process according to the previously determined component-specific flight path to inspect the surface of the component under scrutiny, wherein the inspection process includes: determining whether the obtained measurement data regarding the surface of the object component under scrutiny[[,]] is within tolerances relative to the physical dimensions of the surface defined by the specification, (see claims 1-4, where the wind turbine is inspected as having a smooth surface or, alternatively, damage, including bulging, corrosion, cracking, surface cracking, and blisters, followed by a call for replacement of the item; referring to FIG. 1, the embodiment of the present application proposes a method for repairing wind turbine blades assisted by a drone, which includes steps S101 to S110; the method can efficiently locate damage so that maintenance personnel can accurately and quickly go to the damaged location for maintenance, thereby improving work efficiency and reducing the time operators spend working at height) and determining whether the component of the work machine is to be repaired or replaced based on a result of said determining whether the measurement data is within the tolerances relative to the physical dimensions of the surface of the component defined by the specification, and update a maintenance log (the determination module 209 is used to perform error determination on the inspection image according to the coordinate vectors Qq and Gq and to determine whether it is necessary to re-shoot the inspection image according to the determination result; the damage determination module 210 is used to determine whether the wind turbine blade is damaged according to the inspection image and, if so, to determine the damage location according to the inspection image corresponding to the damage; referring to FIG. 7, in some implementations the determination module 209 includes: the first calculation submodule 209a, used to calculate and derive the marking point Pq according to the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system, where Pq = Qq - Q0; the second calculation submodule 209b, used to calculate the measurement mark point Dq according to the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device, Dq = Qq - (Pg - Gq); the third calculation submodule 209c, used to calculate the error level R according to the derived marking point Pq and the measured marking point Dq; and the reshoot submodule 209d, used to reshoot the inspection image when the error level R exceeds the threshold F; in some embodiments, the damage determination module is specifically used to perform image processing on the inspection image and extract color features, shape features, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage) indicating whether the component of the work machine under scrutiny is within the tolerances relative to the physical dimensions of the surface defined by the specification". (see claims 1-4 of Hunan, discussed above.) It would have been obvious to one of ordinary skill in the art at the time the invention was made to combine the teachings of HUNAN with the disclosure of the primary reference, since HUNAN teaches that there can be a first drone that has cameras and a laser scanner. The drone can inspect the wind turbine blades to detect whether there is bulging or cracking such that the blade requires maintenance or replacement. This can provide constant inspection of the blades for early damage control, and damage positioning can be performed after the inspection. See the abstract and claims 1-10.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-8 and 21 are rejected under 35 U.S.C. § 103 as being unpatentable as obvious in view of United States Patent Application Pub. No.: US20170001723A1 to Tanahasi, which was filed in 2015, and in view of United States Patent Application Pub.
No.: US 20200180791A1 to Kimberly, assigned to BOEING™, and in view of Chinese Patent Pub. No.: CN114639025B to Hunan, which was filed in 2022. Hunan's teachings with respect to amended claim 1, and the rationale for combining them with the primary reference, are set forth in the response to Applicant's arguments above.

[Figure: media_image2.png, grayscale]

Tanahasi discloses "... 1. An airborne coordinate measuring machine (CMM) comprising: (see the drone in paragraphs 1-5 and 19-21, which can fly about the construction area, vehicles, and excavator in a pattern that is shown for scanning the construction site and vehicles)

[Figure: media_image3.png, grayscale]

a noncontact 3D scanner constructed to obtain measurement data ..... (see paragraphs 37-41, where the drone has a LIDAR sensor, a camera, and GPS detection to detect the parameters of the vehicle and the construction site)

[Figure: media_image4.png, grayscale]

a drone aircraft mechanically coupled to the 3D scanner and constructed to traverse ..... (see paragraphs 100-105, where the drone in FIG. 8 moves in a back-and-forth pattern around each object under scrutiny to scan the construction site)

[Figure: media_image5.png, grayscale]
[Figure: media_image6.png, grayscale]

Claim 1 is amended to recite, and the primary reference TANAHASI is silent, but KIMBERLY teaches "....... against a specification therefor, and (see paragraph 25, where the inspection drone can determine whether there is 1. excessive sound, vibration, force, or temperature, generating data indicating a problem, 2. service levels, and 3. condition monitoring for proper levels of tire pressure, O2, engine oil, and power, and where health data can be compared to determine whether the aircraft is healthy or unhealthy; see paragraph 37, where the signature can be compared to a baseline specification to determine whether there is a problem) circuitry configured to determine ...
(see paragraphs 25-37 and blocks 102-114, where the system can indicate that a drone failure is likely with high probability, a second inspection drone is dispatched in block 110, and the second inspection drone can inspect the first drone in blocks 112-114) (see FIG. 1, where the inspection drone is dispatched to inspect the first drone; see paragraphs 29-31) (see paragraphs 27-28, where the inspection drone is provided to inspect the eddy current of the other drone: "In accordance with a further alternative method of inspection, the I-UAV 20 may fly in tandem with the U-UAV 10 at a separation distance. If the U-UAV 10 is on the ground, then the I-UAV 20 may move to a location in proximity to the U-UAV 10 and land at that location or land on the U-UAV 10. In accordance with a further alternative inspection method, the I-UAV 20 may fly around the U-UAV 10 while the U-UAV 10 is on the ground. Another ground variant would be when the U-UAV 10 is taxiing and the I-UAV 20 is orbiting around it to capture manifestations of faults that may be exhibited during that flight mode. The distance (if any) separating the I-UAV 20 from the U-UAV 10 during an inspection will depend on the inspection method being used. The I-UAV 20 may be contained in the U-UAV 10 or in a maintenance facility when not in inspection mode.") the obtained measurement data, (see paragraphs 37-41 and 54-55, where the first UAV is positioned to the side of a rotor of the second UAV to listen to the rotor, reducing the failure probability of the scan, whereas placement in a second, noisy location would cause the scan to fail and the test would have to be done a second time) (see paragraphs 10-24, where the inspection UAV can inspect a second UAV, a utility UAV having multiple rotors; see paragraph 25, where during flight the UAV can inspect airplane health management, including the engine oil level, the fuel efficiency of the engine, whether the engine is emitting and at what emission levels, and whether the engine is on or off) is within tolerances relative .... update a maintenance log ...... discrepancy of the surface rendering of the inspected object is within tolerances relative ..... specification". It would have been obvious to one of ordinary skill in the art at the time the invention was made to combine the teachings of KIMBERLY with the disclosure of the primary reference, since KIMBERLY of BOEING™ teaches a utility drone (U-UAV 10) and an inspection drone (I-UAV 20). A failure model can indicate, based on the collected data, that the utility drone has a predicted rotor failure, engine failure, or some other critical issue. See blocks 102, 104, 108, and 110. The inspection drone is then dispatched; see blocks 110-114. The I-UAV 20 may inspect the U-UAV 10 while flying in tandem with it, while it is on the ground, or while it is taxiing, as quoted above. The inspection drone can measure one or more parameters of the first drone using sensors (radar, LIDAR, sonar, x-ray, vibration, NDI sensors) and then recommend a course of conduct for repair. For example, the drone can measure eddy currents or irregular noise, prompting repair or replacement of the engine, motor, or sensors. This can provide a critical automated inspection process. See paragraphs 20-32 and claims 15-20 of Kimberly.

Tanahasi discloses "... 2. The airborne CMM of claim 1, wherein the drone aircraft comprises: propellor motors constructed to drive respective propellors in flight according to propellor drive signals provided thereto; (see FIG. 1, where the drone has propellors for vertical flight and takeoff and for flying over the site in a desired pattern) a flight controller communicatively coupled to the propellor motors and constructed to generate the propellor drive signals according to flight path data; and (see element 42) flight path memory circuitry communicatively coupled to the flight controller and constructed to store the flight path data that, when executed by the flight controller, compels the drone aircraft to traverse the flight path that is specific to the object under scrutiny" (see elements 11 and 13, where the drone is controlled to move in a zig-zag pattern over the construction machines and the site to capture the LIDAR and camera data) (see paragraphs 100-105, where the drone in FIG.
8 moves in a back and forth pattern around each object under scrutiny to scan the construction site) PNG media_image1.png 636 824 media_image1.png Greyscale Claim 2 is amended to recite and Hunan teaches “..to traverse the previously determined component specific path that is specific to the component of the work machine under scrutiny”. (see Fig. 5 where the drone has a number of sensors including 1. A binocular camera device and also a laser projection device for emitting a laser to form a water mark on the wind turbine; Step S103, generating an inspection flight path of the UAV according to the translation mapping of the inspection path; wherein the UAV is equipped with a binocular camera device for photographing the wind turbine and a laser projection device for emitting laser to form a laser mark on the wind turbine; Step S104, before the UAV performs an inspection flight, the center of the wind turbine hub is used to locate the initial position and collect an initial image; Step S105, identifying the center of the wind turbine hub in the initial image, and determining the coordinate vector Q 0 of the optical center of the binocular camera device of the drone in the first space coordinate system at the initial position;) (Step S107, determining the coordinate vector P g of the laser mark relative to the optical center of the binocular camera device in the initial image according to the coordinate Q 0 of the optical center of the binocular camera device of the drone in the initial image in the first space coordinate system and the coordinate vector G 0 of the laser mark relative to the optical center of the binocular camera device; wherein P g =G 0 -(-Q 0 ); Step S108, obtaining L inspection images captured by the drone at a set frequency during the inspection flight of the wind turbine blade to be inspected, and determining the coordinate vector Q q of the optical center of the binocular camera device of the drone in the first space coordinate system when 
shooting the q-th inspection image, and determining the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device in the q-th inspection image; Step S109, performing error determination on the inspection image according to the coordinate vector Qq and the coordinate vector Gq, and determining whether to re-shoot the inspection image according to the determination result; Step S110, determining whether the wind turbine blade is damaged according to the inspection image and, if so, determining the damage location according to the inspection image corresponding to the damage. In some technical solutions, step S109 includes: Step S109a, calculating the derived marking point Pq according to the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system, Pq = Qq - Q0; Step S109b, calculating the measured marking point Dq according to the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device, Dq = Qq - (Pg - Gq); Step S109c, calculating the error level R according to the derived marking point Pq and the measured marking point Dq; Step S109d, when the error level R exceeds the threshold F, retaking the inspection image. In some technical solutions, step S110 specifically includes: performing image processing on the inspection image and extracting color, shape, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage.) (see claims 1-4, where the wind turbine is inspected as having a smooth surface or, alternatively, damage including bulging, corrosion, cracking, surface cracking, and blisters, with replacement of the item then called for. Referring to FIG. 1, the embodiment of the present application proposes a method for repairing wind turbine blades assisted by a drone, which includes steps S101 to S110. The method can efficiently locate damage so that maintenance personnel can accurately and quickly go to the damaged location for maintenance, thereby improving work efficiency and reducing the time operators spend working at height. The following is a detailed description of each step in conjunction with the accompanying drawings.) (The determination module 209 is used to perform error determination on the inspection image according to the coordinate vector Qq and the coordinate vector Gq, and to determine whether it is necessary to re-shoot the inspection image according to the determination result. The damage determination module 210 is used to determine whether the wind turbine blade is damaged according to the inspection image and, if so, to determine the damage location according to the inspection image corresponding to the damage. Referring to FIG. 7, in some implementations, the determination module 209 includes: the first calculation submodule 209a, used to calculate the derived marking point Pq according to the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system, where Pq = Qq - Q0; the second calculation submodule 209b, used to calculate the measured marking point Dq according to the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device, Dq = Qq - (Pg - Gq); the third calculation submodule 209c, used to calculate the error level R according to the derived marking point Pq and the measured marking point Dq; and the reshoot submodule 209d, used to reshoot the inspection image when the error level R exceeds the threshold F. In some embodiments, the damage determination module is specifically used to perform image processing on the inspection image and extract color, shape, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage.) It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of HUNAN with the disclosure of the primary reference, since HUNAN teaches a drone with cameras and a laser scanner that can inspect wind turbine blades to detect whether a blade exhibits bulging or cracking and therefore requires maintenance or replacement. This provides constant inspection of the blades for early damage control, and damage positioning can be performed after the inspection. See abstract and claims 1-10.
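The error determination of Steps S109a-S109d can be sketched in code. This is a minimal illustration, not Hunan's implementation: the function name and argument conventions are hypothetical, and because the record above does not reproduce the formula for the error level R (it appeared as an image in the original), a Euclidean distance between the derived and measured marking points is used here purely as a placeholder.

```python
import numpy as np

def check_inspection_image(Q0, Qq, Pg, Gq, F):
    """Sketch of Hunan Steps S109a-S109d.

    Q0: optical-center coordinate vector at the initial position
    Qq: optical-center coordinate vector when shooting the q-th image
    Pg: laser-mark vector relative to the optical center (initial image)
    Gq: laser-mark vector relative to the optical center (q-th image)
    F:  retake threshold
    """
    Q0, Qq, Pg, Gq = map(np.asarray, (Q0, Qq, Pg, Gq))
    Pq = Qq - Q0          # S109a: derived marking point
    Dq = Qq - (Pg - Gq)   # S109b: measured marking point
    # S109c: the actual formula for R is not in the record;
    # Euclidean distance is an illustrative placeholder only.
    R = float(np.linalg.norm(Pq - Dq))
    retake = R > F        # S109d: retake the image if R exceeds F
    return R, retake
```

When the derived and measured marking points coincide, R is zero and the inspection image is kept; a large discrepancy triggers a reshoot, which is consistent with submodules 209a-209d described above.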
Tanahasi discloses "... 3. The airborne CMM of claim 2, wherein the drone aircraft further comprises a drone communications component constructed to accept the flight path data from a data source". (see paragraphs 112-119, where the drone can receive data from, and provide data to, the remote station, which is a communication station)

Tanahasi discloses "... 4. The airborne CMM of claim 1, wherein the 3D scanner comprises: a laser array constructed to irradiate a surface region on the object under scrutiny; (see the shape detection sensor) a detector constructed to accept laser light reflected from the irradiated surface region; and (see paragraphs 37-44) a reference processor constructed to associate the measurement data determined from the reflected laser light with coordinates of a local reference frame". (see paragraph 44, where the camera, lidar, and location data are all associated using the GPS device) Claim 4 is amended to recite, and Hunan teaches, "...a laser array constructed to irradiate a region on the surface of the component of the work machine under scrutiny". (see the Hunan passages quoted above in the rejection of claim 2: Fig. 5, where the drone carries a binocular camera device and a laser projection device for emitting a laser to form a laser mark on the wind turbine; Steps S103-S110; determination modules 209-210; and claims 1-4, where bulging, corrosion, cracking, surface cracking, and blisters call for replacement of the item) It would have been obvious to combine the teachings of HUNAN with the disclosure of the primary reference for the same reasons stated above in the rejection of claim 2.

Tanahasi discloses "... 5. The airborne CMM of claim 4, wherein the 3D scanner further comprises a scanner communications component constructed to convey the coordinates of the local reference frame as a point cloud to a data processor". (see paragraphs 20-29, where the data is stored in a server, and paragraph 44, where the camera, lidar, and location data are all associated using the GPS device)

Tanahasi discloses "... 6. The airborne CMM of claim 1, wherein the 3D scanner is a 3D scanner.
(see paragraphs 37-44 and 128, and paragraph 44, where the camera, lidar, and location data are all associated using the GPS device)" Claim 6 is amended to recite; the primary reference to TANAHASI is silent, but KIMBERLY teaches, "...surface rendering is rendered on a translated point cloud as the drone aircraft, having the 3D scanner mechanically coupled thereto, traverses the flight path that is specific to the object under scrutiny". (see paragraph 25, where the inspection drone can determine whether there is 1. excessive sound, vibration, force, or temperature, generating data that there is a problem; 2. service levels; and 3. condition monitoring for proper levels of tire pressure, O2, engine oil, and power, with health data compared to determine whether the aircraft is healthy or unhealthy; see paragraph 37, where the signature can be compared to a baseline specification to determine whether there is a problem) (see paragraphs 25-37 and blocks 102-114, where the system can indicate that a drone has a failure that is likely with high probability, a second inspection drone is dispatched in block 110, and the second inspection drone inspects the first drone in blocks 112-114) (see FIG. 1, where the inspection drone is dispatched to inspect the first drone; see paragraphs 29-31) (see paragraphs 27-28, where the inspection drone is provided to inspect the eddy current of the other drone: "In accordance with a further alternative method of inspection, the I-UAV 20 may fly in tandem with the U-UAV 10 at a separation distance. If the U-UAV 10 is on the ground, then the I-UAV 20 may move to a location in proximity to the U-UAV 10 and land at that location or land on the U-UAV 10. In accordance with a further alternative inspection method, the I-UAV 20 may fly around the U-UAV 10 while the U-UAV 10 is on the ground. Another ground variant would be when the U-UAV 10 is taxiing and the I-UAV 20 is orbiting around it to capture manifestations of faults that may be exhibited during that flight mode. The distance (if any) separating the I-UAV 20 from the U-UAV 10 during an inspection will depend on the inspection method being used. The I-UAV 20 may be contained in the U-UAV 10 or in a maintenance facility when not in inspection mode.") (see paragraphs 37-41 and 54-55, where the first UAV is positioned at the side of a rotor of the second UAV to listen to the rotor, reducing the failure probability of the scanning; were it placed in a second, noisy location, the scanning would fail and the test would have to be done a second time) (see paragraphs 10-24, where the inspection UAV can inspect a second UAV, a utility UAV having multiple rotors; see paragraph 25, where during flight the UAV can perform airplane health management, including inspection of the engine oil level, the fuel efficiency of the engine, whether the engine is emitting and at what emission levels, and whether the engine is on or off)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of KIMBERLY with the disclosure of the primary reference, since KIMBERLY of BOEING™ teaches a utility drone (U-UAV 10) and an inspection drone (I-UAV 20). A failure model can indicate, based on collected data, that the utility drone is having an issue, such as a predicted rotor failure, engine failure, or other critical issue. See blocks 102, 104, 108, and 110. The inspection drone can then be dispatched. See blocks 110-114. The I-UAV 20 may fly in tandem with the U-UAV 10 at a separation distance, move to a location in proximity to a grounded U-UAV 10 and land there or on the U-UAV 10, fly around the U-UAV 10 while it is on the ground, or orbit the U-UAV 10 while it is taxiing to capture manifestations of faults exhibited during that flight mode. The inspection drone can measure one or more parameters of the utility drone using sensors (radar, LIDAR, sonar, x-ray, vibration, NDI sensors) and then recommend a course of conduct for repair. For example, the drone can measure eddy currents or irregular noise to prompt repair or replacement of the engine, motor, or sensors. This provides a critical automated inspection process. See paragraphs 20-32 and claims 15-20 of Kimberly.

Claim 6 is further amended to recite, and Hunan teaches, "...the 3D scanner traverses the previously determined component-specific path that is specific to the component of the work machine under scrutiny". (see the Hunan passages quoted above in the rejection of claim 2: Fig. 5, where the drone carries a binocular camera device and a laser projection device; Steps S103-S110; determination modules 209-210; and claims 1-4) It would have been obvious to combine the teachings of HUNAN with the disclosure of the primary reference for the same reasons stated above in the rejection of claim 2.

Tanahasi discloses "... 7.
An airborne inspection metrology system constructed to inspect an object, the airborne inspection metrology system comprising: (see the drone in paragraphs 1-5 and 19-21, which can fly about the construction area, vehicles, and excavator in the pattern shown for scanning the construction site and vehicles) an airborne coordinate measuring machine (CMM) comprising: (see paragraphs 37-41, where the drone has a LIDAR sensor, camera, and GPS detection to detect the parameters of the vehicle and the construction site) a noncontact 3D scanner constructed to obtain measurement data of an object under scrutiny; and (see paragraphs 37-41, where the drone has a LIDAR sensor, camera, and GPS detection to detect the parameters of the vehicle and the construction site) a drone aircraft mechanically coupled to the 3D scanner and constructed to traverse a flight path that is specific to the object under scrutiny; and (see FIG. 1, where the drone has a propeller for vertical takeoff and flight and flies over the site in a desired pattern) an information system communicatively coupled to the 3D scanner and the drone aircraft and constructed to accept the measurement data from the 3D scanner and to convey flight path data to the drone aircraft that defines the flight path. (see elements 11 and 13, where the drone is controlled to move in a zigzag pattern over the construction machines and the site to capture the lidar and camera data) (see paragraphs 100-105, where the drone in FIG.
8 moves in a back and forth pattern around each object under scrutiny to scan the construction site) Claim 7 is amended to recite; the primary reference to TANAHASI is silent, but KIMBERLY teaches, "...wherein the information system includes an inspection processor constructed to determine whether the measurement data are within specifications defined according to a component model of the object under scrutiny, and wherein the inspection processor determines whether the measurement data, as assigned with translated coordinates, are within the specifications defined according to the component model". (see the Kimberly passages quoted above in the rejection of claim 6: paragraph 25, comparing health data against proper levels; paragraph 37, comparing the signature to a baseline specification; paragraphs 25-37 and blocks 102-114, dispatching a second inspection drone to inspect the first drone; FIG. 1 and paragraphs 29-31; paragraphs 27-28 on the I-UAV 20 and U-UAV 10 inspection modes; paragraphs 37-41 and 54-55; and paragraphs 10-24) It would have been obvious to combine the teachings of KIMBERLY with the disclosure of the primary reference for the same reasons stated above in the rejection of claim 6.

Claim 7 is further amended to recite, and Hunan teaches, "...measurement data regarding a surface of a component of a work machine under scrutiny; a drone aircraft mechanically coupled to the 3D scanner and constructed to" (see Fig. 5, where the drone has a number of sensors, including
A binocular camera device and also a laser projection device for emitting a laser to form a laser mark on the wind turbine; Step S103, generating an inspection flight path of the UAV according to the translation mapping of the inspection path; wherein the UAV is equipped with a binocular camera device for photographing the wind turbine and a laser projection device for emitting laser to form a laser mark on the wind turbine; Step S104, before the UAV performs an inspection flight, the center of the wind turbine hub is used to locate the initial position and collect an initial image; Step S105, identifying the center of the wind turbine hub in the initial image, and determining the coordinate vector Q0 of the optical center of the binocular camera device of the drone in the first space coordinate system at the initial position;) traverse a previously determined component-specific flight path, which is specific to the surface of the component under scrutiny, to inspect the surface of the component under scrutiny against a specification therefor that defines physical dimensions of the surface of the component under scrutiny; and circuitry configured to (Step S107, determining the coordinate vector Pg of the laser mark relative to the optical center of the binocular camera device in the initial image according to the coordinate Q0 of the optical center of the binocular camera device of the drone in the initial image in the first space coordinate system and the coordinate vector G0 of the laser mark relative to the optical center of the binocular camera device; wherein Pg = G0 - (-Q0); Step S108, obtaining L inspection images captured by the drone at a set frequency during the inspection flight of the wind turbine blade to be inspected, and determining the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system when shooting the qth inspection image, and determining the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device in the qth inspection image; Step S109, performing error determination on the inspection image according to the coordinate vector Qq and the coordinate vector Gq, and determining whether to re-shoot the inspection image according to the determination result; Step S110, determining whether the wind turbine blade is damaged according to the inspection image, and if so, determining the damage location according to the inspection image corresponding to the damage. In some technical solutions, step S109 includes: Step S109a, calculating and deriving a marking point Pq according to the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system, Pq = Qq - Q0; Step S109b, calculating the measurement mark point Dq according to the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device, Dq = Qq - (Pg - Gq); Step S109c, calculating the error level R according to the derived marking point Pq and the measured marking point Dq; the calculation formula of the error level R is: Step S109d: when the error level R exceeds the threshold F, the inspection image is retaken. In some technical solutions, step S110 specifically includes: performing image processing on the inspection image, and extracting color features, shape features, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage.)
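The error-determination steps S109a-S109d quoted above can be sketched as follows. This is a minimal illustration only: the record reproduces the formulas Pq = Qq - Q0 and Dq = Qq - (Pg - Gq), but the formula for the error level R is not reproduced in the record, so a Euclidean distance between Pq and Dq is assumed here purely as a placeholder.

```python
import math

def error_determination(Q0, Pg, Qq, Gq, F):
    """Sketch of Hunan's steps S109a-S109d, with 3D vectors as tuples.

    Q0: optical-center coordinate vector at the initial position
    Pg: laser-mark vector derived at the initial position (step S107)
    Qq: optical-center coordinate vector for the q-th inspection image
    Gq: laser-mark vector relative to the optical center in the q-th image
    F:  error threshold; exceeding it triggers a re-shoot
    """
    sub = lambda a, b: tuple(x - y for x, y in zip(a, b))

    # S109a: derived marking point Pq = Qq - Q0
    Pq = sub(Qq, Q0)
    # S109b: measured marking point Dq = Qq - (Pg - Gq)
    Dq = sub(Qq, sub(Pg, Gq))
    # S109c: error level R -- the record's formula was not reproduced;
    # Euclidean distance is ASSUMED here as an illustrative stand-in.
    R = math.dist(Pq, Dq)
    # S109d: re-shoot the inspection image when R exceeds threshold F
    return R, R > F
```

The input vectors in any call are hypothetical; the sketch only shows how the recited quantities relate.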
based on a selection of the component of the work machine to scrutinize, perform an inspection process according to the previously determined component-specific flight path to inspect the surface of the component under scrutiny, wherein the inspection process includes: determining whether the obtained measurement data regarding the surface of the object component under scrutiny[[,]] is within tolerances relative to the physical dimensions of the surface defined by the specification, (see claims 1-4 where the wind turbine is inspected as having either a smooth surface or, alternatively, damage that includes 1. bulging, corrosion, cracking, surface cracking, and blisters, then calling for replacement of the item. Referring to FIG. 1, the embodiment of the present application proposes a method for repairing wind turbine blades assisted by a drone, which includes steps S101 to S110. The method can efficiently locate damage, so that maintenance personnel can accurately and quickly go to the damaged location for maintenance, thereby improving work efficiency and reducing the time for operators to work at height. The following is a detailed description of each step in conjunction with the accompanying drawings.) and determining whether the component of the work machine is to be repaired or replaced based on a result of said determining whether the measurement data is within the tolerances relative to the physical dimensions of the surface of the component defined by the specification, and update a maintenance log (The determination module 209 is used to perform error determination on the inspection image according to the coordinate vector Qq and the coordinate vector Gq, and determine whether it is necessary to re-shoot the inspection image according to the determination result.
The damage determination module 210 is used to determine whether the wind turbine blade is damaged according to the inspection image, and if so, determine the damage location according to the inspection image corresponding to the damage. Referring to FIG. 7, in some implementations, the determination module 209 includes: The first calculation submodule 209a is used to calculate and derive the marking point Pq according to the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system, where Pq = Qq - Q0. The second calculation submodule 209b is used to calculate the measurement mark point Dq according to the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device, Dq = Qq - (Pg - Gq). The third calculation submodule 209c is used to calculate the error level R according to the derived marking point Pq and the measured marking point Dq; the calculation formula of the error level R is: The reshoot submodule 209d is used to reshoot the inspection image when the error level R exceeds the threshold F. In some embodiments, the damage determination module is specifically used to: perform image processing on the inspection image, and extract color features, shape features, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage.) indicating whether the component of the work machine under scrutiny is within the tolerances relative to the physical dimensions of the surface defined by the specification”. (see claims 1-4 where the wind turbine is inspected as having either a smooth surface or, alternatively, damage that includes 1.
bulging, corrosion, cracking, surface cracking, and blisters, then calling for replacement of the item. Referring to FIG. 1, the embodiment of the present application proposes a method for repairing wind turbine blades assisted by a drone, which includes steps S101 to S110. The method can efficiently locate damage, so that maintenance personnel can accurately and quickly go to the damaged location for maintenance, thereby improving work efficiency and reducing the time for operators to work at height. The following is a detailed description of each step in conjunction with the accompanying drawings.) It would have been obvious for one of ordinary skill in the art at the time the invention was made to combine the teachings of HUNAN with the disclosure of the primary reference, since HUNAN teaches that there can be a first drone that has cameras and a laser scanner. The drone can inspect the wind turbine blades to detect bulging or cracking and to determine whether the turbine requires maintenance or replacement. This can provide constant inspection of the blades for early damage control, and damage positioning can be performed after the inspection. See abstract and claims 1-10. Tanahasi discloses “... 8. The airborne inspection metrology system of claim 7, wherein the information system includes a database constructed to store the flight path data in association with the indicator of the object under scrutiny. (see paragraph 20-29 where the data is stored in a server and see paragraph 44 where the camera, lidar and the location data are all associated using the GPS device) (see paragraph 100-105 where the drone in FIG. 8 moves in a back and forth pattern around each object under scrutiny to scan the construction site) Claim Rejections - 35 USC § 103 In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness. This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C.
102(a)(2) prior art against the later invention. Claim 9 is rejected under 35 U.S.C. sec. 103 as being unpatentable as obvious in view of United States Patent Application Pub. No.: US20170001723A1 to Tanahasi, filed in 2015, in view of Chinese Patent Application Pub. No.: CN115640634A to Shanxi Lu, and further in view of Kimberly and Hunan. Shanxi Lu teaches “.. 9. The airborne inspection metrology system of claim 7, wherein the information system further includes a site path editor”. (See blocks s20 to s24 where the data can be used to edit the construction plans for construction of a roadway) Tanahasi discloses “...by which the flight path data are generated” (see drone in paragraph 1-5 and 19-21 that can fly about the construction area and vehicles and excavator in a pattern that is shown for scanning the construction site and vehicles) (see paragraph 37-41 where the drone has a sensor that is a LIDAR sensor and camera and GPS detection to detect the parameters of the vehicle and the construction site) (see elements 11 and 13, where the drone is controlled to move in a zigzag pattern over the construction machines and the site to capture the lidar and camera data) (see paragraph 100-105 where the drone in FIG. 8 moves in a back and forth pattern around each object under scrutiny to scan the construction site) (see paragraph 44 where the camera, lidar and the location data are all associated using the GPS device) It would have been obvious for one of ordinary skill in the art to combine the disclosure of TANAHASI with the teachings of SHANXI LU with a reasonable expectation of success, since SHANXI LU teaches that an editing CAD software module can provide and receive data from a construction site for modeling and constructing a roadway site, with increased productivity and for planning purposes, without being on site and expending resources. See abstract. Claims 10-20 are rejected under 35 U.S.C. sec.
103 as being unpatentable as obvious in view of United States Patent Application Pub. No.: US20170001723A1 to Tanahasi, filed in 2015, and in view of Kimberly and Hunan. Tanahasi discloses “... 10. The airborne inspection metrology system of claim 7, wherein the database is further constructed to store the component model in association with the indicator of the ...... under scrutiny, the component model defining specifications on the object under scrutiny. (see paragraph 21-29 where the control center can detect the shape of the vehicles at the construction site using the lidar device and then detect that the shape is correct for operating the vehicles) Hunan teaches “part of the construction structure or machine under scrutiny”. (see claims 1-4 where the wind turbine is inspected as having either a smooth surface or, alternatively, damage that includes 1. bulging, corrosion, cracking, surface cracking, and blisters, then calling for replacement of the item. Referring to FIG. 1, the embodiment of the present application proposes a method for repairing wind turbine blades assisted by a drone, which includes steps S101 to S110. The method can efficiently locate damage, so that maintenance personnel can accurately and quickly go to the damaged location for maintenance, thereby improving work efficiency and reducing the time for operators to work at height. The following is a detailed description of each step in conjunction with the accompanying drawings.) It would have been obvious for one of ordinary skill in the art at the time the invention was made to combine the teachings of HUNAN with the disclosure of the primary reference, since HUNAN teaches that there can be a first drone that has cameras and a laser scanner. The drone can inspect the wind turbine blades to detect bulging or cracking and to determine whether the turbine requires maintenance or replacement.
This can provide constant inspection of the blades for early damage control, and damage positioning can be performed after the inspection. See abstract and claims 1-10. Claim 11 is cancelled. Tanahasi discloses “... 11. The airborne inspection metrology system of claim 10, wherein the information system includes an inspection processor constructed to determine whether the measurement data are within the specifications defined on the component model.” (see paragraph 21-29 where the control center can detect the shape of the vehicles at the construction site using the lidar device and then detect that the shape is correct for operating the vehicles) Tanahasi discloses “... 12. The airborne inspection metrology system of claim 7, wherein the airborne CMM comprises: a laser array constructed to irradiate a surface region ..... a detector constructed to accept laser light reflected from the irradiated surface region; and a reference processor constructed to associate the measurement data determined from the reflected laser light with coordinates of a local reference frame. (see paragraph 37-44 where the camera, lidar and the location data are all associated using the GPS device) Hunan teaches “part of the construction structure or machine under scrutiny”. (see claims 1-4 where the wind turbine is inspected as having either a smooth surface or, alternatively, damage that includes 1. bulging, corrosion, cracking, surface cracking, and blisters, then calling for replacement of the item. Referring to FIG. 1, the embodiment of the present application proposes a method for repairing wind turbine blades assisted by a drone, which includes steps S101 to S110. The method can efficiently locate damage, so that maintenance personnel can accurately and quickly go to the damaged location for maintenance, thereby improving work efficiency and reducing the time for operators to work at height.
The following is a detailed description of each step in conjunction with the accompanying drawings.) It would have been obvious for one of ordinary skill in the art at the time the invention was made to combine the teachings of HUNAN with the disclosure of the primary reference, since HUNAN teaches that there can be a first drone that has cameras and a laser scanner. The drone can inspect the wind turbine blades to detect bulging or cracking and to determine whether the turbine requires maintenance or replacement. This can provide constant inspection of the blades for early damage control, and damage positioning can be performed after the inspection. See abstract and claims 1-10. Tanahasi discloses “... 13. The airborne inspection metrology system of claim 12 further comprising: a mobile tracker constructed to traverse a tracker path that corresponds to the flight path; (see paragraph 37-44 and 100-104 where the camera, lidar and the location data are all associated using the GPS device) a base tracker constructed to maintain a fixed position during traversal of the flight path by the airborne CMM; and (see fixed system 11 that can control the drone) a tracker communications component in the airborne CMM constructed to accept distance data from the mobile tracker and the base tracker, the tracker communications component being communicatively coupled to the reference processor by which the coordinates of the local reference frame are translated into coordinates of a global reference frame anchored at the fixed position.
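The claim 13 element just quoted, translating local-reference-frame coordinates into a global frame anchored at the base tracker's fixed position, can be illustrated with a standard rigid-body transform. This is a generic sketch, not the applicant's or Tanahasi's implementation; the rotation matrix and anchor offset below are hypothetical values chosen only for illustration.

```python
# Generic local-to-global coordinate translation: p_global = R @ p_local + t,
# where R is the local frame's orientation and t is the position of the
# local frame's origin in the global frame (anchored at the base tracker).
def to_global(p_local, R, t):
    """Map a 3D point from the local frame to the global frame."""
    return tuple(
        sum(R[i][j] * p_local[j] for j in range(3)) + t[i]
        for i in range(3)
    )

# Hypothetical pose: 90-degree rotation about z, anchor offset (10, 0, 0).
R = ((0, -1, 0),
     (1,  0, 0),
     (0,  0, 1))
t = (10, 0, 0)
```

With this pose, the local point (1, 0, 0) maps to a global point offset from the fixed anchor, from which physical distances in the global frame can be computed.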
(see paragraph 30 and 40-45 where the control center can track the movement from the drone to the site and provide RTK tracking data to the drone for determining the shape of the objects at the construction site and their location and whether they are working, for control purposes) Hunan teaches “...a previously determined component-specific flight path...” (Step S107, determining the coordinate vector Pg of the laser mark relative to the optical center of the binocular camera device in the initial image according to the coordinate Q0 of the optical center of the binocular camera device of the drone in the initial image in the first space coordinate system and the coordinate vector G0 of the laser mark relative to the optical center of the binocular camera device; wherein Pg = G0 - (-Q0); Step S108, obtaining L inspection images captured by the drone at a set frequency during the inspection flight of the wind turbine blade to be inspected, and determining the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system when shooting the qth inspection image, and determining the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device in the qth inspection image; Step S109, performing error determination on the inspection image according to the coordinate vector Qq and the coordinate vector Gq, and determining whether to re-shoot the inspection image according to the determination result; Step S110, determining whether the wind turbine blade is damaged according to the inspection image, and if so, determining the damage location according to the inspection image corresponding to the damage.
In some technical solutions, step S109 includes: Step S109a, calculating and deriving a marking point Pq according to the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system, Pq = Qq - Q0; Step S109b, calculating the measurement mark point Dq according to the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device, Dq = Qq - (Pg - Gq); Step S109c, calculating the error level R according to the derived marking point Pq and the measured marking point Dq; the calculation formula of the error level R is: Step S109d: when the error level R exceeds the threshold F, the inspection image is retaken. In some technical solutions, step S110 specifically includes: performing image processing on the inspection image, and extracting color features, shape features, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage.) It would have been obvious for one of ordinary skill in the art at the time the invention was made to combine the teachings of HUNAN with the disclosure of the primary reference, since HUNAN teaches that there can be a first drone that has cameras and a laser scanner. The drone can inspect the wind turbine blades to detect bulging or cracking and to determine whether the turbine requires maintenance or replacement. This can provide constant inspection of the blades for early damage control, and damage positioning can be performed after the inspection. See abstract and claims 1-10. Tanahasi discloses “... 14. The airborne inspection metrology system of claim 13, wherein the inspection processor determines whether the measurement data as assigned with the translated coordinates are within specifications defined on a component model.
(see paragraph 30 and 40-45 where the control center can track the movement from the drone to the site and provide RTK tracking data to the drone for determining the shape of the objects at the construction site and their location and whether they are working, for control purposes) Tanahasi discloses “... 15. A method of component inspection by airborne inspection metrology, the method comprising: irradiating a surface region on the component with an airborne laser; (see drone in paragraph 1-5 and 19-21 that can fly about the construction area and vehicles and excavator in a pattern that is shown for scanning the construction site and vehicles) (see paragraph 37-41 where the drone has a sensor that is a LIDAR sensor and camera and GPS detection to detect the parameters of the vehicle and the construction site) accepting laser returns reflected from the irradiated surface region on the component at an airborne detector; (see paragraph 100-105 where the drone in FIG. 8 moves in a back and forth pattern around each object under scrutiny to scan the construction site and receiving the LIDAR data to detect the shape of the construction vehicles and objects) generating measurement data from the laser returns that are assigned coordinates of a local reference frame; (see elements 11 and 13, where the drone is controlled to move in a zigzag pattern over the construction machines and the site to capture the lidar and camera data) (see paragraph 100-105 where the drone in FIG.
8 moves in a back and forth pattern around each object under scrutiny to scan the construction site) translating the coordinates of the local reference frame into coordinates of a global reference frame from which physical distance is determined; and (see paragraph 112-119 where the drone can receive and provide data from the remote station that is a communication station) (see paragraph 44 where the camera, lidar and the location data are all associated using the GPS device) determining whether the measurement data translated into the global reference frame are within specifications defined on a component model of the component. (see paragraph 44 where the camera, lidar and the location data are all associated using the GPS device) TANAHASI is silent but KIMBERLY teaches “...determine whether the measurement data translated into a global reference frame are compliant with specifications defined for the component model of the component and (see paragraph 25 where the inspection drone can determine 1. excessive sound, vibration, force, or temperature, to generate data that there is a problem, 2. service levels, and 3. condition monitoring for proper levels of tire pressure, O2, engine oil, and power, and where health data can be compared to determine whether this is a healthy or non-healthy aircraft; see paragraph 37 where the signature can be compared to a baseline specification to determine whether there is a problem) (see paragraph 25-37 and blocks 102-114 where the system can indicate that a drone has a failure that is likely with a high probability and a second inspection drone is dispatched in block 110, and the second inspection drone can inspect the first drone in blocks 112-114) (see FIG.
1 where the inspection drone is dispatched to inspect the first drone; see paragraph 29-31) (see paragraph 27-28 where the drone for inspection is provided to inspect the eddy current of the other drone; In accordance with a further alternative method of inspection, the I-UAV 20 may fly in tandem with the U-UAV 10 at a separation distance. If the U-UAV 10 is on the ground, then the I-UAV 20 may move to a location in proximity to the U-UAV 10 and land at that location or land on the U-UAV 10. In accordance with a further alternative inspection method, the I-UAV 20 may fly around the U-UAV 10 while the U-UAV 10 is on the ground. Another ground variant would be when the U-UAV 10 is taxiing and the I-UAV 20 is orbiting around it to capture manifestations of faults that may be exhibited during that flight mode. The distance (if any) separating the I-UAV 20 from the U-UAV 10 during an inspection will depend on the inspection method being used. The I-UAV 20 may be contained in the U-UAV 10 or in a maintenance facility when not in inspection mode.) updating a maintenance record based on whether or not said determining whether the measurement data translated into the global reference frame indicates compliance with the specifications for the component model of the component.
(see paragraph 37-41 and 54-55 where the first UAV is positioned on the side of a rotor of the second UAV to listen to the rotor to reduce a failure probability of the scanning, whereas if it were placed in a second, noisy location this would cause a failure in the scanning and the test would have to be done a second time) (see paragraph 10-24 where the inspection UAV can provide an inspection of a second UAV that is a utility UAV having multiple rotors; see paragraph 25 where during flight the UAV can inspect airplane health management, including an inspection of the engine oil level, the fuel efficiency of the engine, whether it is emitting and what the emission levels are, and whether the engine is on or off) It would have been obvious for one of ordinary skill in the art at the time the invention was made to combine the teachings of KIMBERLY with the disclosure of the primary reference, since KIMBERLY of BOEING™ teaches that there can be a utility drone (U-UAV 10) and an inspection drone (I-UAV 20). A failure model can indicate that the utility drone is having an issue based on the collected data, such as a predicted rotor, engine, or other critical failure. See blocks 102, 104, 108 and 110. Then the inspection drone can be dispatched. See blocks 110-114. The I-UAV 20 may fly in tandem with the U-UAV 10 at a separation distance. If the U-UAV 10 is on the ground, then the I-UAV 20 may move to a location in proximity to the U-UAV 10 and land at that location or land on the U-UAV 10. In accordance with a further alternative inspection method, the I-UAV 20 may fly around the U-UAV 10 while the U-UAV 10 is on the ground. Another ground variant would be when the U-UAV 10 is taxiing and the I-UAV 20 is orbiting around it to capture manifestations of faults that may be exhibited during that flight mode. The distance (if any) separating the I-UAV 20 from the U-UAV 10 during an inspection will depend on the inspection method being used.
The I-UAV 20 may be contained in the U-UAV 10 or in a maintenance facility when not in inspection mode. The second inspection drone can measure one or more parameters of the first drone using sensors (radar, LIDAR, sonar, x-ray, vibration, NDI sensors) and then recommend a course of conduct for repair. For example, the drone can measure eddy currents, or irregular noise, for a repair and replacement of the engine, motor or sensors. This can provide a critical automated inspection process. See paragraph 20-32 and claims 15-20 of Kimberly. Claim 16 is amended to recite and Hunan teaches “... accepting flight path data defining [[a]]the previously determined component-specific flight path that is specific to the component of the physical structure or machine under scrutiny”. (see Fig. 5 where the drone has a number of sensors including 1. A binocular camera device and also a laser projection device for emitting a laser to form a laser mark on the wind turbine; Step S103, generating an inspection flight path of the UAV according to the translation mapping of the inspection path; wherein the UAV is equipped with a binocular camera device for photographing the wind turbine and a laser projection device for emitting laser to form a laser mark on the wind turbine; Step S104, before the UAV performs an inspection flight, the center of the wind turbine hub is used to locate the initial position and collect an initial image; Step S105, identifying the center of the wind turbine hub in the initial image, and determining the coordinate vector Q0 of the optical center of the binocular camera device of the drone in the first space coordinate system at the initial position;) (Step S107, determining the coordinate vector Pg of the laser mark relative to the optical center of the binocular camera device in the initial image according to the coordinate Q0 of the optical center of the binocular camera device of the
drone in the initial image in the first space coordinate system and the coordinate vector G0 of the laser mark relative to the optical center of the binocular camera device; wherein Pg = G0 - (-Q0); Step S108, obtaining L inspection images captured by the drone at a set frequency during the inspection flight of the wind turbine blade to be inspected, and determining the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system when shooting the qth inspection image, and determining the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device in the qth inspection image; Step S109, performing error determination on the inspection image according to the coordinate vector Qq and the coordinate vector Gq, and determining whether to re-shoot the inspection image according to the determination result; Step S110, determining whether the wind turbine blade is damaged according to the inspection image, and if so, determining the damage location according to the inspection image corresponding to the damage. In some technical solutions, step S109 includes: Step S109a, calculating and deriving a marking point Pq according to the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system, Pq = Qq - Q0; Step S109b, calculating the measurement mark point Dq according to the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device, Dq = Qq - (Pg - Gq); Step S109c, calculating the error level R according to the derived marking point Pq and the measured marking point Dq; the calculation formula of the error level R is: Step S109d: when the error level R exceeds the threshold F, the inspection image is retaken.
In some technical solutions, step S110 specifically includes: performing image processing on the inspection image, and extracting color features, shape features, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage.) (see claims 1-4 where the wind turbine is inspected as having either a smooth surface or, alternatively, damage that includes 1. bulging, corrosion, cracking, surface cracking, and blisters, then calling for replacement of the item. Referring to FIG. 1, the embodiment of the present application proposes a method for repairing wind turbine blades assisted by a drone, which includes steps S101 to S110. The method can efficiently locate damage, so that maintenance personnel can accurately and quickly go to the damaged location for maintenance, thereby improving work efficiency and reducing the time for operators to work at height. The following is a detailed description of each step in conjunction with the accompanying drawings.) (The determination module 209 is used to perform error determination on the inspection image according to the coordinate vector Qq and the coordinate vector Gq, and determine whether it is necessary to re-shoot the inspection image according to the determination result. The damage determination module 210 is used to determine whether the wind turbine blade is damaged according to the inspection image, and if so, determine the damage location according to the inspection image corresponding to the damage. Referring to FIG. 7, in some implementations, the determination module 209 includes: The first calculation submodule 209a is used to calculate and derive the marking point Pq according to the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system, where Pq = Qq - Q0.
The second calculation submodule 209b is used to calculate the measurement mark point Dq according to the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device, Dq = Qq- ( Pg - Gq ). The third calculation submodule 209c is used to calculate the error level R according to the derived marking point Pq and the measured marking point Dq ; the calculation formula of the error level R is: The reshoot submodule 209d is used to reshoot the inspection image when the error level R exceeds the threshold F. In some embodiments, the damage determination module is specifically used to: perform image processing on the inspection image, and extract color features, shape features, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage.)indicating whether the component of work machine under scrutiny is within the tolerances relative to the physical dimensions of the surface defined by the specification”. (see claims 1-4 where the wind turbine is inspected as having a smooth surface or alternatively the damage and it includes 1. Bulging, corrosion, cracking, surface cracking and blisters and then calling for a replacement of the item Referring to FIG1 , the embodiment of the present application proposes a method for repairing wind turbine blades assisted by a drone, which includes steps S101 to S110. The method can efficiently locate damage, so that maintenance personnel can accurately and quickly go to the damaged location for maintenance, thereby improving work efficiency and reducing the time for operators to work at height. The following is a detailed description of each step in conjunction with the accompanying drawings.) 
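The vector bookkeeping recited in steps S107-S109d (and mirrored in submodules 209a-209d) reduces to a few subtractions plus a threshold test. The sketch below is illustrative only: the record reproduces the formulas for Pg, Pq and Dq, but the formula for the error level R is omitted from the record, so the Euclidean distance between Pq and Dq is assumed here as a stand-in.

```python
import math

def sub(a, b):
    # Componentwise vector subtraction for 3-D coordinate vectors
    return tuple(x - y for x, y in zip(a, b))

def needs_reshoot(Qq, Gq, Q0, G0, F):
    """Sketch of steps S107-S109d: decide whether the qth image must be retaken."""
    # Step S107: Pg = G0 - (-Q0), i.e. the laser mark in the first space frame
    Pg = tuple(g + q for g, q in zip(G0, Q0))
    # Step S109a: derived marking point Pq = Qq - Q0
    Pq = sub(Qq, Q0)
    # Step S109b: measured marking point Dq = Qq - (Pg - Gq)
    Dq = sub(Qq, sub(Pg, Gq))
    # Step S109c: error level R -- the formula is omitted in the record;
    # Euclidean distance between Pq and Dq is assumed here, not taken from Hunan.
    R = math.dist(Pq, Dq)
    # Step S109d: retake the inspection image when R exceeds threshold F
    return R > F, R
```

For example, with Q0 = (0, 0, 0), G0 = (1, 0, 0) and a later image at Qq = (2, 0, 0), the derived and measured marks coincide when Gq = (1, 0, 0) (no retake) and diverge when Gq drifts, triggering a retake once R exceeds F.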
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of HUNAN with the disclosure of the primary reference, since HUNAN teaches a first drone that has cameras and a laser scanner. The drone can inspect the wind turbine blades to detect whether there is bulging or cracking and whether the blade requires maintenance or replacement. This can provide constant inspection of the blades, so that early damage control and damage positioning can be performed after the inspection. See abstract and claims 1-10. Tanahasi discloses “...16. The method of component inspection of claim 15, further comprising: accepting flight path data defining a flight path that is specific to the component; and ....” (see elements 11 and 13, where the drone is controlled to move in a zig-zag pattern over the construction machines and the site to capture the lidar and camera data) (see paragraphs 100-105, where the drone in FIG. 8 moves in a back-and-forth pattern around each object under scrutiny to scan the construction site) Claim 1 is amended to recite, and Hunan teaches, “...1. (Currently Amended) An airborne coordinate measuring machine (CMM) comprising: a noncontact 3D scanner constructed to obtain measurement data regarding a surface of a component of a work machine under scrutiny; a drone aircraft mechanically coupled to the 3D scanner and constructed to (see Fig. 5 where the drone has a number of sensors including a binocular camera device and also a laser projection device for emitting a laser to form a laser mark on the wind turbine; Step S103, generating an inspection flight path of the UAV according to the translation mapping of the inspection path; wherein the UAV is equipped with a binocular camera device for photographing the wind turbine and a laser projection device for emitting laser to form a laser mark on the wind turbine; Step S104, before the UAV performs an inspection flight, the center of the wind turbine hub is used to locate the initial position and collect an initial image; Step S105, identifying the center of the wind turbine hub in the initial image, and determining the coordinate vector Q0 of the optical center of the binocular camera device of the drone in the first space coordinate system at the initial position;) traverse a previously determined component-specific flight path, which is specific to the surface of the component under scrutiny, to inspect the surface of the component under scrutiny against a specification therefor that defines physical dimensions of the surface of the component under scrutiny; and circuitry configured to (Step S107, determining the coordinate vector Pg of the laser mark relative to the optical center of the binocular camera device in the initial image according to the coordinate Q0 of the optical center of the binocular camera device of the drone in the initial image in the first space coordinate system and the coordinate vector G0 of the laser mark relative to the optical center of the binocular camera device; wherein Pg = G0 - (-Q0); Step S108, obtaining L inspection images captured by the drone at a set frequency during the inspection flight of the wind turbine blade to be inspected, and determining the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system when shooting the qth inspection image, and determining the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device in the qth inspection image; Step S109, performing error determination on the inspection image according to the coordinate vector Qq and the coordinate vector Gq, and determining whether to re-shoot the inspection image according to the determination result; Step S110, determining whether the wind turbine blade is damaged according to the inspection image, and if so, determining the damage location according to the inspection image corresponding to the damage. In some technical solutions, step S109 includes: Step S109a, calculating and deriving a marking point Pq according to the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system, Pq = Qq - Q0; Step S109b, calculating the measurement mark point Dq according to the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device, Dq = Qq - (Pg - Gq); Step S109c, calculating the error level R according to the derived marking point Pq and the measured marking point Dq; the calculation formula of the error level R is: Step S109d: when the error level R exceeds the threshold F, the inspection image is retaken. In some technical solutions, step S110 specifically includes: performing image processing on the inspection image, and extracting color features, shape features, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage.)
based on a selection of the component of the work machine to scrutinize, perform an inspection process according to the previously determined component-specific flight path to inspect the surface of the component under scrutiny, wherein the inspection process includes: determining whether the obtained measurement data regarding the surface of the object component under scrutiny[[,]] is within tolerances relative to the physical dimensions of the surface defined by the specification, (see claims 1-4 where the wind turbine is inspected as having a smooth surface or, alternatively, damage, which includes bulging, corrosion, cracking, surface cracking and blisters, and then calls for a replacement of the item. Referring to FIG. 1, the embodiment of the present application proposes a method for repairing wind turbine blades assisted by a drone, which includes steps S101 to S110. The method can efficiently locate damage, so that maintenance personnel can accurately and quickly go to the damaged location for maintenance, thereby improving work efficiency and reducing the time for operators to work at height. The following is a detailed description of each step in conjunction with the accompanying drawings.) and determining whether the component of the work machine is to be repaired or replaced based on a result of said determining whether the measurement data is within the tolerances relative to the physical dimensions of the surface of the component defined by the specification, and update a maintenance log (The determination module 209 is used to perform error determination on the inspection image according to the coordinate vector Qq and the coordinate vector Gq, and determine whether it is necessary to re-shoot the inspection image according to the determination result. The damage determination module 210 is used to determine whether the wind turbine blade is damaged according to the inspection image, and if so, determine the damage location according to the inspection image corresponding to the damage. Referring to FIG. 7, in some implementations, the determination module 209 includes: The first calculation submodule 209a is used to calculate and derive the marking point Pq according to the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system, where Pq = Qq - Q0. The second calculation submodule 209b is used to calculate the measurement mark point Dq according to the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device, Dq = Qq - (Pg - Gq). The third calculation submodule 209c is used to calculate the error level R according to the derived marking point Pq and the measured marking point Dq; the calculation formula of the error level R is: The reshoot submodule 209d is used to reshoot the inspection image when the error level R exceeds the threshold F. In some embodiments, the damage determination module is specifically used to: perform image processing on the inspection image, and extract color features, shape features, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage.) indicating whether the component of the work machine under scrutiny is within the tolerances relative to the physical dimensions of the surface defined by the specification”. (see claims 1-4 where the wind turbine is inspected as having a smooth surface or, alternatively, damage, which includes bulging, corrosion, cracking, surface cracking and blisters, and then calls for a replacement of the item. Referring to FIG. 1, the embodiment of the present application proposes a method for repairing wind turbine blades assisted by a drone, which includes steps S101 to S110. The method can efficiently locate damage, so that maintenance personnel can accurately and quickly go to the damaged location for maintenance, thereby improving work efficiency and reducing the time for operators to work at height. The following is a detailed description of each step in conjunction with the accompanying drawings.) It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of HUNAN with the disclosure of the primary reference, since HUNAN teaches a first drone that has cameras and a laser scanner. The drone can inspect the wind turbine blades to detect whether there is bulging or cracking and whether the blade requires maintenance or replacement. This can provide constant inspection of the blades, so that early damage control and damage positioning can be performed after the inspection. See abstract and claims 1-10. Tanahasi discloses “17. The method of component inspection of claim 16, further comprising retrieving the flight path data through a database query based on an indicator of the component.” (see elements 11 and 13, where the drone is controlled to move in a zig-zag pattern over the construction machines and the site to capture the lidar and camera data) (see paragraphs 100-105, where the drone in FIG. 8 moves in a back-and-forth pattern around each object under scrutiny to scan the construction site) Claim 17 is amended to recite, and Hunan teaches, “...of the physical structure or machine under scrutiny; (see Fig. 5 where the drone has a number of sensors including
a binocular camera device and also a laser projection device for emitting a laser to form a laser mark on the wind turbine; Step S103, generating an inspection flight path of the UAV according to the translation mapping of the inspection path; wherein the UAV is equipped with a binocular camera device for photographing the wind turbine and a laser projection device for emitting laser to form a laser mark on the wind turbine; Step S104, before the UAV performs an inspection flight, the center of the wind turbine hub is used to locate the initial position and collect an initial image; Step S105, identifying the center of the wind turbine hub in the initial image, and determining the coordinate vector Q0 of the optical center of the binocular camera device of the drone in the first space coordinate system at the initial position;) (Step S107, determining the coordinate vector Pg of the laser mark relative to the optical center of the binocular camera device in the initial image according to the coordinate Q0 of the optical center of the binocular camera device of the drone in the initial image in the first space coordinate system and the coordinate vector G0 of the laser mark relative to the optical center of the binocular camera device; wherein Pg = G0 - (-Q0); Step S108, obtaining L inspection images captured by the drone at a set frequency during the inspection flight of the wind turbine blade to be inspected, and determining the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system when shooting the qth inspection image, and determining the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device in the qth inspection image; Step S109, performing error determination on the inspection image according to the coordinate vector Qq and the coordinate vector Gq, and determining whether to re-shoot the inspection image according to the determination result; Step S110, determining whether the wind turbine blade is damaged according to the inspection image, and if so, determining the damage location according to the inspection image corresponding to the damage. In some technical solutions, step S109 includes: Step S109a, calculating and deriving a marking point Pq according to the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system, Pq = Qq - Q0; Step S109b, calculating the measurement mark point Dq according to the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device, Dq = Qq - (Pg - Gq); Step S109c, calculating the error level R according to the derived marking point Pq and the measured marking point Dq; the calculation formula of the error level R is: Step S109d: when the error level R exceeds the threshold F, the inspection image is retaken. In some technical solutions, step S110 specifically includes: performing image processing on the inspection image, and extracting color features, shape features, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage.) (see claims 1-4 where the wind turbine is inspected as having a smooth surface or, alternatively, damage, which includes bulging, corrosion, cracking, surface cracking and blisters, and then calls for a replacement of the item. Referring to FIG. 1, the embodiment of the present application proposes a method for repairing wind turbine blades assisted by a drone, which includes steps S101 to S110. The method can efficiently locate damage, so that maintenance personnel can accurately and quickly go to the damaged location for maintenance, thereby improving work efficiency and reducing the time for operators to work at height. The following is a detailed description of each step in conjunction with the accompanying drawings.) (The determination module 209 is used to perform error determination on the inspection image according to the coordinate vector Qq and the coordinate vector Gq, and determine whether it is necessary to re-shoot the inspection image according to the determination result. The damage determination module 210 is used to determine whether the wind turbine blade is damaged according to the inspection image, and if so, determine the damage location according to the inspection image corresponding to the damage. Referring to FIG. 7, in some implementations, the determination module 209 includes: The first calculation submodule 209a is used to calculate and derive the marking point Pq according to the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system, where Pq = Qq - Q0. The second calculation submodule 209b is used to calculate the measurement mark point Dq according to the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device, Dq = Qq - (Pg - Gq). The third calculation submodule 209c is used to calculate the error level R according to the derived marking point Pq and the measured marking point Dq; the calculation formula of the error level R is: The reshoot submodule 209d is used to reshoot the inspection image when the error level R exceeds the threshold F. In some embodiments, the damage determination module is specifically used to: perform image processing on the inspection image, and extract color features, shape features, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage.) (see claims 1-4 where the wind turbine is inspected as having a smooth surface or, alternatively, damage, which includes bulging, corrosion, cracking, surface cracking and blisters, and then calls for a replacement of the item. Referring to FIG. 1, the embodiment of the present application proposes a method for repairing wind turbine blades assisted by a drone, which includes steps S101 to S110. The method can efficiently locate damage, so that maintenance personnel can accurately and quickly go to the damaged location for maintenance, thereby improving work efficiency and reducing the time for operators to work at height. The following is a detailed description of each step in conjunction with the accompanying drawings.) It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of HUNAN with the disclosure of the primary reference, since HUNAN teaches a first drone that has cameras and a laser scanner. The drone can inspect the wind turbine blades to detect whether there is bulging or cracking and whether the blade requires maintenance or replacement. This can provide constant inspection of the blades, so that early damage control and damage positioning can be performed after the inspection. See abstract and claims 1-10. Tanahasi discloses “...18. The method of component inspection of claim 15, further comprising accepting distance data from at least one tracker in optical communication with the airborne detector; and translating the coordinates of the local reference frame into the coordinates of the global reference frame using the accepted distance data.”
(see the drone in paragraphs 1-5 and 19-21, which can fly about the construction area, vehicles and excavator in a pattern shown for scanning the construction site and vehicles) (see paragraphs 37-41, where the drone has a LIDAR sensor, a camera and GPS detection to detect the parameters of the vehicle and the construction site) (see elements 11 and 13, where the drone is controlled to move in a zig-zag pattern over the construction machines and the site to capture the lidar and camera data) (see paragraphs 100-105, where the drone in FIG. 8 moves in a back-and-forth pattern around each object under scrutiny to scan the construction site) (see paragraph 44, where the camera, lidar and the location data are all associated using the GPS device) Tanahasi discloses “...19. The method of component inspection of claim 18, further comprising accepting the distance data from a mobile tracker traversing a tracker path that corresponds to the flight path, and from a base tracker that remains stationary relative to the component.” (see the drone in paragraphs 1-5 and 19-21, which can fly about the construction area, vehicles and excavator in a pattern shown for scanning the construction site and vehicles) (see paragraphs 37-41, where the drone has a LIDAR sensor, a camera and GPS detection to detect the parameters of the vehicle and the construction site) (see elements 11 and 13, where the drone is controlled to move in a zig-zag pattern over the construction machines and the site to capture the lidar and camera data) (see paragraphs 100-105, where the drone in FIG. 8 moves in a back-and-forth pattern around each object under scrutiny to scan the construction site) (see paragraph 44, where the camera, lidar and the location data are all associated using the GPS device) Claim 19 is amended to recite, and Hunan teaches, “...the previously determined component-specific flight path.....of the physical structure or machine under scrutiny; (see Fig. 5 where the drone has a number of sensors including a binocular camera device and also a laser projection device for emitting a laser to form a laser mark on the wind turbine; Step S103, generating an inspection flight path of the UAV according to the translation mapping of the inspection path; wherein the UAV is equipped with a binocular camera device for photographing the wind turbine and a laser projection device for emitting laser to form a laser mark on the wind turbine; Step S104, before the UAV performs an inspection flight, the center of the wind turbine hub is used to locate the initial position and collect an initial image; Step S105, identifying the center of the wind turbine hub in the initial image, and determining the coordinate vector Q0 of the optical center of the binocular camera device of the drone in the first space coordinate system at the initial position;) (Step S107, determining the coordinate vector Pg of the laser mark relative to the optical center of the binocular camera device in the initial image according to the coordinate Q0 of the optical center of the binocular camera device of the drone in the initial image in the first space coordinate system and the coordinate vector G0 of the laser mark relative to the optical center of the binocular camera device; wherein Pg = G0 - (-Q0); Step S108, obtaining L inspection images captured by the drone at a set frequency during the inspection flight of the wind turbine blade to be inspected, and determining the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system when shooting the qth inspection image, and determining the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device in the qth inspection image; Step S109, performing error determination on the inspection image according to the coordinate vector Qq and the coordinate vector Gq, and determining whether to re-shoot the inspection image according to the determination result; Step S110, determining whether the wind turbine blade is damaged according to the inspection image, and if so, determining the damage location according to the inspection image corresponding to the damage. In some technical solutions, step S109 includes: Step S109a, calculating and deriving a marking point Pq according to the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system, Pq = Qq - Q0; Step S109b, calculating the measurement mark point Dq according to the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device, Dq = Qq - (Pg - Gq); Step S109c, calculating the error level R according to the derived marking point Pq and the measured marking point Dq; the calculation formula of the error level R is: Step S109d: when the error level R exceeds the threshold F, the inspection image is retaken. In some technical solutions, step S110 specifically includes: performing image processing on the inspection image, and extracting color features, shape features, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage.) (see claims 1-4 where the wind turbine is inspected as having a smooth surface or, alternatively, damage, which includes
bulging, corrosion, cracking, surface cracking and blisters, and then calls for a replacement of the item. Referring to FIG. 1, the embodiment of the present application proposes a method for repairing wind turbine blades assisted by a drone, which includes steps S101 to S110. The method can efficiently locate damage, so that maintenance personnel can accurately and quickly go to the damaged location for maintenance, thereby improving work efficiency and reducing the time for operators to work at height. The following is a detailed description of each step in conjunction with the accompanying drawings.) (The determination module 209 is used to perform error determination on the inspection image according to the coordinate vector Qq and the coordinate vector Gq, and determine whether it is necessary to re-shoot the inspection image according to the determination result. The damage determination module 210 is used to determine whether the wind turbine blade is damaged according to the inspection image, and if so, determine the damage location according to the inspection image corresponding to the damage. Referring to FIG. 7, in some implementations, the determination module 209 includes: The first calculation submodule 209a is used to calculate and derive the marking point Pq according to the coordinate vector Qq of the optical center of the binocular camera device of the drone in the first space coordinate system, where Pq = Qq - Q0. The second calculation submodule 209b is used to calculate the measurement mark point Dq according to the coordinate vector Gq of the laser mark relative to the optical center of the binocular camera device, Dq = Qq - (Pg - Gq). The third calculation submodule 209c is used to calculate the error level R according to the derived marking point Pq and the measured marking point Dq; the calculation formula of the error level R is: The reshoot submodule 209d is used to reshoot the inspection image when the error level R exceeds the threshold F. In some embodiments, the damage determination module is specifically used to: perform image processing on the inspection image, and extract color features, shape features, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage.) (see claims 1-4 where the wind turbine is inspected as having a smooth surface or, alternatively, damage, which includes bulging, corrosion, cracking, surface cracking and blisters, and then calls for a replacement of the item. Referring to FIG. 1, the embodiment of the present application proposes a method for repairing wind turbine blades assisted by a drone, which includes steps S101 to S110. The method can efficiently locate damage, so that maintenance personnel can accurately and quickly go to the damaged location for maintenance, thereby improving work efficiency and reducing the time for operators to work at height. The following is a detailed description of each step in conjunction with the accompanying drawings.) It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of HUNAN with the disclosure of the primary reference, since HUNAN teaches a first drone that has cameras and a laser scanner. The drone can inspect the wind turbine blades to detect whether there is bulging or cracking and whether the blade requires maintenance or replacement. This can provide constant inspection of the blades, so that early damage control and damage positioning can be performed after the inspection. See abstract and claims 1-10.
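Claims 18-20 recite translating coordinates from the scanner's local reference frame into a global reference frame using tracker distance data. A minimal sketch of that translation step is below; it assumes the rigid transform (a rotation matrix R and origin offset t for the local frame) has already been solved from the base- and mobile-tracker distance measurements, since the solving procedure itself is not set out in the record.

```python
import math

def rot_z(theta):
    # 3x3 rotation about the z-axis, as row-major tuples
    c, s = math.cos(theta), math.sin(theta)
    return ((c, -s, 0.0), (s, c, 0.0), (0.0, 0.0, 1.0))

def local_to_global(p_local, R, t):
    """Map a point from the local (scanner) frame into the global frame:
    p_global = R @ p_local + t. Per claims 18-20, R and t would be derived
    from the tracker distance data; here they are assumed known."""
    return tuple(
        sum(R[i][j] * p_local[j] for j in range(3)) + t[i]
        for i in range(3)
    )
```

For instance, a local point (1, 0, 0) in a frame rotated 90 degrees about z and offset to (10, 0, 0) maps to approximately (10, 1, 0) in the global frame.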
Tanahasi discloses “...20. The method of component inspection of claim 19, further comprising translating the coordinates of the local reference frame into the coordinates of the global reference frame using both the distance data from the mobile tracker and the distance data from the base tracker.” (see the drone in paragraphs 1-5 and 19-21, which can fly about the construction area, vehicles and excavator in a pattern shown for scanning the construction site and vehicles) (see paragraphs 37-41, where the drone has a LIDAR sensor, a camera and GPS detection to detect the parameters of the vehicle and the construction site) (see elements 11 and 13, where the drone is controlled to move in a zig-zag pattern over the construction machines and the site to capture the lidar and camera data) (see paragraphs 100-105, where the drone in FIG. 8 moves in a back-and-forth pattern around each object under scrutiny to scan the construction site) (see paragraph 44, where the camera, lidar and the location data are all associated using the GPS device) Claim 21 is added to recite, and Hunan teaches, “...21. (New) The airborne inspection metrology system of claim 7, wherein the previously determined part-specific flight path is created based on a selection of the part of the construction structure or machine to scrutinize, wherein the part of the construction structure or machine under scrutiny is stationary during the inspection processing, and wherein the previously determined part-specific flight path includes scanning angles specific to the surface of the part of the construction structure or machine under scrutiny”. (see Fig. 5 where the drone has a number of sensors including
A binocular camera device and also a laser projection device for emitting a laser to form a laser mark on the wind turbine;

Step S103, generating an inspection flight path of the UAV according to the translation mapping of the inspection path, wherein the UAV is equipped with a binocular camera device for photographing the wind turbine and a laser projection device for emitting a laser to form a laser mark on the wind turbine;

Step S104, before the UAV performs an inspection flight, using the center of the wind turbine hub to locate the initial position and collect an initial image;

Step S105, identifying the center of the wind turbine hub in the initial image, and determining the coordinate vector Q_0 of the optical center of the binocular camera device of the drone in the first space coordinate system at the initial position;)

(Step S107, determining the coordinate vector P_g of the laser mark relative to the optical center of the binocular camera device in the initial image, according to the coordinate Q_0 of the optical center of the binocular camera device of the drone in the initial image in the first space coordinate system and the coordinate vector G_0 of the laser mark relative to the optical center of the binocular camera device, wherein P_g = G_0 - (-Q_0);

Step S108, obtaining L inspection images captured by the drone at a set frequency during the inspection flight of the wind turbine blade to be inspected, determining the coordinate vector Q_q of the optical center of the binocular camera device of the drone in the first space coordinate system when shooting the q-th inspection image, and determining the coordinate vector G_q of the laser mark relative to the optical center of the binocular camera device in the q-th inspection image;

Step S109, performing error determination on the inspection image according to the coordinate vectors Q_q and G_q, and determining whether to re-shoot the inspection image according to the determination result;

Step S110, determining whether the wind turbine blade is damaged according to the inspection image and, if so, determining the damage location according to the inspection image corresponding to the damage.

In some technical solutions, step S109 includes:

Step S109a, calculating the derived marking point P_q according to the coordinate vector Q_q of the optical center of the binocular camera device of the drone in the first space coordinate system, P_q = Q_q - Q_0;

Step S109b, calculating the measured marking point D_q according to the coordinate vector G_q of the laser mark relative to the optical center of the binocular camera device, D_q = Q_q - (P_g - G_q);

Step S109c, calculating the error level R according to the derived marking point P_q and the measured marking point D_q; the calculation formula of the error level R is:

Step S109d, when the error level R exceeds the threshold F, the inspection image is retaken.

In some technical solutions, step S110 specifically includes: performing image processing on the inspection image and extracting color features, shape features, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage.)

(see claims 1-4, where the wind turbine is inspected as having a smooth surface or, alternatively, damage including bulging, corrosion, cracking, surface cracking, and blisters, with replacement of the item then called for. Referring to FIG. 1, the embodiment of the present application proposes a method for repairing wind turbine blades assisted by a drone, which includes steps S101 to S110. The method can efficiently locate damage so that maintenance personnel can accurately and quickly go to the damaged location for maintenance, thereby improving work efficiency and reducing the time operators spend working at height. The following is a detailed description of each step in conjunction with the accompanying drawings.)
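The Step S107–S109 consistency check quoted above can be sketched in a few lines. This is an illustrative reconstruction, not the reference's actual implementation: the excerpt does not reproduce the formula for the error level R, so a Euclidean distance between the derived marking point P_q and the measured marking point D_q is assumed here, and the function name and threshold value are hypothetical.

```python
import math

def error_check(Q0, G0, Qq, Gq, F):
    """Illustrative sketch of Steps S107-S109 (hypothetical helper).

    Q0, Qq -- optical-center coordinates of the binocular camera in the
              first space coordinate system (initial and q-th image)
    G0, Gq -- laser-mark coordinates relative to the optical center
    F      -- re-shoot threshold (value not given in the excerpt)
    """
    sub = lambda a, b: tuple(x - y for x, y in zip(a, b))
    Pg = sub(G0, tuple(-x for x in Q0))   # Step S107: P_g = G_0 - (-Q_0)
    Pq = sub(Qq, Q0)                      # Step S109a: derived marking point
    Dq = sub(Qq, sub(Pg, Gq))             # Step S109b: measured marking point
    # Step S109c: the excerpt omits the formula for R; a Euclidean
    # distance between D_q and P_q is assumed purely for illustration.
    R = math.dist(Dq, Pq)
    return R, R > F                       # Step S109d: re-shoot if R exceeds F
```

When the drone's measured pose and the observed laser mark are mutually consistent (e.g. Q_q = Q_0 and G_q = G_0), D_q collapses onto P_q, R is zero, and no re-shoot is triggered; any drift in the observed mark G_q raises R toward the threshold.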
(The determination module 209 is used to perform error determination on the inspection image according to the coordinate vectors Q_q and G_q, and to determine whether it is necessary to re-shoot the inspection image according to the determination result.

The damage determination module 210 is used to determine whether the wind turbine blade is damaged according to the inspection image and, if so, to determine the damage location according to the inspection image corresponding to the damage.

Referring to FIG. 7, in some implementations the determination module 209 includes:

The first calculation submodule 209a, used to calculate the derived marking point P_q according to the coordinate vector Q_q of the optical center of the binocular camera device of the drone in the first space coordinate system, where P_q = Q_q - Q_0.

The second calculation submodule 209b, used to calculate the measured marking point D_q according to the coordinate vector G_q of the laser mark relative to the optical center of the binocular camera device, D_q = Q_q - (P_g - G_q).

The third calculation submodule 209c, used to calculate the error level R according to the derived marking point P_q and the measured marking point D_q; the calculation formula of the error level R is:

The reshoot submodule 209d, used to reshoot the inspection image when the error level R exceeds the threshold F.

In some embodiments, the damage determination module is specifically used to perform image processing on the inspection image and extract color features, shape features, and texture features to determine whether the blade area corresponding to the inspection image is damaged and the type of damage.)

(see claims 1-4, where the wind turbine is inspected as having a smooth surface or, alternatively, damage including bulging, corrosion, cracking, surface cracking, and blisters, with replacement of the item then called for. Referring to FIG. 1, the embodiment of the present application proposes a method for repairing wind turbine blades assisted by a drone, which includes steps S101 to S110. The method can efficiently locate damage so that maintenance personnel can accurately and quickly go to the damaged location for maintenance, thereby improving work efficiency and reducing the time operators spend working at height. The following is a detailed description of each step in conjunction with the accompanying drawings.)

It would have been obvious to one of ordinary skill in the art at the time the invention was made to combine the teachings of HUNAN with the disclosure of the primary reference, since HUNAN teaches a first drone that has cameras and a laser scanner. The drone can inspect the wind turbine blades to detect bulging or cracking and to determine whether the blades require maintenance or replacement. This can provide constant inspection of the blades for early damage control, and damage positioning can be performed after the inspection. See abstract and claims 1-10.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEAN PAUL CASS, whose telephone number is (571) 270-1934. The examiner can normally be reached Monday to Friday, 7 am to 7 pm, and Saturday, 10 am to 12 noon.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Scott A. Browne, can be reached at 571-270-0151.
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JEAN PAUL CASS/
Primary Examiner, Art Unit 3666

Prosecution Timeline

May 31, 2023
Application Filed
Apr 18, 2025
Non-Final Rejection — §103
Jul 21, 2025
Response Filed
Sep 25, 2025
Final Rejection — §103
Dec 30, 2025
Request for Continued Examination
Feb 11, 2026
Response after Non-Final Action
Feb 23, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593752
SYSTEM AND METHOD FOR CONTROLLING HARVESTING IMPLEMENT OPERATION OF AN AGRICULTURAL HARVESTER BASED ON TILT ACTUATOR FORCE
2y 5m to grant Granted Apr 07, 2026
Patent 12596986
GLOBAL ADDRESS SYSTEM AND METHOD
2y 5m to grant Granted Apr 07, 2026
Patent 12590801
REAL TIME DETERMINATION OF PEDESTRIAN DIRECTION OF TRAVEL
2y 5m to grant Granted Mar 31, 2026
Patent 12583572
MARINE VESSEL AND MARINE VESSEL PROPULSION CONTROL SYSTEM
2y 5m to grant Granted Mar 24, 2026
Patent 12571183
EXCAVATOR
2y 5m to grant Granted Mar 10, 2026
Based on this examiner's 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
73%
Grant Probability
99%
With Interview (+25.9%)
3y 1m
Median Time to Grant
High
PTA Risk
Based on 984 resolved cases by this examiner. Grant probability derived from career allow rate.
